Salesforce GitHub Actions

Can we configure our GitHub Actions YAML file so that whenever we commit to our repository, it automatically populates an empty package.xml with the committed components and begins deployment to the target org?
I know how it's done when we have a package.xml file with the components already listed in it, but here the package.xml file should start out empty; on every commit, the YAML should build the list of components to deploy based on the commit contents and then finally deploy them to the org.

For the triggering event you'll need to define "on" in the YAML. You can start with what's in https://github.com/trailheadapps/lwc-recipes/blob/main/.github/workflows/ci.yml: run on any commit/pull request to the main branch unless it's just a readme change, and allow manual triggering too:
on:
  workflow_dispatch:
  push:
    branches:
      - main
    paths-ignore:
      - 'sfdx-project.json'
      - 'README.md'
As for actual commands...
What's your GitHub repository's format? Old-school Metadata API format (with a package.xml, an Account.object containing dozens of fields, list views, and validation rules) or the newer source-tracking format (Account is a folder, every single field gets its own small XML file, and the most important directory is probably "force-app/main/default")?
You should be able to call sfdx force:source:convert -d mdapi in your GitHub Action to create a temp directory called "mdapi". If you're making a managed package, read up on the "-n" option. The directory will contain your changes, converted from source to Metadata API format.
There are things it will not do that a hand-crafted package.xml would (description, post-install class), but again, these tend to matter when you make managed packages; for normal usage you should be fine.
After the convert, try sfdx force:mdapi:deploy -d mdapi -l RunLocalTests -w -1 -c (metadata-format deploy, which directory, which tests to run, wait as long as needed, and just validate rather than actually deploy).
There are sfdx plugins to make this smarter (deploy a delta between two commits instead of the full project every time): https://wiki.sfxd.org/books/sfdc-tools/page/notable-sfdx-plugins-and-resources
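Putting the trigger and the commands together, a full job might look like the sketch below. This is a minimal sketch, not a definitive pipeline: the job layout, the npm-based CLI install, and the SFDX_AUTH_URL secret are assumptions to adapt to your own org and repo.
name: validate-on-push
on:
  workflow_dispatch:
  push:
    branches:
      - main
    paths-ignore:
      - 'sfdx-project.json'
      - 'README.md'
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      # Full history, so a delta plugin can diff commits if you add one later
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Install Salesforce CLI
        run: npm install --global sfdx-cli
      - name: Authenticate to the target org
        # SFDX_AUTH_URL is an assumed secret holding the org's sfdxAuthUrl
        run: |
          echo "${{ secrets.SFDX_AUTH_URL }}" > auth.txt
          sfdx auth:sfdxurl:store -f auth.txt -a targetOrg
      - name: Convert source to Metadata API format
        run: sfdx force:source:convert -d mdapi
      - name: Validate the deployment
        run: sfdx force:mdapi:deploy -d mdapi -l RunLocalTests -w -1 -c -u targetOrg
Drop the -c flag once you want pushes to actually deploy rather than just validate.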

Related

GitHub Actions: ignoring specific folders in the .yml file

I'm new to GitHub Actions.
I'm using Actions to automate deployment from my repository to my SFTP server. Everything works fine, but the action takes too much time to run, around 20 minutes. My repo contains directories like system, application, and dist, and those are the kinds of files I don't want to re-upload.
Doing a little research, I found out that certain paths can be ignored.
I found out they can be ignored with "paths-ignore", but for some reason it's not working; this is my file.
I want to ignore all the content inside application/cache, application/config, application/core, etc., or the whole folders "application/cache", "application/config", and so on.
What am I doing wrong? Is it possible to do that?
First, I have a suggestion:
Don't put all these files inside your repository; Git is for versioning code, not for storage.
paths-ignore is a config for the on: event. It means: when there are changes in your repository, except changes matching paths-ignore, start the workflow.
There is no way to cut the workflow time down dramatically, because every job creates a fresh machine that checks out your whole Git repository and then runs the workflow. But read up on the SFTP-Deploy-Action and use the local-path parameter to upload just what you want; maybe your workflow drops to 10 minutes.
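For illustration, a trigger along these lines should skip builds for those folders; note the trailing globs, since the patterns match file paths rather than directories (the branch name and folder list are assumptions based on the question):
on:
  push:
    branches:
      - master
    paths-ignore:
      - 'application/cache/**'
      - 'application/config/**'
      - 'application/core/**'
      - 'system/**'
      - 'dist/**'
Keep in mind this only decides whether the workflow starts at all; it does not shrink the upload itself, which is what the local-path parameter on the deploy step is for.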

Trigger to run TeamCity based on file existence

I want to run TeamCity processes based on file existence.
I have two TeamCity processes (Dev and Prod):
Dev should run if there is a DevParam file in the repo (or in a specified location).
Prod should run if there is a ProdParam file.
I want to run exactly one process after each push to the repository.
These files will be added and removed like this:
[0] Repository has a DevParam file
[1] Pushed, DevParam file still there -> Dev process should run
[2] Pushed, DevParam removed and ProdParam added -> Prod process should run
[3] Pushed, ProdParam still there -> Prod should run
I tried to create a trigger with rules, but failed (a rule like +:DevParam also fires on file removal).
Git recognizes adding and removing these files as a move/rename, which may be relevant.
Driving builds by file management like this is not a normal process. I strongly advise you to use a branch flow instead. For your example, use a develop branch (instead of the DevParam file) for all your developers and a master branch for prod.
Try the following advice.
The developers code in the dev branch; each developer works only on this branch.
You should create a build configuration with a trigger on your dev branch. After each new commit, the configuration will be triggered.
Once you decide the code in the dev branch is ready for production, you simply merge it all to master. And now you can also trigger the same configuration, only for the master branch.
For more information about the gitflow workflow, read this.
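As a small illustration of that setup (the branch names here are assumptions), each build configuration gets its own VCS trigger branch filter instead of a file-based rule:
Dev build configuration, VCS trigger branch filter:
+:develop
Prod build configuration, VCS trigger branch filter:
+:master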

Only run AppVeyor build on certain conditions

I have a project on AppVeyor that I want to build in two (or three) different scenarios:
Anytime it's on the master branch no matter which files changed, but not a pull request
If it is a pull request on the master branch when certain files have changed
(Maybe if it's on a different branch when certain files have changed)
Is there a way to configure appveyor.yml to do this? I'm aware of how to use APPVEYOR_PULL_REQUEST_NUMBER in one-liners, but I want to be able to apply it to the entire appveyor.yml and combine it with the only_commits: option.
You can create 2 projects on AppVeyor connected to the same GitHub repo:
For the first project, go to the GitHub repo settings, find the webhook connected to this project, and uncheck pull request.
For the second project, set only_commits for those certain files according to the commit filtering doc.
Optionally, for the first project, set skip_commits for the same files according to the same commit filtering doc, to eliminate duplicate builds.
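A rough sketch of the commit-filter side, following the commit filtering doc (the file patterns are placeholders):
# appveyor.yml fragment for the second project: build only when these files change
only_commits:
  files:
    - src/
    - appveyor.yml
skip_commits takes the same files: list, so the first project can exclude exactly what the second one includes.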

VCS capable of files versioned per user

I have already used several VCSs: CVS, SVN, and Git. One feature that I am missing cannot be found anywhere.
There are files which I would like to have in the repository, but every user should have their own version. So when you check out, you get a default for that file, and you commit your changes to it only for yourself.
Why do I want this? There are some files, like configuration, where I would like to have a default version in the repository (e.g., for building releases, or as a starting base for new team members), but changes to such a file are only relevant to a certain developer (or working copy), because it will contain paths only valid for that developer/working copy.
Currently, when I do not add these files:
- I miss them when creating a new working copy or exporting for a release build
- I have no history of changes I might have made for my own experiments
Currently, when I do add these files to the repository:
- I can never commit them, so I have a default in the repository but my file is always flagged "changed". In SVN I can add it to the "ignore-on-commit" changelist to improve things a bit.
- I might lose my very own changes to a difficult configuration file (disk crash, laptop theft, etc.)
Is there a VCS capable of this? Do SVN or Git support something along these lines that I might have overlooked?
If I understood and decomposed your task correctly as
"Have a set of default templates of something, which are the starting point of per-user customization, and these customized versions must be stored separately and be accessible only by the responsible person",
you can use this workflow (a draft, subject to modification and correction), Subversion-based for simplicity and transparent management (a strong point of any CVCS, really):
Subversion repository
Each user of the repo has their own predefined path inside the repository tree (with a common path pattern for manageability and easy automation of processes)
One special admin-only managed path also exists, not accessible by ordinary users
Our tree may look like this (where the Repository dir is the root of the repository):
z:\>dir /s /B
z:\Repository
z:\Repository\Users
z:\Repository\Template
z:\Repository\Users\Alpha
z:\Repository\Users\Bravo
For every user path we use path-based authorization, which gives each and every user access only to their own subtree in the repository.
Template contains (as the name suggests) template stubs for all of a user's documents.
Adding new users to the repo, obviously, becomes a simple and easily automated task:
svn copy Template into the new user's dir
add rw permissions for the created location for that user in the authz file
tell the user the URL of their personal tree in the repo
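Scripted, the provisioning might look roughly like this (the repository URL, user name, and authz layout are placeholders for illustration):
# Server-side copy of the template into the new user's tree
svn copy "https://svn.example.com/repo/Template" \
         "https://svn.example.com/repo/Users/Charlie" \
         -m "Provision personal tree for Charlie"
# authz fragment: only Charlie can read/write his subtree
[/Users/Charlie]
charlie = rw
* =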
I don't think the VCS is the problem here. If you have a file whose contents depend on the local environment, you should auto-generate it with a script. That way, you ignore the generated file but version the script, and each developer still gets a perfectly valid copy of the config files at run time. This is the same approach used for user-specific IDE settings, such as .suo files in Visual Studio.
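A minimal sketch of that idea, with hypothetical file names: the template and the generator are versioned, while the generated file is gitignored.
#!/bin/sh
# generate-config.sh (versioned); config.local is in .gitignore.
# Fill machine-specific values into the committed default template.
sed "s|@WORKSPACE@|$PWD|g" config.default > config.local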
Update:
If you specifically need a set of defaults, then the solution is this:
Add the defaults to the repository.
Each dev works in their own branch. This way, they can version the changes to the config files.
When rebasing onto master and/or merging, the devs simply never merge their customized configs.
You can always set up a hook to check whether the default config has been modified and, if so, maybe email the dev. You simply treat such a commit the same way you would treat a commit that does not compile.
Devs are smart. Sure, they make mistakes. But never underestimate the power of some simple communication.
Of course, when the default configs do get overwritten with Dev X's customized ones, you use the powers of Git to fix that commit immediately.
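The guard hook mentioned above can be tiny; here is a pre-commit sketch, with the config path as a placeholder:
#!/bin/sh
# .git/hooks/pre-commit: warn when the shared default config is staged
if git diff --cached --name-only | grep -qx "config/defaults.yml"; then
  echo "WARNING: you are about to commit the shared default config" >&2
  # add 'exit 1' here to block the commit instead of just warning
fi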

Jenkins: FTP / SSH deployment, including deletion and moving of files

I was wondering how to get my web projects deployed using FTP and/or SSH.
We currently have a self-made deployment system which is able to handle this, but I want to switch to Jenkins.
I know there are publishing plugins, and they work well when it comes to uploading build artifacts. But they can't delete or move files.
Do you have any hints, tips, or ideas regarding my problem?
The Publish Over SSH plugin enables you to send commands to the remote server over SSH. This works very well; we also do some moving/deleting of files before deploying the new version, and have had no problems whatsoever with this approach.
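For example, the command step of such a deployment might run something like this on the remote host before the upload (all paths are placeholders):
# Rotate the previous release out of the way, then make room for the new one
rm -rf /var/www/app/previous
mv /var/www/app/current /var/www/app/previous
mkdir -p /var/www/app/current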
The easiest way to handle deleting and moving items is to delete everything on the server before you deploy a new release using one of the 'Publish over' extensions. I'd say that really is the only way to know the deployed version is the one you want. If you want more versioning-system-style behavior, you either need to use a versioning system or maybe rsync, which will cover part of it.
If your demands are very specific, you could develop your own convention to mark deletions and have them performed by a separate script (like you would for database changes using Liquibase or something similar).
By the way: I would recommend not automatically updating your live sites after every build using the 'publish over ...' extension. In cases where we really do want a live site automatically updated, we rely on the Promoted Builds Plugin to keep it nearly fully automated while adding a little safety.
I came up with a simple solution to remove deleted files and upload changes to a remote FTP server as a build action in Jenkins, using a simple lftp mirror script (see the lftp manual page).
In short, you create a config file ~/.netrc in your Jenkins user's home directory and populate it with your FTP credentials:
machine ftp.remote-host.com
login mySuperSweetUsername
password mySuperSweetPassword
Create an lftp script, deploy.lftp, and drop it in the root of your Git repo:
set ftp:list-options -a
set cmd:fail-exit true
open ftp.remote-host.com
mirror --reverse --verbose --delete --exclude .git/ --exclude deploy.lftp --ignore-time --recursion=always
Then add an "Execute shell" build step to run lftp against the script:
lftp -f deploy.lftp
The lftp script will:
mirror: copy all changed files
reverse: push local files to the remote host; a regular mirror pulls from the remote host to local
verbose: write notes about which files were copied where into the build log
delete: remove remote files no longer present in the Git repo
exclude: don't publish the .git directory or the deploy.lftp script
ignore-time: don't decide what to publish based on file timestamps. Without this, in my case, all files got published, since a fresh clone of the Git repo resets the file timestamps. It still works quite well, though; even files modified by adding a single space were identified as different and uploaded
recursion: analyze every file rather than relying on folders to determine whether any files in them were possibly modified. This isn't technically necessary since we're ignoring timestamps, but I have it in here anyway
I wrote an article explaining how I keep FTP in sync with Git for a WordPress site I could only access via FTP. The article explains how to sync from FTP to Git, then how to use Jenkins to build and deploy back to FTP. This approach isn't perfect, but it works: it only uploads changed files, and it deletes files from the host that have been removed from the Git repo (and vice versa).