Automate the process of uploading a new build to a website, i.e. npm run build -> cPanel upload

I am managing a mostly static site through GoDaddy.
The site is a React single-page application that is still under development and occasionally needs content updates. The project folder is hosted as a public Git repository.
My goal is to automate the process of updating the site. Currently I need to:
npm run build
navigate to the build folder in Windows File Explorer
navigate to the public_html folder in cPanel, in my web browser
delete the current build files
upload the contents of the build folder into cPanel, folder by folder (cPanel will not let me upload subfolders)
I have looked through countless forum posts, blogs, etc. to find a way to automate this, but I always end up doing it manually.

You need to investigate continuous integration/continuous deployment (CI/CD) and a different hosting setup. Unfortunately, the type of platform you are using (with cPanel) is limiting and not really oriented toward your use case.
However, cPanel does have an option to use Git version control to manage the files and folders in your account. Go into this option and choose "Clone Repository", where you'll have to link a repo and specify where it should be installed. Note: it is possible that your hosting provider has disabled this feature.
I suspect that this cPanel feature does not automatically pull in changes when you update the repo, so you would probably still need to pull manually when you make changes (which is still easier than copying files over). Also note that any data you store there may be removed when cloning again.
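If the Git feature is disabled, another option is to script the build and the upload with plain FTP mirroring. Below is a minimal sketch, assuming lftp is installed and your cPanel account allows FTP access; the credentials, host, and remote path are placeholders to replace with your own:

    #!/usr/bin/env bash
    # Build the React app, then mirror the build folder to the cPanel web root.
    set -euo pipefail

    FTP_HOST="ftp.example.com"   # placeholder
    FTP_USER="username"          # placeholder
    FTP_PASS="password"          # placeholder

    npm run build

    lftp -u "$FTP_USER,$FTP_PASS" "$FTP_HOST" <<'LFTP'
    mirror --reverse --delete ./build /public_html
    quit
    LFTP

The --reverse flag uploads local to remote, and --delete removes stale files on the server, replacing the manual delete step.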

Related

How do I manage content workflow for a Hexo site?

I've used static site generators like Jekyll, hosted through GitHub Pages, just fine, using prose.io as a content management system.
I decided I wanted to go with a site using the Hexo static site generator, but I cannot seem to figure out a good workflow for publishing content.
To my understanding, this is how I'd have to do it:
write *.md text file
hexo generate
(optional) hexo serve (to see local content)
hexo deploy (to publish the public content to whatever site is configured in _config.yml; it can publish to Amazon S3, GitHub Pages, etc.)
Is there another workflow other than this?
The way I've been doing it with Jekyll + GitHub Pages is simply:
go to prose.io
write content
save (which publishes the content)
Ideally I'd like to use Hexo + GitHub Pages the same way I use Jekyll + GitHub Pages.
Basically, can GitHub generate the static files automatically, like it does with Jekyll / Ruby packages?
I figured out my own answer and posted it on my blog:
http://www.tangycode.com/Quick-Start-Guide-To-Hexo-Install/
It covers everything you need to know about setting up a Hexo blog site and managing content workflows.
One approach I am trying myself:
prose.io or similar to write and save to a GitHub repository
travis-ci.org to build the Hexo site and deploy it
This is how it works:
Edit a document in your favorite editor
Commit it to your repository
travis-ci.org detects the commit and starts working
My .travis.yml does (among a few other things) the following:
npm install hexo-cli
npm install grunt-cli
npm install inside the site's repository (Hexo plugins and dependencies)
hexo generate
grunt deploy-production
hexo deploy (I use this to keep a history of the site stored in the repo itself)
If your editor of choice can commit directly to a GitHub repository, you have exactly the same experience you had with Jekyll on GH Pages. The advantage here is that you can use third-party plugins, which GH Pages doesn't allow.
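For reference, here is a minimal sketch of those steps as shell commands; in a real .travis.yml they would sit in the install/script sections, and the grunt deploy-production task is specific to this author's Gruntfile:

    #!/usr/bin/env bash
    set -e

    npm install -g hexo-cli grunt-cli   # the CLI tools
    npm install                         # the site's own plugins and dependencies

    hexo generate                       # build the static site into public/
    grunt deploy-production             # project-specific deploy task (assumed)
    hexo deploy                         # push the generated site per _config.yml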
Alternatively, you can use INSTANT, a content management tool that works on any static website. You just install their JavaScript and can edit content directly on your website (without any admin dashboard). It saves and serves the content from the client. Pretty neat.
The easiest way is to use a hosting provider like Netlify in combination with a headless CMS, for example Headless (full disclosure: I created it).
Netlify can run the build process for you, and during that build it can fetch content from a headless CMS. Whenever you update content in the CMS, Netlify does a rebuild.
Then you have your website on the Netlify CDN and a real CMS for your content management, and you never need to dive into your code or GitHub files. And it's all free.
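As a rough sketch of how little configuration that setup needs, the build settings can be committed in a netlify.toml at the repository root; the command and publish directory below assume a plain Hexo site:

    # Create a minimal netlify.toml; adjust command/publish to your generator.
    cat > netlify.toml <<'TOML'
    [build]
      command = "hexo generate"
      publish = "public"
    TOML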

Build an open source project from GitHub (not mine) with a CI

There is an open source project (https://github.com/firebase/firebase-jobdispatcher-android) which I would like to build using Travis/CircleCI or another cloud CI. However, those CIs don't let you build repos that are not yours.
I didn't try it, but I have a hunch that I won't be able to set up a webhook either, to get notified when that repo's master branch is updated.
Why not fork? Because then I would somehow need to keep my forked repo updated manually or with a cron server! That defeats the point of open source repo builds...
Why do I want to build it continually? Because they do not upload their .aar output to Maven Central or JCenter, and I don't want to put the .aars in my project and update them all the time - it bloats the repo...
In any case, I don't get it: there's an open source project, the repo exists and is open to everyone, and pulling the data and getting webhooks doesn't compromise that repo in any way, so why isn't this possible?
If I'm mistaken and a webhook is possible, how can I set up a build that ends with an upload to Maven Central (probably via a Gradle plugin; I have an account and would be happy to have a public copy there)?
(I also thought of some kind of free microservice + a Docker-based CI with which I can pull and build whatever I want; I don't mind if a build takes time.)
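For what it's worth, the fork-free idea can be sketched as a scheduled job in your own CI that clones the upstream repo and builds it; everything below (a cron trigger configured in your own stub repo, the upstream project shipping the Gradle wrapper) is an assumption:

    #!/usr/bin/env bash
    # Scheduled CI job: clone the upstream repo you don't own, build it,
    # and pick up the .aar output. Paths and tasks are assumptions.
    set -e

    git clone --depth 1 https://github.com/firebase/firebase-jobdispatcher-android.git
    cd firebase-jobdispatcher-android
    ./gradlew assemble   # .aar ends up under */build/outputs/aar/

Uploading the result to Maven Central would additionally need a publishing plugin and signing credentials on your side.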

Workflow for Working on Moodle As a Team

A friend and I want to work, remotely, on a Moodle website. I have the application installed locally, but I'm not sure what I should commit to our Git repository and what I shouldn't.
I'm looking for a workflow that allows someone to clone the code, run a CLI command or something, and be up and running. Since our machines are development machines, I'm trying to keep the number of steps low (i.e., we can both just share the same configuration file).
You may want to take a look at Moosh (https://github.com/tmuras/moosh).
It can script a lot of the things I think you want to do (set settings, create courses and users, etc.).
You could then create a command-line script that calls Moosh and prepares a lot of the settings you will eventually want on your live site (what it cannot do is take the settings from an existing site and apply them automatically to a new one).
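A sketch of what such a script could look like; the command names come from the Moosh project, but the exact arguments here are illustrative guesses, so check Moosh's own help output before relying on them:

    #!/usr/bin/env bash
    # Provision a freshly cloned local Moodle; run from the Moodle root.
    # Course/user names and the debug value are illustrative only.
    set -e

    moosh user-create dev1 dev2     # throwaway developer accounts
    moosh course-create sandbox     # a sandbox course to work in
    moosh config-set debug 32767    # full developer debugging (illustrative)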

What grunt files to upload to repo vs files to upload when deploying site to production

So, I have a webapp I am creating using the three musketeers: Yeoman, Grunt and Bower.
My questions are:
What is best practice when it comes to pushing my webapp to a git/mercurial repo? Do I include the entire project? What about directories like node_modules or test, etc.?
Also, when deploying to the live production site: will my dist folder be what I should upload?
My research has yielded no results (I could be searching for the wrong things?). I'm a bit new to this process, so any feedback is greatly appreciated. Thanks!
You should always commit all of your Yeoman, Grunt, and Bower config files.
There are two schools of thought on committing the output they produce or the dependencies they download:
One is that you should commit everything needed for another user to deploy the web app after cloning the repository, without performing any additional operations. The idea is that dependencies may not exist anymore, network connections might be down, etc.
The other is to keep the repository small and not commit node_modules, etc., since they can be downloaded by the user.
As far as the dist folder goes, yes, it is what you'll upload to your server, as it contains all of your minified files. Whether or not to commit it to the repository is a separate question. You might let users build every time, assuming they can get all the dependencies one way or another (per the choice above). Or you might commit it in order to tag it with a release version along with your source code.
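If you follow the second school of thought, a minimal sketch of the ignore rules might look like this (directory names assume the default Yeoman/Grunt/Bower layout; adjust to your project):

    # Keep the repository small: ignore downloaded and generated artifacts.
    cat > .gitignore <<'EOF'
    node_modules/
    bower_components/
    dist/
    .tmp/
    EOF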
There's some more discussion on this here: http://addyosmani.com/blog/checking-in-front-end-dependencies/

Eclipse / Aptana File Sync Solutions

Our development team uses Eclipse + Aptana to do their web development work. Currently, most of them map their Eclipse projects directly to the web server. I'd rather they create a local project and use that to sync to the web server project directory they are working on.
The issue is that there aren't any good solutions, which is just appalling given the popularity of the two.
The FileSync plugin for Eclipse is only one-way. Meaning if one developer changes a file on the server, another dev isn't even notified and could overwrite the change.
The File Transfer option in Aptana 2.0 doesn't support any sort of sync, just manual uploading/downloading of files.
The Sync option in Aptana 1.5.1 doesn't let you merge files when they differ; you can only overwrite one or the other. It does, however, let you view a diff (but only if you right-click and select it), and in that diff you can't make any changes.
I did find a way to have files uploaded to their sync repositories in Aptana using Eclipse Monkey. However, it doesn't work if a user saves multiple files at once ('Save All' - again, it doesn't work). Additionally, there is no notification if a user opens a local file that has an updated copy on the server. I tried to add one using Eclipse Monkey, but I couldn't find any listener in the Eclipse API to do it, and Eclipse Monkey documentation is few and far between.
My only solution at this point is to let them continue to map directly to the server, or to ask them to do a manual download before they do any work (but again, what if someone uploads a change right after they do that?).
Anyone have any ideas?
April 2010
Add EGit to your Eclipse+Aptana setup, and:
let developers push their work to a local bare repo (see also this post)
let your local project be updated by a git pull from that same local bare repo, creating/updating a local working directory with the sources merged/updated (or by using a post-update hook, as described in my previous SO link)
let your local Aptana + Eclipse (+ EGit) reference that local working directory, which is also used by your web server.
In short, when you are speaking of file synchronization + merges, this is a job for a (D)VCS (version control system, centralized or distributed).
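A minimal sketch of that setup, with a post-update hook that checks each push out into the directory the web server serves (all paths here are placeholders):

    # One-time setup on the shared machine (path is a placeholder).
    git init --bare /srv/repos/site.git

    # post-update hook: after each push, check master out into the web
    # root that Aptana/Eclipse and the web server both use (placeholder).
    cat > /srv/repos/site.git/hooks/post-update <<'EOF'
    #!/bin/sh
    GIT_WORK_TREE=/var/www/site git checkout -f master
    EOF
    chmod +x /srv/repos/site.git/hooks/post-update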
Oct 2011: as xmedeko mentions in the comments, Aptana 3 has its own Git plugin, and it isn't very compatible with EGit: see bug 1988.
Adding to VonC's answer (which is correct, IMHO): what probably lies beneath this scenario is that the process you have adopted is not correct in itself, apart from the tools used.
If I understood correctly, you should not allow or perform a direct upload from a development version of the project to the web server. Merging is not a job for remote synchronization tools, and it should happen well before the deployment phase (an upload to the web server is practically a deploy).
You should have a dedicated repository taken from some point in the development history (according to your release timeline), a point where the merge has already happened. Then deploy it (by means of file synchronization if you want, but that is not mandatory) to a local/staging web server.
Run there whatever tests you run against the actively running website (i.e., integration and/or functional tests). If there are bugs to fix, there are different ways to apply the fixes to the development and staging code repositories. Only after that do you deploy the staging repository to the production web server (again, synchronization tools are one way to do that).
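As an illustration of that final promotion step, a hedged sketch using a synchronization tool (host names and paths are placeholders):

    #!/usr/bin/env bash
    # Promote the tested staging tree to the production web server.
    set -e
    rsync -az --delete /var/www/staging/ deploy@www.example.com:/var/www/production/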