Moodle 2.x task automation to create / copy courses - moodle

Is there a good, clean way in Moodle to programmatically create new courses and copy old courses? At the moment I'm trying to figure out what's possible with Moodle's web services, since the CLI scripts are lousy. But it seems that web services only allow creating courses; copying one doesn't appear to be possible.
Thanks

For a site administrator, automated course backups are more expensive in terms of time, CPU usage and storage, and the recovery time to get a site running again is longer than with a whole-site backup. However, teachers and site administrators may find course backups a useful way to create a "fresh" copy of a course that can be re-used.
Use the default Moodle Backup & Restore functionality; it's the easiest option. For creating courses you can also use a web service.
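For the creation half, a minimal sketch of calling the REST web service from a script might look like this (assuming web services and the REST protocol are enabled and you have a token for a user with course-creation rights; the URL and token are placeholders, and on early 2.x releases the function is named moodle_course_create_courses rather than core_course_create_courses):

```python
# Minimal sketch: create a course via Moodle's REST web service.
# URL, token and course values are placeholders.
import requests

MOODLE_URL = "https://moodle.example.com/webservice/rest/server.php"
TOKEN = "your-webservice-token"

params = {
    "wstoken": TOKEN,
    "wsfunction": "core_course_create_courses",  # moodle_course_create_courses on early 2.x
    "moodlewsrestformat": "json",
    # Moodle's REST server expects arrays flattened like this
    "courses[0][fullname]": "My New Course",
    "courses[0][shortname]": "mynewcourse",
    "courses[0][categoryid]": 1,
}

response = requests.post(MOODLE_URL, data=params)
response.raise_for_status()
print(response.json())  # id/shortname of the created course, or an exception structure
```

Copying a course would still go through Backup & Restore as suggested above.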

Related

PostgreSQL: How do I setup a local server / client environment for initial database experimentation

I've recently decided to embark on a fun / educational personal project to create some data visualizations and power metrics for my fantasy football league. Since ESPN doesn't provide an API, I've decided to use a combination of elbow grease and the nfldb to pull relevant data (and am hoping to get familiar with Plotly for presenting the data). In setting up nfldb, I'm also getting my first exposure to databases, using postgresql in particular (as required by nfldb).
Since the installation guide provided by nfldb is Linux-centric and assumes a fair bit of previous database experience, I've looked to this guide for help and blindly followed its instructions in hopes of sidestepping postgresql (aka the "just make it work" "solution"). Of course, that didn't work, and I have no idea how to diagnose the problem(s), so I've decided to go ahead and use this opportunity to get a little familiar with databases / postgresql.
I've looked to the postgresql documentation for guidance. Having never worked in a server / client environment, the following text (from "18.1. The PostgreSQL User Account") has me particularly confused:
As with any server daemon that is accessible to the outside world, it is advisable to run PostgreSQL under a separate user account. This user account should only own the data that is managed by the server, and should not be shared with other daemons. (For example, using the user nobody is a bad idea.) It is not advisable to install executables owned by this user because compromised systems could then modify their own binaries.
To add a Unix user account to your system, look for a command useradd or adduser. The user name postgres is often used, and is assumed throughout this book, but you can use another name if you like.
I'd really appreciate a well-annotated version of these paragraphs. How does it apply to someone like me, storing and accessing data on the same machine? Do I need to create a new system user account? How do I make sure it "only owns the data that is managed by the server"? Where is the responsible location to install postgresql? Am I exposed to some sort of security risk by downloading the nfldb database? Why is the user nobody a bad idea?
Relevant: I am using a Mac (v10.11.6) and plan to install (or re-install, if necessary) postgresql using Homebrew.
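For concreteness, on a single machine the "server/client" split just means the PostgreSQL server process runs in the background (Homebrew installs typically run it under your own macOS account, whereas Linux packages create a dedicated postgres user) and your own programs connect to it as clients over a local socket or localhost. A minimal sketch of such a client connection, assuming the server is running and an nfldb database and role already exist (names and password are placeholders):

```python
# Minimal sketch: connect to a local PostgreSQL server as a client.
# Assumes the server is already running and the nfldb database/role exist.
import psycopg2

conn = psycopg2.connect(
    host="localhost",   # the server happens to run on the same machine
    dbname="nfldb",
    user="nfldb",
    password="secret",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])
conn.close()
```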

Scheduled export of data from Dynamics CRM

I hope someone can help me. I would like to find a way to do a scheduled export of select data from Microsoft Dynamics CRM (Online), preferably to a CSV file, with the export automated at a recurring time (at least once a day) so that it exports to a specified location without any user interaction.
I'm aware of Scribe for example but that is very expensive and I need a cheaper solution. Any ideas for scheduled and unattended exporting from Dynamics?
As Guido Preite mentioned, your best bet is to get the CRM SDK. Since cost is an issue with turn-key third-party software, the SDK is a good alternative if you have a little time to get familiarized with it. There are a lot of good examples straight from MSDN and the SDK documentation to get you up and running quickly; start here. Basically, what you could do is create a simple console app that queries the data you need, then saves it off to a file. This could then be scheduled via Task Scheduler.
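On newer online instances that expose the Web API, the same query-and-dump idea can also be sketched without the C# SDK; the org URL, entity, columns, and token acquisition below are all assumptions, and the script would be scheduled with Task Scheduler or cron just like the console-app approach:

```python
# Rough sketch: pull a few account columns from the Dynamics Web API and write a CSV.
# Org URL, columns and the bearer token (e.g. obtained via an OAuth flow) are placeholders.
import csv
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"
ACCESS_TOKEN = "bearer-token-from-your-oauth-flow"

resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/accounts?$select=name,accountnumber",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json",
    },
)
resp.raise_for_status()

columns = ["name", "accountnumber"]
with open("accounts.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    for record in resp.json()["value"]:
        writer.writerow({col: record.get(col) for col in columns})
```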
Scribe is a good solution, but isn't cheap as you say.
I've used KingswaySoft to do scheduled data imports and exports with CRM. See http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-crm.
It's a good product and is cheaper than Scribe. No coding is required, although you'll need some experience of SQL Server Integration Services (SSIS).
Flatly lets you auto-export data from Microsoft Dynamics Online to CSV (placed in cloud storage) every 10 minutes, hourly, or daily. It takes 5 minutes to set up. flatly.io
Disclosure: I work at Flatly.

Heroku database backup storage when database is not active

I'm considering using Heroku as a platform for a project I'm working on. This project will have many independent databases (postgres). Each database will spin up when someone is using it, then save the data to a dump file and spin down when no one is logged on (if all these databases are always active it will be colossally expensive).
Unfortunately I have no experience with Heroku and their documentation has an annoying marketing slant to it--I can't figure out if this is possible. How do I pay for the storage of backups? Is it possible to store backups without an associated running database?
My alternative is to build this on Amazon, but I'd rather not do all this engineering myself.
Many thanks in advance.
The Postgres schemas approach might be a good fit for your multi-tenancy use case.
This Blog Post and RailsCast might help you further.
Spinning up multiple databases sounds like fighting the platform's defaults; what concerns is that meant to address?
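For illustration, a minimal sketch of the schema-per-tenant idea on a single always-on Postgres database (so you are not paying for one database per tenant), using psycopg2; the tenant name and connection string are placeholders:

```python
# Minimal sketch: one schema per tenant inside a single Postgres database.
import os
import psycopg2

# On Heroku the DSN comes from the DATABASE_URL config var; the fallback is a local placeholder.
conn = psycopg2.connect(os.environ.get("DATABASE_URL", "postgresql://localhost/myapp"))

with conn, conn.cursor() as cur:
    # provision a tenant: a schema instead of a whole database
    cur.execute("CREATE SCHEMA IF NOT EXISTS tenant_acme;")
    cur.execute(
        "CREATE TABLE IF NOT EXISTS tenant_acme.notes (id serial PRIMARY KEY, body text);"
    )

with conn, conn.cursor() as cur:
    # per request/session: point the connection at the current tenant's schema
    cur.execute("SET search_path TO tenant_acme;")
    cur.execute("INSERT INTO notes (body) VALUES (%s);", ("hello",))

conn.close()
```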

What is the best way to setup a development & production environment for a PHP/MySQL app?

I've been developing a web app on my local MAMP setup for the last few months. Now I am ready to launch it while continuing to add enhancements/fixes. So, I am wondering: what is a good way to implement a development AND production server in order to efficiently manage updates, prevent overwrites, and seamlessly add other developers into the workflow? I also want something that has a minimal learning curve for me. Personally, for whatever reason, I've never been able to fully grasp version control systems like Git or SVN, so I am hoping for an easier solution until I am able to invest more into the business.
As I see it, the options that I have are:
Spend more time learning Git before launching, and hope that I don't break anything while further developing my app.
Buy two hosting accounts, one for Dev and one for Prod, where only I can do the deployments into Prod. I suppose I'd have to keep track, in a spreadsheet, of all the files we've modified that are deemed ready for deployment.
Edit directly on the live server via FTP (no Dev server).
Are there any other options that you can recommend? I've heard that there are some new types of Web Hosting companies that can do the heavy lifting...
While I have personally had good experiences using svn/git for multi-developer websites, I can understand your reticence to start relying on something you are not entirely familiar with. Unfortunately, I do believe that is your best option, but failing that, you might try using subdomains. My former employer would create a test area on the disk and point beta.thedomainname.com at it. When bug fixes or upgrades were complete and verified to be working in the beta directory, the entire directory would be copied over to the live domain. Not the most elegant solution, but it worked. It certainly is cheaper than buying two hosting accounts.
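If you go the beta-subdomain route, the "copy the verified beta directory over the live site" step can be a small script; a rough sketch (paths and the excluded config file are assumptions, and this is no substitute for real version control):

```python
# Rough sketch: promote the verified beta directory to the live docroot.
# Requires Python 3.8+ for dirs_exist_ok; paths and the config filename are placeholders.
import shutil

BETA_ROOT = "/var/www/beta.thedomainname.com"
LIVE_ROOT = "/var/www/thedomainname.com"

shutil.copytree(
    BETA_ROOT,
    LIVE_ROOT,
    dirs_exist_ok=True,                           # overwrite files already on the live site
    ignore=shutil.ignore_patterns("config.php"),  # keep the live site's own config
)
```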

Load CMS core files from one server onto multiple servers

I'm almost done with our custom CMS system. Now we want to install it for different websites (and more in the future), but every time I change the core files I will need to update each server/website separately.
What I really want is to load the core files from our server, so that when I install the CMS I only define the needed config files (on that server) and the rest is loaded from our server. This way I can push changes to the core very simply, and only once.
How do I do this, or is this a completely wrong approach? If so, what is the right way? Are there things I need to look out for? Is it secure (without paying thousands for an https connection)?
I have no idea at all how to start or where to begin, and couldn't find anything helpful (maybe I searched for the wrong terms), so any pointers are helpful!
Thanks in advance!
Note: My application is built using the Zend Framework
You can't load the required files remotely at runtime (or you really don't want to ;) ). This comes down to proper release & configuration management, where you update all of your servers. But this can mostly be done automatically.
Depending on how much time you want to spend on this mechanism, there are some things you have to be aware of. The general idea is that you have one central server which holds the releases, and all other servers check it for updates, then download and install them. There are lots of possibilities (svn, archives, ...), and the check/update can be done manually from the frontend or by cron jobs in the background. Usually you'll update all changed files except the config files and the database, as those can't simply be replaced but have to be modified in a certain way (this is where update scripts come into play).
This could look like this:
A cronjob runs on each server and checks for updates via svn
If there is a new revision, it does an svn update
This is a very easy mechanism to implement, but it has some drawbacks, e.g. you can't change the config files or the database. Well, in fact it would be possible, but quite difficult to achieve.
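A minimal sketch of that svn-based flow as it might run from cron (the webroot path is a placeholder and is assumed to be a working copy of the central repository; config files would be kept out of version control):

```python
# Minimal sketch: cron-driven "svn update" of the CMS install.
import subprocess

WEBROOT = "/var/www/cms"  # placeholder install path (an svn working copy)

result = subprocess.run(
    ["svn", "update", WEBROOT],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # e.g. "At revision 1234." or the list of updated files
```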
Maybe this could be easier with an archive-based solution (a rough sketch follows below):
Cronjob checks the update server for a new version. This could be done by reading the contents of a file on the update server and comparing it to a local copy
If there is a new version, download the related archive
Unpack the archive and copy the files
With that approach you might be able to include update scripts in the updates to modify configs/databases.
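A very small sketch of that check-and-download loop (the update-server URL, version-file format, and paths are all assumptions):

```python
# Rough sketch: compare a remote version file with the local copy and,
# if it changed, download and unpack the matching release archive.
import tarfile
import urllib.request
from pathlib import Path

UPDATE_SERVER = "https://updates.example.com"   # placeholder
INSTALL_DIR = Path("/var/www/cms")              # placeholder
VERSION_FILE = INSTALL_DIR / "VERSION"

remote_version = urllib.request.urlopen(f"{UPDATE_SERVER}/VERSION").read().decode().strip()
local_version = VERSION_FILE.read_text().strip() if VERSION_FILE.exists() else ""

if remote_version != local_version:
    archive_path, _ = urllib.request.urlretrieve(f"{UPDATE_SERVER}/cms-{remote_version}.tar.gz")
    with tarfile.open(archive_path) as tar:
        tar.extractall(INSTALL_DIR)   # config files are assumed to be excluded from the archive
    VERSION_FILE.write_text(remote_version)
    # this is where an update script shipped in the archive could be run
```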
Automatic update distribution is a very, very complex topic, and these are only two very simple approaches. There are probably many different solutions out there, and "selecting" the right one is not an easy task (it gets even more complex if you have different versions of a product with dependencies :), and there is no "this is the way it has to be done".