Hello, I'm new to Moodle plugin development and I want to make some changes to an existing plugin. My problem now is: how can I test the Moodle plugin I am developing?
Install a local copy of Moodle ( https://docs.moodle.org/en/Installing_Moodle ) or use a Docker image ( https://github.com/moodlehq/moodle-docker ).
Add your plugin code to the relevant subdirectory of the locally installed copy of Moodle.
Configure the site with some suitable test data (or take a copy of your production database and remove user email addresses). Log into the site with suitable user accounts and test your plugin.
Ideally, you should also write (and run) both Unit tests ( https://docs.moodle.org/dev/PHPUnit ) and acceptance tests ( https://docs.moodle.org/dev/Running_acceptance_test ).
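For reference, the usual workflow for both test frameworks looks roughly like this, run from the Moodle root of your local install (local/myplugin is just a placeholder for your plugin's path and component name):
php admin/tool/phpunit/cli/init.php
vendor/bin/phpunit local/myplugin/tests/sample_test.php
php admin/tool/behat/cli/init.php
The Behat init script prints the exact vendor/bin/behat command to run for your site; add --tags=@local_myplugin to run only your plugin's scenarios.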
Related
I'm trying to set up a Local Development Server for Lightning Web Components according to this link, but when I try to install the plugin @salesforce/lwc-dev-server I get this error message:
Code: ShellParseError
and a .js file named npm-cli.js opens in my editor with this content:
#!/usr/bin/env node
require('../lib/cli.js')(process)
Does anyone know what to do? Thanks in advance.
In general, local development for Lightning Web Components is still in beta: Local Development (Beta)
However, even the beta version can now be used relatively reliably. To set up local development you only need to authorize an org and install the development server. This allows you to develop locally without the need to push your components to an org first.
The local development server and its configuration are provided by a Salesforce CLI plugin. Before you install the plugin make sure you are using the latest Salesforce CLI version by running:
sfdx update
Then the lwc-dev-server plugin can be installed as follows:
sfdx plugins:install @salesforce/lwc-dev-server
After installing the plugin, to start the server on http://localhost:3333 and access all components of the project just run:
sfdx force:lightning:lwc:start
There is even a short official guide on how to set it up: Set Up LWC Local Development
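If you want to double-check that the install worked before starting the server, listing the CLI's installed plugins should show the entry (just a sanity check):
sfdx plugins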
I am able to test out my confluence cloud plugin with the base url set to:
{{localBaseUrl}}
This loads the plugin fine in my Confluence test environment that's hosted on my_project.atlassian.net...
When I update the baseUrl in my atlassian-connect.json, it's unable to load the same plugin when I run npm start and have the plugin files uploaded to the S3 account (it throws: Tunnel 0e6e0931.ngrok.io not found). It's able to create the marketplace listing, though. How do I test my plugin to make sure it'll work in production?
Turns out if I create a private listing, I can view the app page, grab the install URL and install via UPM. This verifies that the production app works before going live.
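One extra check that helped before going live (the URL below is a placeholder for wherever your descriptor is actually hosted): fetch the descriptor from its production location and confirm that baseUrl points at the public host rather than the ngrok tunnel, e.g.
curl https://my-app.example.com/atlassian-connect.json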
I have a client site, a published DNN website, hosted on a local server on the organization's premises.
I have developed a new module, using the source code, that fetches data using a stored procedure.
This is what I did to transfer the changes to the live site:
1. I created the stored procedure manually on the live site's DB.
2. I published the site and replaced the content of the website folder with my published site.
3. I added the new module in the live site using the "Module Definitions" option found under the "Host" menu
Everything works perfectly in the development environment, but when I publish the site and apply the update on the live server, the module returns no data.
Can someone guide me through how to move the stored procedure and my newly published site to the live site?
I would recommend creating your module as an installable zip file, including SQL scripts for creating the stored procedures and any other DB changes required.
There are a number of tutorials around that will help with this, and the templates built by Charles Nurse are a great help in getting your head around good practice. (http://www.charlesnurse.com)
In the meantime, have you checked the permissions on the SP in the live database? That may well be your issue.
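For example, granting execute rights to the account the site connects with would look something like this (server, database, procedure and user names here are just placeholders for your own):
sqlcmd -S YourDbServer -d YourDnnDatabase -Q "GRANT EXECUTE ON dbo.YourStoredProcedure TO YourSiteDbUser"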
James
I am developing an application that has customer-specific configuration (2 text and 2 binary files). The use case is that a customer downloads an installation package (I am going to use install4j) and installs it on the target platform (Mac or Windows). So all installation packages should be different for different customers.
I am considering 2 possible scenarios for implementation:
Generate a new installation package per customer request on the server side (con: I need to have install4j for Linux, which is the server platform)
Have a half-generated installation package and somehow inject the customer data into the package on customer request (con: I am not sure this is possible at all)
I have never used install4j before and don't know how to implement option 1 or 2. Their documentation is far from ideal: they don't have examples or cover cases like this, so any suggestion is very much appreciated.
You cannot modify an installer after it has been built. The main reason is that it would break code signing. So you would need to generate a new installer for each configuration. If you deploy on Mac OS X and Windows, you need install4j Multi-Platform Edition which also works on Linux.
Alternatively, you could ask the user to provide credentials in the installer, then you could download the appropriate files on demand with "Download file" actions.
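If you go with the first scenario, the Multi-Platform Edition ships a command-line compiler (install4jc) that you can run on your Linux build server and feed per-customer compiler variables. Roughly like this (the variable names are made up; see the command-line compiler chapter of the manual for the exact options):
install4jc -r 1.0.42 -D customerName=Acme,customerId=42 myproject.install4j
Each run then produces a fresh installer containing that customer's configuration, with code signing intact because the installer is built rather than patched afterwards.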
We are working with various cloud platforms (e.g. Salesforce) and we need to sync with the server every day. I would like to know whether there is a way, on our development boxes, to synchronize all Eclipse projects through some script without opening the IDE, and then open the IDE without much freezing.
This would enable a clean sync (with the cloud server) and a refresh of local files.
This would enable a refresh (for non-cloud servers).
Would running a small Ant script, or some other kind of script, give all developers a stable, consistent development environment?
Any help would be appreciated.
It's going to GREATLY depend on what cloud platforms you are using. HOWEVER, I work with the Salesforce platform. They offer (per their dev docs) an Ant API jar that allows you to write Ant scripts that can essentially check out everything in your org.
Essentially you can use it to check out and check back in pieces and parts of the website. This of course only works for SFDC; for other platforms you will need to refer to their APIs or write your own tools.
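For example, with Salesforce's ant-salesforce.jar on the Ant classpath and a build.xml based on their sample (the target and property names below come from that sample and may differ in yours), a nightly checkout could be as simple as:
ant retrieveUnpackaged -Dsf.username=dev@example.com -Dsf.password=secretPasswordPlusToken
Point the retrieve target at the directories your Eclipse projects use, so the files are already up to date before anyone opens the IDE.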