Moodle service hooks and integrating with another system - moodle

Does Moodle support service hooks (or some kind of trigger) so that another system can be updated when events are triggered by users?
The company we'd like to use Moodle for already has a large database storing all of their learning and course management data, but they would now like to move many of their courses to an e-learning implementation. However, they cannot move away from their current system, as not all courses and learning will be e-learning. My intention is therefore to keep the current system as the primary store for all these details, but use Moodle to provide the e-learning side of things, feeding information back to the primary store.
If a user on my Moodle site completes a task, signs up for a course, etc., is there a way to "broadcast" this event to the other system running in another environment, providing all the details of the event? I'm hoping to create a REST API in the current system that will handle HTTP requests from my Moodle site.
Many thanks for any feedback

You can use the Events API - https://docs.moodle.org/dev/Events_API
Create a local plugin, e.g.:
/local/external_update
Create an events.php:
/local/external_update/db/events.php
which contains something like this - e.g. for the user_enrolled event:
$handlers = array(
    'user_enrolled' => array(
        'handlerfile'     => '/local/external_update/lib.php',
        'handlerfunction' => 'local_external_update_user_enrolled',
        'schedule'        => 'instant',
        'internal'        => 1,
    ),
);
Then in /local/external_update/lib.php have
function local_external_update_user_enrolled($eventdata) {
    // Do some REST stuff.
}
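For the "REST stuff", here is a minimal sketch of what that handler body could look like using plain PHP cURL, assuming the primary system exposes an endpoint such as https://example.com/api/enrolments (the URL and payload fields are placeholders for whatever its API actually expects):

function local_external_update_user_enrolled($eventdata) {
    // Hypothetical endpoint on the primary learning-management store.
    $url = 'https://example.com/api/enrolments';

    // Which fields $eventdata carries depends on the event that fired.
    $payload = json_encode(array(
        'userid' => $eventdata->userid,
    ));

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);

    // Returning true tells the events system the handler succeeded.
    return true;
}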
Then create a version.php and install the plugin; the event handler will then be registered.
There are a few events already, but if you can't find an existing one it's easy to create an event - just add it to the core code if necessary:
events_trigger('event_name', $eventdata);

Related

Clean architecture and user logins in Flutter - how do I store user information?

I've been trying to use Reso Coder's Flutter adaptation of Uncle Bob's Clean Architecture.
My app connects to an API, and most requests (other than logging in) require an authentication token.
Furthermore, upon logging in, user profile information (like a name and profile picture) is received.
I need a way to save this data upon login, and use it in both future API requests and my app's UI.
As I'm new to Uncle Bob's Clean Architecture, I'm not quite sure where this data belongs. Here are the ideas I've come up with, all involving storing the data in a User object:
1) Store the User in the repository layer, in an authentication feature directory. Other repository-level methods can pass it to the appropriate data source methods.
This seems to make the most sense; other repository-level methods that wrap other API calls can use the stored User easily, passing it to methods in the data source layer. If this is the way to go, I'm not quite sure how other features (that use the API) would access the User - is it okay to have a repository depend on another, and to pass the authentication repository to the new feature repository?
2) Store the User in the repository layer, in an authentication feature directory. Other (non-login) use cases can depend on both this repository and on one relevant to their own feature, passing the User to their repository methods.
This also breaks the vertical feature barrier, but it may be cleaner than idea 1.
For both these ideas, here's what my repository looks like:
abstract class AuthenticationRepository {
  /// The current user.
  User get currentUser;

  /// True if logged in.
  bool get loggedIn;

  /// Logs in, saving the [User].
  Future<void> login(AuthenticationParams params);

  /// Logs out, disposing of the [User].
  Future<void> logout();

  /// Same as [logout], but logs out of all devices.
  Future<void> logoutAll();

  /// Retrieves stored login credentials.
  Future<AuthenticationParams> retrieveStoredCredentials();
}
Are these ideas "valid", and are there any better ways of doing it?
I see another option to tackle the problem. The solution I want to talk about comes from domain-driven design and is an event-based approach.
In DDD you have the concept of a bounded context. A business object (Uncle Bob's entity) can have different meanings in different bounded contexts. Take a look at your user business object. The data and methods that one use case needs are often different from the data and methods that other use cases use. That's why you have different user objects in different bounded contexts. They are a kind of perspective that each use case has on the same business object.
If a business object is modified in one bounded context it can emit a business event. Another feature can listen to those events. The event mechanism can either be a simple observer pattern or, if you need to distribute your application features via microservices, a message queue. In case you use a simple observer pattern, the event emitter and event handler can run within the same data source transaction. But they can also run in different ones. It depends on your needs.
So when the sign-up use case registers a new user it emits a UserSignedUpEvent. Other features can now listen to this event. The event carries the information about the user, like the email, the name, the profile image and other information that the user provided during sign-up. Other features can now save the piece of data they need to their own data source. It can be the same one the sign-up use case uses (just other tables or another schema). But it is also possible that it is a completely different data source, maybe even another kind of data source like a NoSQL db. The part I wrote above about transactions is of course more difficult if you have different data sources.
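As a rough Dart sketch of the observer-pattern variant described above (UserSignedUpEvent, DomainEventBus and ProfileFeature are illustrative names, not from any existing package):

import 'dart:async';

/// Event emitted by the sign-up use case (illustrative).
class UserSignedUpEvent {
  final String userId;
  final String email;
  final String name;
  UserSignedUpEvent(this.userId, this.email, this.name);
}

/// A tiny in-process event bus implementing the observer pattern.
class DomainEventBus {
  final _controller = StreamController<Object>.broadcast();

  void publish(Object event) => _controller.add(event);

  Stream<T> on<T>() => _controller.stream.where((e) => e is T).cast<T>();
}

/// Another feature keeps only the subset of user data it needs.
class ProfileFeature {
  ProfileFeature(DomainEventBus bus) {
    bus.on<UserSignedUpEvent>().listen((event) {
      // Persist event.name / event.email to this feature's own data source.
    });
  }
}

void main() {
  final bus = DomainEventBus();
  ProfileFeature(bus);
  // Inside the sign-up use case, after the user has been persisted:
  bus.publish(UserSignedUpEvent('42', 'jane@example.com', 'Jane'));
}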
The main point is that each feature has its own data and manages it. It might be a copy of the whole user information, but in a lot of cases it is only a subset.
The event-based approach can give you perfect modularization. But as always when something looks great, it comes at a cost: you have to duplicate some or even all of the data. In a microservice architecture where some features live in different microservices, this duplication increases the availability of the service: it can keep operating even when the main service that manages the data is down, because a local copy exists. But now you have to deal with consistency issues - eventual consistency.
At this point I'd like to stop and point you to other sources for the details:
Chapter 8: Domain Events, Implementing Domain-Driven Design, Vaughn Vernon
The many meanings of event-driven architectures, Martin Fowler

Flutter – question about architecture, providers and fetching data from server

I am a rather fresh Flutter programmer so please excuse any flaws in the questions below…
I am struggling with a structural/architectural dilemma. Here is the background:
App rationale:
my app allows its users to check small jobs available in their area and, if they have time and are in the right location, to take on a job for remuneration,
the app uses a standard REST API (not Firebase), so the server cannot be relied on to send status-change notifications that trigger re-fetching of data,
the critical elements are (1) an up-to-date list of jobs for a given address - another user may have already taken on a job at that address (so a timed refresh of the list, e.g. every 5 mins), and (2) the app needs to keep track of the user's location and ask the server for jobs accordingly if the user relocates by more than 2 km in less than the refresh interval,
The challenge:
I guess that at the basic level the app should have the following providers: (1) auth - providing the authToken, (2) geolocation - regularly checking the user's location, (3) jobList - for a particular location (fetches high-level job descriptions and addresses), (4) jobDetails - fetches exact instructions for carrying out a particular job,
as you can see, (2) geolocation and (3) jobList need to refresh programmatically (at an interval or on some change of geolocation), while (1) auth and (4) jobDetails are triggered by the user.
The Big Question ;) is … what is the proper architecture for the above type of app? More specifically:
should I use services for connecting to the server API, which would in turn be used by the providers?
how to ensure a programmatic refetch of jobList on a timer and on a relocation event from geolocation?
how to continually listen to location changes to detect a relocation without overwhelming the app with processing?
should I store the (quickly outdating) jobList data just in its object class, or should I use a settings provider or a local db, or maybe there is an easy way of storing the latest JSON response so I don't have to build the settings provider or db mapping?
in all my calls to the Auth API I need to provide the deviceId - how do I make it available across the app? This is pretty static but is needed in authentication, so should checking it be part of the auth provider?
If you could comment on the above or suggest a source of relevant examples I would be really grateful.
Thanks and cheers!
Here are my thoughts:
how to continually listen to location changes to detect a relocation without overwhelming the app with processing?
You can rely on a third-party package to do this for you, such as geolocator. With it, you can specify the distance the user must have moved before the package notifies you of the change in user location.
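As a rough sketch - the exact geolocator API differs between plugin versions, so treat the names below as an illustration of the distanceFilter idea rather than a definitive snippet:

import 'package:geolocator/geolocator.dart';

// Emits a new Position only after the device has moved roughly 2 km.
Stream<Position> relocationStream() {
  return Geolocator.getPositionStream(
    locationSettings: const LocationSettings(
      accuracy: LocationAccuracy.high,
      distanceFilter: 2000, // metres
    ),
  );
}

void listenForRelocation() {
  relocationStream().listen((Position position) {
    // Trigger a jobList refetch for the new coordinates here.
  });
}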
should I store the (quickly outdating) jobList data just in its object class or ...
Since a job listing app is likely to use this data often and in various places, I would prefer to use a db. It would be helpful in the long run too, if you plan to have some sort of analytics done on the mobile end or to gather any insights.
in all my calls to the Auth API I need to provide the deviceId - how do I make it available across the app ...
When your app is initialized, you could fetch the deviceId and store it in shared_preferences. Then in the Auth API calls, you could just retrieve it before making the request.
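A minimal sketch of that storage side, assuming the shared_preferences package (the 'deviceId' key is arbitrary, and obtaining the ID itself, e.g. via a device-info plugin, is a separate step):

import 'package:shared_preferences/shared_preferences.dart';

Future<void> storeDeviceId(String deviceId) async {
  final prefs = await SharedPreferences.getInstance();
  await prefs.setString('deviceId', deviceId);
}

Future<String?> readDeviceId() async {
  final prefs = await SharedPreferences.getInstance();
  return prefs.getString('deviceId');
}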
should I use services for connecting to the server API, which would in turn be used by the providers?
As for geolocation, geolocator can notify you about a change in location, and you could make an API call based on that.
However, if you plan to have a timer-based approach to refresh your job listing, then you must realize that your users are likely to face issues arising from stale or inconsistent data between refreshes. If you have plans to tackle that, then this implementation here might help. But I strongly feel that a server supporting push notifications, or maybe a WebSocket approach, would be ideal here.
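If you do go the polling route, a bare-bones sketch of the timer-driven refetch could look like this (refreshJobs stands in for whatever your jobList provider exposes):

import 'dart:async';

Timer startJobListPolling(Future<void> Function() refreshJobs) {
  // Refetch every 5 minutes; cancel the returned timer when no longer needed.
  return Timer.periodic(const Duration(minutes: 5), (_) {
    refreshJobs();
  });
}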

Sharepoint Online remote event receiver without App/Add-in

The company I work for uses SharePoint Online. We have a requirement that on most site collections, whenever a user creates a new document library, the library is configured so that the "Document" content type is removed and replaced with some of our own corporate content types.
Previously I managed this by using a coded sandbox solution, installed on the relevant site collections, which had an event handler that fired on "list added". It's obviously now time to move away from that solution.
I'm really struggling to get to grips with the alternative, conceptually. I'm aiming to replace the old solution with a Remote Event Receiver solution.
The way I think I'd like to achieve this:
1) Create a single remote event receiver hosted in Azure which receives the details of a new list being added in a site and then configures that list appropriately.
2) Use CSOM to provision the site and as part of that provisioning, hook up the event receiver.
I've spent a lot of time on this, getting nowhere. I initially thought the answer lay in using an app which I could install in the App Catalog and then push out to particular site collections, but that doesn't seem to be right.
Is the solution above possible? All the examples of setting up remote event receivers I've come across on the web seem to use a SharePoint app, which I don't really want to do.
Thanks.
For info I found the answer. You can indeed create a remote event receiver without a SharePoint app/add-in.
The answer was written up here
I had thought I needed a SharePoint provider-hosted app for part 1, but that isn't necessary.
You should bear in mind, though, that as per "Remote event receivers on host web clientContext" you will not have the client context passed through, so
TokenHelper.CreateRemoteEventReceiverClientContext(properties)
...will come through as empty. If you want to interact with SharePoint then you'll need to find another way than this approach, or use a different set of credentials.
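For part 2, a rough CSOM sketch of attaching the ListAdded remote event receiver during provisioning - the receiver URL is a placeholder, and clientContext is assumed to be authenticated already (e.g. with app-only or SharePointOnlineCredentials):

using Microsoft.SharePoint.Client;

public static class ReceiverProvisioner
{
    // 'clientContext' must already be authenticated against the target site.
    public static void AttachListAddedReceiver(ClientContext clientContext)
    {
        var receiver = new EventReceiverDefinitionCreationInformation
        {
            ReceiverName = "ConfigureNewLibraries", // arbitrary name
            EventType = EventReceiverType.ListAdded,
            ReceiverUrl = "https://myreceiver.azurewebsites.net/Services/ListAddedReceiver.svc",
            SequenceNumber = 10000
        };

        clientContext.Web.EventReceivers.Add(receiver);
        clientContext.ExecuteQuery();
    }
}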

How can I update tags for an Azure NotificationHub installation via the backend, without an installationId?

I'm using the Installation approach to handle my push notifications.
Works great!
However, how can I update tags for an installation via the backend, without knowing the installationId?
E.g. with registrations, I can do this:
_hub.GetRegistrationsByTagAsync("member:1")
to then update all registrations for that user via my backend.
But how do I do this with installations? There is no equivalent GetInstallationsByTagAsync, there is only GetInstallationAsync, which needs the installationId (which I don't have, since this is a backend/background operation).
Notification Hubs does not have a getInstallationByTag method available at this point in time, so installation IDs would be necessary. It is on our roadmap to provide a complete set of registration-equivalent support for installations.
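In the meantime, one workaround is to store the installation ID against the member in your own backend when the device registers, and then patch tags by ID. A hedged sketch using the Microsoft.Azure.NotificationHubs SDK (the connection string, hub name and member-to-installation lookup are placeholders):

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.NotificationHubs;

public static class InstallationTagUpdater
{
    public static async Task AddTagAsync(string installationId, string tag)
    {
        var hub = NotificationHubClient.CreateClientFromConnectionString(
            "<connection string>", "<hub name>");

        // Append a tag (e.g. "member:1") to the installation via a JSON-patch operation.
        await hub.PatchInstallationAsync(installationId, new List<PartialUpdateOperation>
        {
            new PartialUpdateOperation
            {
                Operation = UpdateOperationType.Add,
                Path = "/tags",
                Value = tag
            }
        });
    }
}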

SugarCrm - How to send e-mail to manager every time a Lead is updated?

I'm new to SugarCRM.
I've googled but I can't find a solution for this. Every time a Lead is UPDATED (new log call, new email attached, new meeting, status change) I need to send an e-mail.
How can I do this in SugarCrm?
Best Regards,
André
While the benefits of blasting a person with an email upon every update can be argued against, what you will need to do is use an after_save logic hook on the Leads module. Within that hook you would then send an email if it is an existing lead.
Helpful links:
http://support.sugarcrm.com/02_Documentation/04_Sugar_Developer/Sugar_Developer_Guide_7.6/60_Logic_Hooks/20_Module_Hooks/after_save/
http://developer.sugarcrm.com/2011/02/14/howto-detect-record-state-in-a-logic-hook/
http://developer.sugarcrm.com/2011/03/01/howto-send-an-email-inside-sugar-thru-code/
If you are using the Professional version, you can create a workflow and customize the email template (everything through the Sugar interface in the admin panel).
If you are using the Community version, you need to create a logic hooks file and write the logic manually. I suggest you capture an $old_lead in a before_save hook and compare that $old_lead with the actual $bean to see if anything has changed.
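For the Community version, a bare-bones sketch of the logic-hook route (file paths follow the usual custom/ convention; the class, method and change check are placeholders, and the actual email sending is left as a stub - see the "send an email inside Sugar thru code" link above for one way to do it):

// custom/modules/Leads/logic_hooks.php
$hook_version = 1;
$hook_array['after_save'][] = array(
    1,
    'Notify manager when a lead is updated',
    'custom/modules/Leads/LeadUpdateNotifier.php',
    'LeadUpdateNotifier',
    'notifyManager',
);

// custom/modules/Leads/LeadUpdateNotifier.php
class LeadUpdateNotifier
{
    public function notifyManager($bean, $event, $arguments)
    {
        // fetched_row is populated for existing records, so this skips brand new leads.
        if (!empty($bean->fetched_row)) {
            // Compare $bean->fetched_row with $bean to see what changed,
            // then send the notification email to the manager here.
        }
    }
}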