How does the ProcessMaker engine work? (workflow)

After I finish designing the process in BPMN notation, does ProcessMaker transform the BPMN into XPDL to execute the process, or does it use BPEL?

I've used ProcessMaker for 3 years, and it seems to me it doesn't use BPEL.
Check this: http://wiki.processmaker.com/index.php/ProcessMaker_Architecture_Diagrams
It doesn't mention anything about BPEL or XPDL.
To execute the process, ProcessMaker generates code files and XML files that contain the business logic you designed using DynaForms.
So it's not just a matter of designing the process in BPMN notation: you also have to build data-entry forms, define derivation (routing) rules, create user groups, grant them permissions, and sometimes do some custom programming.
This is not "magic".

The current version, ProcessMaker 2.5.0, is neither BPMN nor BPEL compliant, but the product roadmap includes a BPMN-compliant implementation (http://wiki.processmaker.com/index.php/ProcessMaker_RoadMap).
Currently the engine uses tasks, events, steps, dynaforms, input and output documents and triggers to execute processes.
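For example, a trigger is just a snippet of PHP that the engine runs before or after a step. Here is a minimal sketch, assuming a case variable named applicantName and a report table named PMT_STATUS_REPORT (both made up for this example); in trigger code, @@ references a string case variable, @@APPLICATION holds the case UID, and executeQuery() is ProcessMaker's helper for running SQL against the workspace database:

    // Hypothetical trigger: build a greeting from a form field and count
    // the rows for this case in a user-defined report table (PMT_ prefix).
    @@greeting = 'Hello, ' . @@applicantName;
    $rows = executeQuery(
        "SELECT * FROM PMT_STATUS_REPORT WHERE APP_UID = '" . @@APPLICATION . "'"
    );
    @@rowCount = is_array($rows) ? count($rows) : 0;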

The current version of ProcessMaker does not have a BPEL or BPMN engine, but it can still execute processes because it has its own engine. To run a case, go to the Inbox tab and start a new case; of course, you need to configure user access at design time.

I don't know anything about XPDL or BPEL, but based on my experience, ProcessMaker stores everything in its workspace database; that's why report tables you create get the PMT_ prefix, to separate user-created tables from ProcessMaker's system tables. When you create a case, ProcessMaker creates the case in the APP_DELEGATION table, together with the process, task, application (case), user, and everything else related to that case.
So, basically, ProcessMaker serves the forms based on APP_DELEGATION data; this table also stores every step of a case. When you submit a form, it adds a new row to APP_DELEGATION with the same process and application but a new task (TAS_UID), following the path (the arrow on your screen) you drew in the designer.
Basically it just stores information, serves forms based on that information, and routes cases according to your design. Even uploaded files are recorded in the ProcessMaker database (a UID and other important information are created, including information about the uploading user). It does not compile or translate the process into another language. Simple, but not quite as simple as that.
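You can see this routing for yourself by querying the workspace database directly. A read-only sketch (the database name, credentials, and the extra column names besides TAS_UID are assumptions based on a typical workspace schema, so check your own installation):

    <?php
    // Rough sketch: list the delegation (routing) rows ProcessMaker created
    // for one case. Connection details and the APP_UID value are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=wf_workflow', 'user', 'password');
    $stmt = $pdo->prepare(
        'SELECT DEL_INDEX, TAS_UID, USR_UID, DEL_DELEGATE_DATE
           FROM APP_DELEGATION
          WHERE APP_UID = ?
          ORDER BY DEL_INDEX'
    );
    $stmt->execute(['your-case-uid-here']);
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        print_r($row); // each row is one step (delegation) of the case
    }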

ProcessMaker's latest version, ProcessMaker 4.x (released in January 2020), is fully BPMN 2.0 compliant. You can import and export BPMN 2.0 files between ProcessMaker and other BPMN 2.0 compliant designers.
BPEL is hardly used by anyone in the industry anymore; it lost support a long time ago.
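For reference, a BPMN 2.0 file is just XML in the http://www.omg.org/spec/BPMN/20100524/MODEL namespace. Here is a minimal, hand-written sketch of the kind of file you can move between compliant designers (the ids and names are made up, and real exports also carry a BPMNDI section describing the diagram layout):

    <?xml version="1.0" encoding="UTF-8"?>
    <definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
                 id="Definitions_1"
                 targetNamespace="http://example.com/bpmn">
      <process id="ApprovalProcess" isExecutable="true">
        <startEvent id="start"/>
        <userTask id="review" name="Review request"/>
        <endEvent id="end"/>
        <sequenceFlow id="flow1" sourceRef="start" targetRef="review"/>
        <sequenceFlow id="flow2" sourceRef="review" targetRef="end"/>
      </process>
    </definitions>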

In summary, the server requirements for ProcessMaker 4 can be seen at this link.
ProcessMaker still uses the same stack for installation: Apache or nginx, a MySQL database, and PHP. Additionally, the Laravel framework is used in ProcessMaker. As BPMN software, ProcessMaker complies with the BPMN 2.0 standard.

Related

How can I deploy form / subform (i.e. display only) changes on Notes databases?

I have been asked by a client to assist in making the web frontends of a number of Lotus / IBM Notes databases, used for critical LOB functions, compatible with modern browsers.
As it stands, the web frontends of these databases only work in IE7, and even then they're temperamental at best. The JS uses IE-specific extensions, everything is in tables, and they render poorly on pretty much every browser available today. With IE7 no longer in support, they want to modernise these interfaces.
I have very little experience with Notes, but as an exploratory exercise I've managed to open up the databases in Domino Designer, add a few Stylesheet / Script resources, include them in the $$HTMLHead variable, and rework one Form to use a frontend framework, which looks good.
Obviously working on live applications is out of the question, so my thinking is to take a copy of the NSF files, and make the changes on the copies. My question is: how can I then deploy only the form / subform / resource changes to the 'live' NSF files?
Deployment:
In your new, modified database:
In the Database properties, define that the database file is a master template (give it a name).
In the production database:
First make a backup! For example, copy (design only) to a new copy of the production database.
In the Database properties, define that the database inherits its design from the master template (same name).
On the production database, run a design refresh.
More details: https://www.ibm.com/support/knowledgecenter/SSVRGU_9.0.1/com.ibm.designer.domino.main.doc/H_ABOUT_REFRESHING_A_DESIGN.html
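If you would rather push the refresh from the server than from the Notes client, the same thing can be triggered with the Domino designer task from the server console. A sketch, where apps/projects.nsf is a hypothetical file path relative to the Domino data directory:

    load design -f apps/projects.nsf

Without the -f argument, the design task refreshes every database on the server that inherits from a template, so limiting it to the one database is usually safer while you are testing.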
Sorry to state the obvious, but since you have a Notes client and a Domino server, you have a quite extensive documentation at your disposal in the form of databases located in the /help/ directory. Make sure they are full-text-indexed.
And since we are on the subject of templates, Domino comes with a host of ready-made, ready-to-use apps that you can customize and cannibalize. Look for discussion9.ntf for starters.
You may want to start here, then go there, and finally that will give you the keys to build world-class web apps on Domino.
Last thing: if you are on V9, the Designer help is crap. Grab a copy of the 8.5 version. Seriously.
If you want to build a modern web based front-end to existing Domino data, take a look at the following presentations:
http://www.slideshare.net/TexasSwede/ad102-break-out-of-the-box
and
http://www.slideshare.net/TexasSwede/break-out-of-the-box-part-2
As others already said, you should create a template and then just refresh/replace the design of the production database using that template.
You may want to consider working with an experienced Notes/Domino developer for that project; there are quite a few caveats and workarounds you need to know about...

SAP BO 4.1 Auditing without Universes

Morning all,
I have recently deployed Crystal Reports 2013 and Crystal Server 2013 in a test environment, as we are currently using the 2008 version of both products.
As this deployment is in a test environment, I am keen to implement and try out as much as possible in order to arrive at the best possible solution.
One of the things I have enabled is Auditing. Once it was set up, I went looking for the best way to make use of it, but everything seems to point to needing a Universe creation tool (Information Design Tool), which I don't have and can't obtain, as our SAP products are provided via a third party and we don't have access to the BI Client Tools.
So I'm back to trying to figure this all out via custom Crystal Reports. I've read plenty of articles, one of which provided me with the links needed between the database tables, but there don't seem to be any articles on which tables etc. to use.
Has anybody done this?
Thanks in advance for any help, I'm tearing my hair out at the minute!
Direct RDBMS access
Have a look at the official SAP documentation (I'm using the BusinessObjects manuals, but information in them should apply to Crystal Server as well), more specifically the Business Intelligence platform Administrator Guide (SP doesn't really matter, auditing doesn't tend to change much within a major release).
There are two sections that are important for you:
The Auditing chapter, more specifically the section regarding Audit events.
The Auditing Data Store Schema Appendix, which contains all the detail regarding the audit schema you could need.
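With those two sections in hand, a report against the Auditing Data Store is just a SQL command over its tables. A very rough sketch, assuming the central event table is ADS_EVENT; the column names and the event type id below are placeholders, so substitute the real ones from the schema appendix:

    -- Hypothetical example: count audit events per day for one event type.
    -- Check the table and column names against the ADS schema appendix.
    SELECT CAST(start_time AS DATE) AS event_day,
           COUNT(*)                 AS event_count
    FROM   ADS_EVENT
    WHERE  event_type_id = 1003
    GROUP  BY CAST(start_time AS DATE)
    ORDER  BY event_day;

You would paste something like this into a Crystal Reports Command object pointed at the auditing database connection.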
Using a universe
Have a look at the SCN blog post Unlock the Auditing database with a new Universe and Web Intelligence Documents for BI4.1. It contains a download link to an LCMBIAR file which you can import into your BI4 environment, without the need for a universe client tool. There's also a webinar and documentation available. The prebuilt WebI documents should save you a lot of time.
Requirements:
BI Platform BI 4.1 Support Pack 5 or greater for the Web Intelligence Documents
BI Platform BI 4.1.x for the Audit database to be queried
Instructions:
Download the content (take the highest build numbered zip file)
Import one of the five 'Universe' LCMBIAR files into your system using Promotion Management (it will go into BI Platform Auditing folder)
Import the Web Intelligence LCMBIAR file (it will go into BI Platform Auditing folder)
Edit the connection that is imported (in BI Platform Auditing folder) with the correct login credentials.
Open the Web Intelligence document STA1 - Start here - Events over time.wid as your starting point!
The only issue might be with step 4, where you need to edit your connection. I don't know if you'd be able to edit the imported connection through the Crystal Reports application?

Beginner: Bluemix programming languages for analysis of IoT data

I am a hardware engineer interested in using Bluemix for an IoT application. Other than C, I do not know any programming language, but I am willing to learn whatever is necessary. My application is as follows:
My sensor nodes would upload data to an existing h/w server that has the capability to upload the data to an external SQL server. I want to analyze this data on the SQL server on a periodic basis and generate reports that I can publish to a mobile application or even a web-page to begin with.
Questions:
Is it possible to implement the "SQL server --> Data analysis --> Report generation + data visualization --> HTML(?) Publish" flow on Bluemix?
What modern/efficient languages can I learn in order to do this with the least effort?
Is there a standard implementation/example that I can use as reference for the flow described above?
This question actually has little to do with IoT--that just happens to be the source of the data--and focuses on how to process data for analysis, report generation, and publishing. You can do this mostly using services in Bluemix such that there's little if any code to write and so the programming language of the runtime may not matter.
First, to store the data, you could use SQL Database or dashDB. The former is "just" a database, whereas the latter includes R and R-Studio for data analysis. Second, for report generation, you can use Embeddable Reporting, which has Cognos (e.g. IBM Cognos Business Intelligence reports) built in.
The way Cloud Foundry in Bluemix works, you'll need to create a runtime with some language, then bind the service instances to it so you can use them. But you may not have any code to write, in which case the language doesn't matter. In case you do need to write some code, choose whichever language you think you can learn most easily. Java programmers prefer that, but it requires compiling; they may also prefer Go. You'll probably have an easier time with Node.js and PHP, which are popular interpreted languages.
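If you do end up writing a little code, here is a minimal sketch of what the PHP side could look like. Cloud Foundry exposes the credentials of bound services through the VCAP_SERVICES environment variable, so a script can pick up the database credentials and produce a small JSON report. The service key name ('sqldb'), the credential field names, and the sensor_readings table are all assumptions for this sketch; inspect your own VCAP_SERVICES and schema:

    <?php
    // Read the credentials of a bound database service from VCAP_SERVICES.
    $vcap  = json_decode(getenv('VCAP_SERVICES'), true);
    $creds = $vcap['sqldb'][0]['credentials'];

    // The DSN is shown MySQL-style for illustration; use the PDO driver that
    // matches the database service you actually bind.
    $pdo = new PDO(
        sprintf('mysql:host=%s;port=%s;dbname=%s', $creds['host'], $creds['port'], $creds['db']),
        $creds['username'],
        $creds['password']
    );

    // Average reading per day -- the kind of periodic summary you could feed
    // into a report or a simple web page.
    $rows = $pdo->query(
        'SELECT DATE(reading_time) AS day, AVG(value) AS avg_value
           FROM sensor_readings
          GROUP BY DATE(reading_time)
          ORDER BY day'
    )->fetchAll(PDO::FETCH_ASSOC);

    header('Content-Type: application/json');
    echo json_encode($rows);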
A couple of resources for further info:
"Embed rich reports in your applications" shows how to use Embeddable Reporting with dashDB.
"Leverage IBM Cognos on IBM Bluemix using the Embeddable Reporting service" shows how to use Embeddable Reporting with SQL Database.
"Embed Reports and visualize Data in your Bluemix Applications" gives an overview of both approaches.
BTW, Bluemix also has a neat service called Internet of Things, which helps connect your Bluemix app to lots of things all over the Internet. Sounds like you already have this handled for this example, but as you continue to use Bluemix for IoT applications, you might want to look into this service too. The Internet of Things Foundation Starter helps you get started using Node.js, Cloudant, and Node-RED.

How to choose a web-based chart-visualization framework?

I want to build a website that displays data from an external database. The data must be displayed in the form of charts, because charts are more expressive. I've never developed a website before; can anyone give me some advice about existing web frameworks and what their advantages and disadvantages are?
Which framework should I choose? The data are stored in an SQL Server. Because new data report types might be required in the future, the framework must be easy to modify and expand.
This is purely subjective and there are many answers to this. With that said I'll tell you what I'd use:
You mentioned that your database is SQL Server. For the sake of argument, I'll assume that you'll want to host your website on a Windows server. Given that, I'd choose either ASP.NET Web Forms or ASP.NET MVC, as they are natural picks for a Windows server hosting websites on IIS. Then I'd find myself a nice jQuery charting plugin; see this SO question.

Web-based document merge solution?

We are looking for a web-based document merge solution.
Our application is a web-based project management tool built using Xataface - PHP on Windows IIS + MySQL. We have a function that allows the user to generate a status report in Microsoft Word format based on data in the tool.
Currently this function is implemented using LiveDocX. We have a status report template, and LiveDocX performs the merge into the template using data from our project management tool.
The main drawback is that LiveDocX is web-service based. We are looking to replace LiveDocX in order to reduce our dependence on the uptime of a third-party web service that we cannot control.
Does anyone have any suggestions on a web-based document merge solution that I can install on my IIS or PHP based server?
In reading your question, I'm not sure if what you're after is actual mail merge or just simply populating a single Word template server-side. In either case, here are some things to get you started:
For mail merge (multiple document results based on multiple records, say, from a database), you can use either the Open XML SDK for a Visual Studio solution or access the Open XML directly. In that case, these are good articles to start with:
Mail Merge in WordProcessingML
Mail Merging with a Custom Client Using the Open XML SDK 2.0
For direct single-document population with data, these are good libraries that use PHP (see the sketch after this list):
PHP Word
PHP OpenXML API (not incredibly mature, but a starting point)
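As a rough sketch of the single-template case with PHP Word (using the current PhpOffice\PhpWord namespace; the template and placeholder names are made up for this example), you load a .docx containing ${...} placeholders and replace them server-side:

    <?php
    require 'vendor/autoload.php';

    use PhpOffice\PhpWord\TemplateProcessor;

    // status_report_template.docx contains placeholders such as ${project}
    // and ${status}; the file and field names here are hypothetical.
    $template = new TemplateProcessor('status_report_template.docx');
    $template->setValue('project', 'Website relaunch');
    $template->setValue('status', 'On track');
    $template->saveAs('status_report.docx');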
If you are still looking for a solution, you can try Aspose.Words for .NET; it is a commercial component (not free).