General rule for what can be inside of a chain in Spring Integration - jpa

I need to use a JPA retrieving outbound gateway and I would like to chain it with a header enricher. While trying this I went through the documentation to see whether there is a general rule for knowing if an endpoint can live inside a chain, but I couldn't find a rule of thumb for what can and cannot.
In my particular case I'm trying to do this:
<int:chain input-channel="audTrailRetrievalChannel" output-channel="updateResponseForTestSent">
    <int-jpa:retrieving-outbound-gateway id="getAudTrail"
            jpa-query="select e.details from AudTrail e where e.audTrailRecId = :id"
            entity-manager-factory="auditEntityManager">
        <int-jpa:parameter name="id" expression="payload?.body?.response?.responseInformation?.communicationVariables?.variable.?[variableName=='audTrailRecId'][0]?.variableValue"/>
    </int-jpa:retrieving-outbound-gateway>
    <int:header-enricher>
        <int:header name="registerMethod" value="registerAuditTrail" overwrite="true"/>
    </int:header-enricher>
</int:chain>

You can put anything within a chain, but an outbound channel adapter or a router (anything that does not have an output channel, i.e. does not implement MessageProducer) must be the last element.
http://docs.spring.io/spring-integration/reference/html/messaging-routing-chapter.html#chain
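For illustration, a minimal sketch (channel and bean names here are made up) of a chain whose last element is an outbound channel adapter; nothing may follow the adapter because it produces no output message:
<int:chain input-channel="inChannel">
    <int:header-enricher>
        <int:header name="registerMethod" value="registerAuditTrail"/>
    </int:header-enricher>
    <!-- no output channel, so it must be the last element of the chain -->
    <int:outbound-channel-adapter ref="someService" method="handle"/>
</int:chain>
The chain in the question is fine as posted, since both the retrieving gateway and the header enricher produce an output message.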

Related

How to limit transaction scope to just the JPA updating outbound gateway (sort of "auto-commit") in Spring Integration

I have an integration flow which starts with a poller. This DOES NOT open a transaction.
Down that flow I have a JPA updating outbound gateway:
<int-jpa:updating-outbound-gateway
        request-channel="requestChannel"
        reply-channel="replyChannel"
        named-query="myUpdatingJpqlQuery"
        entity-manager-factory="entityManagerFactory">
    <int-jpa:transactional transaction-manager="transactionManager"/>
    <int-jpa:parameter name="param1" expression="payload"/>
    <int-jpa:parameter name="param2" expression="T(java.time.Instant).now()"/>
</int-jpa:updating-outbound-gateway>
This works; however, the transaction that this gateway opens embraces the whole downstream flow (unless I break the transaction boundary with an executor). This is not what I want in this case: I would like the transaction to embrace just the updating operation (some sort of "auto-commit"), because the downstream flow will handle transactions in a more granular way and needs to handle independent transactions (not join an outer one).
This indeed seems to be confirmed by the documentation: https://docs.spring.io/spring-integration/docs/5.4.11/reference/html/messaging-endpoints.html#tx-handle-message-advice
If I understand that linked section correctly, using a <request-handler-advice-chain> with a <tx:advice> instead should give the desired result.
However, if I use this:
<int-jpa:updating-outbound-gateway
        request-channel="requestChannel"
        reply-channel="replyChannel"
        named-query="myUpdatingJpqlQuery"
        entity-manager-factory="entityManagerFactory">
    <int-jpa:parameter name="param1" expression="payload"/>
    <int-jpa:parameter name="param2" expression="T(java.time.Instant).now()"/>
    <int-jpa:request-handler-advice-chain>
        <tx:advice transaction-manager="transactionManager"/>
    </int-jpa:request-handler-advice-chain>
</int-jpa:updating-outbound-gateway>
I get a javax.persistence.TransactionRequiredException, so it seems like that advice is not working (at least not in the way I want).
What is the better way to do this? Am I forced to use a dispatcher with an executor on the reply channel to break the transaction boundary?
Make it like this:
<tx:advice>
    <tx:attributes>
        <tx:method name="*"/>
    </tx:attributes>
</tx:advice>
A <tx:advice> has no method matches by default, so without <tx:attributes> the advice is never applied to the internal AdvisedRequestHandler.
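Applied to the gateway from the question, the working configuration would look like this (a sketch assembled from the snippets above):
<int-jpa:updating-outbound-gateway
        request-channel="requestChannel"
        reply-channel="replyChannel"
        named-query="myUpdatingJpqlQuery"
        entity-manager-factory="entityManagerFactory">
    <int-jpa:parameter name="param1" expression="payload"/>
    <int-jpa:parameter name="param2" expression="T(java.time.Instant).now()"/>
    <int-jpa:request-handler-advice-chain>
        <tx:advice transaction-manager="transactionManager">
            <tx:attributes>
                <tx:method name="*"/>
            </tx:attributes>
        </tx:advice>
    </int-jpa:request-handler-advice-chain>
</int-jpa:updating-outbound-gateway>
With this, the advice (and therefore the transaction) wraps only the gateway's own handling of the message, not the downstream flow.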

How does sw-toolbox router rule order/priority work?

In my service worker (which uses sw-toolbox library) I have setup two routes as follows:
toolbox.router.any("/user/*", toolbox.networkOnly);
toolbox.router.any("/user/logout", toolbox.logoutHandler);
I assumed that the second rule, which is specific to the "/user/logout" path, would act as an exception to the first rule (a blanket rule for the path "/user/*"); however, I can confirm that it does not.
Am I using this sw-toolbox route config correctly?
I think the rules are independent and the first matching rule wins, so this should work:
toolbox.router.any("/user/logout", toolbox.logoutHandler);
toolbox.router.any("/user/*", toolbox.networkOnly);
See Jeff's comment on this issue: "The routing to handlers should match in the order they're registered"

What are ATG pipelines and how do they work?

I have some idea of ATG droplets, DSP tags, and writing custom droplets. I would like to know about pipelines in ATG. When I try to refer to the Oracle documentation for this I get a bit confused about what a pipeline is and how its flow works. Can I create a custom pipeline manager which executes my custom processors sequentially? If so, how can I do this, and how do I trigger my pipeline manager from my JSP page? Please guide me to some tutorials or online documents for learning about pipelines.
Code snippets are highly preferred.
Thanks in advance
A pipeline is an execution mechanism that allows for modular code execution. Oracle ATG Web Commerce uses pipelines to execute tasks such as loading, saving, and checking out orders. The PipelineManager implements the pipeline execution mechanism.
There are two request-handling pipelines used by Dynamo:
• DAF servlet pipeline - handles JSP requests.
• DAS servlet pipeline - handles JHTML requests. Because JHTML is a proprietary language, it relies on the page compiler provided in the DAS servlet pipeline to compile JHTML into a servlet that is rendered as HTML by the application server.
There is also something called the commerce pipeline, which takes care of order processing.
Request-handling pipelines and commerce pipelines work in different ways.
DAS/DAF (i.e., request pipelines)
It is a configuration defined as a series of servlets executed in sequence, each invoked based on the previous servlet's output. One of Dynamo's most important tasks is handling HTTP requests. In handling these requests, Dynamo uses session tracking, page compilation, Java Server Pages, and other powerful extensions to the basic web server model. Request handling can usually be broken down into a series of independent steps. Each step may depend on additional information being available about the request, so order does matter; however, the individual steps are separable. For example, a typical request might go through these steps:
1) Compare the request URI against a list of restricted directories, to make sure that the user has permission to access the specified directory.
2) Translate the request URI into a real file name, taking "index" files into account when the file name refers to a directory.
3) Given the file name's extension, determine the MIME type of the file.
4) From the MIME type, dispatch the request to the appropriate handler.
So the DAF/DAS pipelines come into the picture whenever there is a request. In atg_bootstrap.war, web.xml has the information about server startup.
When the server starts, NucleusServlet.java gets loaded in the app server. This class initializes Nucleus and other components and then adds all of them to the Nucleus namespace. When a web application is accessed (DynAdmin, CRS, MotopriseJSP), Nucleus routes the flow to either the DAF or the DAS pipeline. If the MIME type is JHTML, the DAS pipeline processes the request further: it is routed to the DynamoProxyServlet class, which does the further processing by calling a list of servlets. If it is .jsp, the DAF pipeline handles the request via the PageFilter class. The reason for using a filter rather than a servlet to invoke the DAF pipeline is:
JSP pages and fragments are handled by the application server, meaning that JBoss, WebLogic, or WebSphere is responsible for compiling and executing the resulting page code. The best way to hook into this process is by using a filter. For JHTML pages it is a different story, since the app server (not every app server) can't parse and compile the pages. A servlet is used to redirect the request down the DAS pipeline, where the page can be parsed and executed by the ATG page compilation mechanism.
In the case of the commerce pipeline:
The PipelineManager implements the commerce pipeline functionality by reading a pipeline definition file, commercepipeline.xml. When the application is deployed, Nucleus initializes the pricing engine, where OrderManager initializes the PipelineManager. The OrderManager.processOrder method invokes the pipeline chains in commercepipeline.xml. A pipeline chain consists of processors, which are simple Java classes doing small operations. This XML can be extended by adding custom processors. In cases where a single chain needs to be called directly, call the runProcess method of the PipelineManager, passing the processor chain id.
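A minimal sketch of such a direct invocation (the chain id and the surrounding component are hypothetical; the PipelineManager would be linked in via the component's .properties file):

import atg.nucleus.GenericService;
import atg.service.pipeline.PipelineManager;
import atg.service.pipeline.PipelineResult;
import atg.service.pipeline.RunProcessException;

// Hypothetical Nucleus component; its .properties file would contain e.g.
// pipelineManager=/atg/commerce/PipelineManager
public class ChainInvoker extends GenericService {

    private PipelineManager mPipelineManager;

    public PipelineManager getPipelineManager() { return mPipelineManager; }
    public void setPipelineManager(PipelineManager pManager) { mPipelineManager = pManager; }

    public void invokeChain(Object pParam) throws RunProcessException {
        // Runs the single named chain; pParam is handed to each processor.
        PipelineResult result = getPipelineManager().runProcess("myProcessorChain", pParam);
        if (result.hasErrors() && isLoggingError()) {
            logError("Chain reported errors: " + result);
        }
    }
}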
Extending the DAF/DAS pipeline and extending the commerce pipeline are not the same.
We can create our own custom servlets and put them in the DAF/DAS pipeline:
extend your own servlet class either from PipelineableServletImpl or from InsertableServletImpl
and override the service method, depending on what you want to do. Further details are widely available on the internet :)
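For instance, a minimal sketch of such a custom request-pipeline servlet (the class name is hypothetical):

import java.io.IOException;
import javax.servlet.ServletException;
import atg.servlet.DynamoHttpServletRequest;
import atg.servlet.DynamoHttpServletResponse;
import atg.servlet.pipeline.InsertableServletImpl;

// Hypothetical pipeline servlet; it would be inserted into the request
// pipeline via its .properties file (e.g. the insertAfterServlet property).
public class MyPipelineServlet extends InsertableServletImpl {

    public void service(DynamoHttpServletRequest pRequest,
                        DynamoHttpServletResponse pResponse)
            throws IOException, ServletException {
        // Per-request logic goes here (logging, request decoration, ...).
        if (isLoggingDebug()) {
            logDebug("Handling request: " + pRequest.getRequestURI());
        }
        // Hand the request to the next servlet in the pipeline.
        passRequest(pRequest, pResponse);
    }
}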
And coming to the commerce pipeline:
The commerce pipeline is defined in an XML file located at /B2CCommerce/config/atg/commerce/commercepipeline.xml. The PipelineManager is responsible for loading the pipeline definition XML and initializing the pipeline chains. Write your processor class: a custom processor class should be an implementation of PipelineProcessor.
Extend your own class from PipelineProcessor and implement the runProcess method. You also have to create the respective .properties file for your processor. Then, in
B2CCommerce/config/atg/commerce/commercepipeline.xml:
<pipelinechain name=" lastExistingchain" transaction="TX_REQUIRED" headlink=" lastExistinglink">
……..
<transition returnvalue="1" link=" sampleDemoLink"/>
</pipelinelink>
<pipelinelink name="sampleDemoLink" transaction="TX_REQUIRED">
<processor jndi="demo/atg/order/processor/MyProcessor"/>
</pipelinelink>
</pipelinechain>
Restart the ATG server.
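For reference, a minimal sketch of a matching processor class (names are hypothetical; the returned code drives the <transition returnvalue="1" .../> element above):

import atg.nucleus.GenericService;
import atg.service.pipeline.PipelineProcessor;
import atg.service.pipeline.PipelineResult;

// Hypothetical processor referenced by the chain definition above.
public class MyProcessor extends GenericService implements PipelineProcessor {

    public static final int SUCCESS = 1;

    public int runProcess(Object pParam, PipelineResult pResult) throws Exception {
        // pParam is whatever object was passed to PipelineManager.runProcess().
        if (isLoggingDebug()) {
            logDebug("MyProcessor invoked with: " + pParam);
        }
        return SUCCESS;
    }

    public int[] getRetCodes() {
        return new int[] { SUCCESS };
    }
}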
Coming to your other question, whether we can create our own pipeline manager:
The answer is yes.
Just create an /atg/registry/PipelineRegistry .properties file in your local config folder. PipelineRegistry is a service where all pipeline managers are registered;
this service has a property called pipelineManagers, so just append your pipeline manager component to this property. If you want to use the existing commerce PipelineManager class but with a different set of processors executing one after the other, create a definition XML file which looks something like this:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE pipelinemanager
        PUBLIC "-//Art Technology Group, Inc.//DTD Dynamo Pipeline Manager//EN"
        'http://www.atg.com/dtds/pipelinemanager/pipelinemanager_1.0.dtd'>
<pipelinemanager>
    <!-- This chain updates (saves) an Order to the repository -->
    <pipelinechain name="updateOrder" transaction="TX_REQUIRED" headlink="updateOrderObject">
        <pipelinelink name="updateOrderObject" transaction="TX_MANDATORY">
            <processor jndi="/atg/commerce/order/processor/SaveOrderObject"/>
            <transition returnvalue="1" link="updateCommerceItemObjects"/>
        </pipelinelink>
        <pipelinelink name="updateCommerceItemObjects" transaction="TX_MANDATORY">
            <processor jndi="/atg/commerce/order/processor/SaveCommerceItemObjects"/>
            <transition returnvalue="1" link="updateShippingGroupObjects"/>
        </pipelinelink>
        <pipelinelink name="updateShippingGroupObjects" transaction="TX_MANDATORY">
            <processor jndi="/atg/commerce/order/processor/SaveShippingGroupObjects"/>
            <transition returnvalue="1" link="updateHandlingInstructionObjects"/>
        </pipelinelink>
        <pipelinelink name="updateHandlingInstructionObjects" transaction="TX_MANDATORY">
        .......
    .......
    <pipelinechain name="rejectQuote" transaction="TX_REQUIRED" headlink="quoteRejection">
        <pipelinelink name="quoteRejection" transaction="TX_MANDATORY">
            <processor jndi="/atg/commerce/order/processor/RejectQuote"/>
        </pipelinelink>
    </pipelinechain>
    <!-- This pipeline chain should be called when a requested quote is to be completed -->
    <pipelinechain name="completeQuote" transaction="TX_REQUIRED" headlink="completeQuoteRequest">
        <pipelinelink name="completeQuoteRequest" transaction="TX_MANDATORY">
            <!-- this is a dummy processor that should be extended to save quote details -->
            <processor jndi="/atg/commerce/order/processor/CompleteQuoteRequest"/>
        </pipelinelink>
    </pipelinechain>
</pipelinemanager>
Here you can mention your custom processors.
Since you have registered your new pipeline manager in the pipeline registry, it gets automatically initialized. So if you do any operation in a JSP related to your pipeline, all the processing gets done in the background.
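As a concrete sketch of that registration (the component path, reuse of the stock PipelineManager class, and file names are assumptions, not an exact layout), the layered .properties files could look like this:

# /atg/registry/PipelineRegistry.properties (in your local config layer)
pipelineManagers+=/mycompany/commerce/MyPipelineManager

# /mycompany/commerce/MyPipelineManager.properties
$class=atg.service.pipeline.PipelineManager
definitionFile=/mycompany/commerce/pipeline/myPipeline.xml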
There are two entirely separate things in ATG commonly known as pipelines.
They are not at all related to one another.
1. The Servlet Pipeline
This is a chain of servlets through which all requests pass before hitting your custom code (JSP page, Form Handler, Droplet or anything else). The purpose of this pipeline is to decorate the incoming request to provide the request with the context that the ATG framework needs to, for example, associate the request to a Dynamo Session and to load a User Profile. The DAS module defines the basic servlet pipeline and various modules add additional servlets into it.
You modify this pipeline by changing the nextServlet or related properties of existing components, and by creating Nucleus components that are instances of classes extended from PipelineableServlet.
You would choose to customise this if you wanted to perform an action or make a decision on each incoming request - somewhat similar to how you would use Filters in standard J2EE web applications or Interceptors in a Spring MVC application.
You can see what is defined in your servlet pipeline by looking at the /atg/dynamo/servlet/dafpipeline/DynamoHandler component in the Dynamo Admin interface.
2. Processor Chains
A processor chain is a way of executing discrete processes and linking them together based on the outcome (resulting status code) of each process. There is a component in the Nucleus called the Pipeline Manager which holds the configuration of each processor chain defined in the system, and manages the execution of these chains.
Processor chains are used by the Commerce module to manage, for example, the creation of an order to ensure referential integrity between all the constituent parts. This is sometimes called The Commerce Pipeline.
Processor chains are conceptually more related to scenarios, and completely unrelated to the servlet pipeline.
You modify a processor chain by creating or editing the XML file that defines the chain. You create new processors by creating a Nucleus component from a class that implements the PipelineProcessor interface.
You can see which chains are defined by looking at the PipelineManager component in the Dynamo admin interface. There is also a graphical Pipeline Editor in the ACC.
You would modify an existing pipeline chain if you want to add or remove a step in it.
You would create a new chain if you wanted to split a long and complex process - usually repository operations - into multiple discrete steps to be sequenced together by an externalised process flow.

How can I call Sling Filter before AuthenticationHandler?

I want to put a Sling filter before the authentication handler, but I have had no luck.
From the logs I can see that the auth handler is always called before my filter. Is there good documentation about this? Is it possible to put a filter before the AuthenticationHandler?
Both work when I put logging into the auth handler's extractCredentials method and into the doFilter method of the filter, but unfortunately my filter is called after the auth handler.
Here are my logs:
11:50:55.924 AuthenticationHandler extractCredentials
11:50:56.004 Before chain.doFilter
11:50:56.332 After chain.doFilter
Authentication is always done before the filter processing. At the request level, the processing order is:
1) Authentication
2) Resource Resolution
3) Servlet/Script Resolution
4) Request Level Filter Processing
(source: Sling documentation).
So, you can't create a filter that would be run before the authentication.
You can use an OSGi Preprocessor; it will act as a filter that runs before authentication is called. You will find the specification and an example here:
https://docs.osgi.org/specification/osgi.cmpn/7.0.0/service.http.whiteboard.html#service.http.whiteboard.servlet.preprocessors
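A minimal sketch of such a preprocessor using Declarative Services annotations (the class name is made up; the Preprocessor interface extends javax.servlet.Filter, so the usual Filter methods apply):

import java.io.IOException;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.http.whiteboard.Preprocessor;

// Registered as a Preprocessor service: runs before Sling authentication.
@Component(service = Preprocessor.class)
public class EarlyRequestPreprocessor implements Preprocessor {

    @Override
    public void init(FilterConfig filterConfig) throws ServletException {
        // no-op
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        // Pre-authentication logic goes here.
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
        // no-op
    }
}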

Fiddler: how to make a subnet-based filter?

I would like Fiddler2's host filter to keep sessions to all hosts on 192.168.2.*
The docs say that e.g. a fiddler2.com filter would catch all of *.fiddler2.com, but I can't figure out how to do the same kind of filtering for IP subnets instead of hostnames.
Is there a specific syntax? Should I use a custom rule?
TIA.
You need to write a custom rule. Inside OnBeforeResponse, look at the m_hostIP property on the session and use it to set the UI-HIDE flag if it doesn't match the subnet you care about.
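For example, a sketch for CustomRules.js (Rules > Customize Rules; a simple string-prefix test stands in for a real subnet match):

// Inside the existing OnBeforeResponse(oSession: Session) function:
// m_hostIP holds the server IP for this session.
if (oSession.m_hostIP == null || !oSession.m_hostIP.StartsWith("192.168.2.")) {
    oSession["ui-hide"] = "true";  // hide sessions outside 192.168.2.*
}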