[WCS 7, FixPack 7, Feature Pack 6]
I need to generate a feed with certain catalog entry (product) information such as name, image(s), category, price, SEO URL, descriptive attributes, and so on.
To generate this feed I am using the scheduler framework and have created a controller command which will be invoked after a certain time has passed.
Now I need to retrieve the catalog entry data for each item in a specific store/catalog.
As far as I know there are several ways to achieve this:
Access Beans
SOA (new Access Profile with specific SQL)
I tried to use Access Beans to get all the necessary information, but I got stuck on the SEO URL, price, etc.
Are there more ways, and what is the best practice for retrieving a specific set of catalog entry attributes?
I suggest you study, in your WCS development environment, how the site map generation scheduler command (SitemapGenerateCmd) is implemented and how it calls its JSP template file (sitemap.jsp).
You need to do something similar in your command: create a JSP template for your feed and call that template from the scheduler command you've created.
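A rough sketch of the hand-off, assuming the OOTB controller-command pattern (the class and view names here are made up, and the command-interface/CMDREG registration is omitted; SitemapGenerateCmd is the real reference):

import com.ibm.commerce.command.ControllerCommandImpl;
import com.ibm.commerce.datatype.TypedProperty;
import com.ibm.commerce.exception.ECException;
import com.ibm.commerce.server.ECConstants;

// Hypothetical scheduler command implementation.
public class FeedGenerateCmdImpl extends ControllerCommandImpl {
    public void performExecute() throws ECException {
        super.performExecute();
        // Build any feed parameters here, then forward to the JSP view
        // registered in struts-config (see the forward entry below).
        TypedProperty rspProp = new TypedProperty();
        rspProp.put(ECConstants.EC_VIEWTASKNAME, "FeedView");
        setResponseProperties(rspProp);
    }
}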
When calling the template through the messaging system, make sure to use the following properties in the ECActionForward entry to register the JSP for messaging from the back-end:
example:
<forward className="com.ibm.commerce.struts.ECActionForward"
         name="SitemapIndexView" path="/SitemapIndex.jsp">
  <set-property property="direct" value="true"/>
  <set-property property="resourceClassName" value="com.ibm.commerce.messaging.viewcommands.MessagingViewCommandImpl"/>
  <set-property property="interfaceName" value="com.ibm.commerce.messaging.viewcommands.MessagingViewCommand"/>
  <set-property property="properties" value="storeDir=no"/>
  <set-property property="implClassName" value="com.ibm.commerce.messaging.viewcommands.MessagingViewCommandImpl"/>
</forward>
The logic for extracting data from your product/catalog beans is then handled inside the JSP, and you can easily shape the output data however you want (XML, CSV, JSON, etc.).
The advantages of this approach: you can leverage the OOTB Commerce JSTL and WCF tags for retrieving all the information, use wcf:url for SEO URLs OOTB, and even call BOD/SOA services through the <wcf:getData> tag; you also end up with a more structured design that is easier to maintain and reuse in the future.
sitemap.jsp is a good reference for how to iterate through the catalog and sub-catalogs to extract product information.
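As a purely illustrative template sketch (the catalogEntries variable, the data retrieval via <wcf:getData>, and the URL pattern name are all assumptions; check how sitemap.jsp actually builds its loops and URLs):

<%-- FeedView.jsp (hypothetical) --%>
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<%@ taglib uri="http://commerce.ibm.com/foundation" prefix="wcf" %>
<feed>
  <c:forEach var="catEntry" items="${catalogEntries}">
    <%-- wcf:url builds the SEO URL OOTB; pattern name and params are assumed --%>
    <wcf:url var="productURL" patternName="ProductURL" value="Product">
      <wcf:param name="catalogId" value="${WCParam.catalogId}"/>
      <wcf:param name="productId" value="${catEntry.uniqueID}"/>
    </wcf:url>
    <item>
      <name><c:out value="${catEntry.description.name}"/></name>
      <url><c:out value="${productURL}"/></url>
    </item>
  </c:forEach>
</feed>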
Hope this helps you find your solution; it needs some searching and understanding of the existing sitemap generation utility.
Thanks.
Abed
Related
I am currently trying to work with the SuiteTalk 2017_2_0 API for a new integration with NetSuite. I have all the basics working (single record retrieval, etc.); however, I am having a problem figuring out how to list all records of a given object/type in the system.
Example: I want to list ALL InventoryItem data.
Not sure what I am missing. Does anyone have a sample SOAP doc?
What I did so far
I have been working primarily with types: Sales Order, Customer and Inventory Item.
Tried using the getAll call defined in the WSDL, but it only supports a limited set of record types:
budgetCategory
campaign*
currency
etc.
Tried using getList, but a set of internal IDs is required in the baseRef/RecordRef (INVALID_KEY_OR_REF - The specified key is invalid.)
Fails:
<urn:getList>
<urn1:baseRef xsi:type="core:RecordRef" type="salesOrder" />
</urn:getList>
Succeeds:
<urn:getList>
<urn1:baseRef xsi:type="core:RecordRef" type="salesOrder" internalId="1" />
<urn1:baseRef xsi:type="core:RecordRef" type="salesOrder" internalId="2" />
</urn:getList>
Tried formulating a search that would return all the data, but the types I need are unavailable (or I haven't been able to figure it out).
Define a saved search that pulls the appropriate data, then call the saved search.
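If a plain search is still an option, two patterns may help (hedged; the namespace prefixes and the savedSearchId value are assumptions, with common/listAcct standing for the platform-common and accounting-lists namespaces): a *SearchBasic record with no criteria matches every record of that type, and a saved search is invoked through the advanced search variant. Page large result sets with searchMore/searchMoreWithId.

<urn:search>
   <!-- No criteria: should return all item records -->
   <urn:searchRecord xsi:type="common:ItemSearchBasic"/>
</urn:search>

<urn:search>
   <!-- Run an existing saved search by its internal id (placeholder value) -->
   <urn:searchRecord xsi:type="listAcct:ItemSearchAdvanced" savedSearchId="1234"/>
</urn:search>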
I'm using REST to get data from a SharePoint library to display using the DataTables jQuery API. Everything is working fine, but I've got one stubborn field in this library that REST isn't able to grab.
The name of the field is "For", but the internal name is _x0046_or1 (not sure why, I didn't create the library). I've double-checked that this is the correct internal name using REST and by checking the code for my library view in Designer.
So using my REST call:
/_api/Web/Lists/GetByTitle('SAS2')/items?$select=_x0046_or1&$top=5000
And I get back:
The field or property '_x0046_or1' does not exist.
Anybody have any suggestions for a different way to reference this field that the REST api might recognize?
I did as Rohit suggested in the comments and made the REST call without the $select. It turns out that the actual internal name of the For field was "OData__x0046_or1". No idea why.
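So the working call is the original one with the corrected name:

/_api/Web/Lists/GetByTitle('SAS2')/items?$select=OData__x0046_or1&$top=5000

If you ever need to see which names the REST layer expects, listing the fields should show them (EntityPropertyName is the name the OData layer actually uses):

/_api/Web/Lists/GetByTitle('SAS2')/fields?$select=Title,InternalName,EntityPropertyName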
Thanks for the helpful suggestion, Rohit!
In SharePoint 2013, if you name a field with 3 or fewer characters and then end it with a number, SharePoint will rename the internal name. And because the resulting internal name starts with an underscore, the REST/OData layer prefixes it with "OData_", which is why the property shows up as OData__x0046_or1.
We have a backend built on FOSRestBundle (JMSSerializer) based on Symfony2/Doctrine2.
Our API delivers, for some entity, something like 'valid_from': '2015-12-31' in a REST/JSON response.
Our AngularJS frontend consumes this REST API and e.g. presents some form to the user:
Valid from: 31.12.2015
Now I am wondering what's the best way to have a central/maintainable place for a mapping like:
the English label for field key 'valid_from' is 'Valid from'
To my understanding, all Translatable extensions for Doctrine (like Gedmo, Knp, etc.) are for translating content (e.g. a category is 'Product' in English and 'Produkt' in German), but not the schema.
And of course I cannot change/adapt the field key to the language because it's the identifier for the frontend.
What I thought so far:
Use some mapping file between field_key, language_key and translation on the frontend.
The issue here is that the frontend needs to know a lot, and ideally any change in the API could be handled entirely by the backend developers.
Use some mapping file between field_key, language_key and translation on the backend.
As we use annotations for our entity model and JMSSerializer, I'd need to open a totally new space just for translation information. Doesn't sound too convincing.
Create a custom annotation with properties.
The current idea is to use a custom annotation on my entity properties (which would be cacheable and could be gathered and provided to the frontend as one JSON document).
That means any backend developer could immediately check the translation keys, at least as soon as the API changes.
Something like:
@ORM\Column(name="product", type="string")
@FieldNameTranslation({
    {"lng"="de-DE", "text"="Produkt"},
    {"lng"="en-US", "text"="Product"}
})
A further idea could be to just provide some translation key like:
@FieldNameTranslation(key="FIELD_PRODUCT")
and use Symfony's translation component, keeping the content in translation YAML files.
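On the Symfony side that could look like this (a sketch; the file locations follow Symfony2 conventions and the keys are assumed):

# app/Resources/translations/messages.en.yml
FIELD_PRODUCT: Product
FIELD_VALID_FROM: 'Valid from'

# app/Resources/translations/messages.de.yml
FIELD_PRODUCT: Produkt
FIELD_VALID_FROM: 'Gültig ab'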
What is your opinion? Is there any 'best in class' approach for this?
I have some idea of ATG droplets, DSP tags, and writing custom droplets. I would like to learn about pipelines in ATG. When I try to refer to the Oracle documentation for this, I get a bit confused about what a pipeline is and how its flow works. Can I create a custom pipeline manager which executes my custom processors sequentially? If possible, how can I do this? And how do I trigger my pipeline manager from my JSP page? Please point me to some tutorials or online documents that are good for learning pipelines.
Code snippets are highly preferred.
Thanks in advance
A pipeline is an execution mechanism that allows for modular code execution. Oracle ATG Web Commerce uses pipelines to execute tasks such as loading, saving, and checking out Orders. The PipelineManager implements the pipeline execution mechanism.
There are two request-handling pipelines used by Dynamo.
• DAF servlet pipeline - used to handle JSP requests.
• DAS servlet pipeline - used to handle JHTML requests. Because JHTML is a proprietary language, it relies on the page compiler provided in the DAS servlet pipeline to compile JHTML into a servlet that is rendered as HTML by the application server.
And there is also something called the commerce pipeline, which takes care of order processing.
Request-handling pipelines and the commerce pipeline work in different ways.
DAS/DAF (i.e., the request pipelines)
Each is a configuration defining a series of servlets executed in sequence, each on the basis of the previous servlet's output. One of Dynamo's most important tasks is handling HTTP requests. In handling these requests, Dynamo uses session tracking, page compilation, JavaServer Pages, and other powerful extensions to the basic web server model. Request handling can usually be broken down into a series of independent steps. Each step may depend on additional information being available about the request, so order does matter; however, the individual steps are separable. For example, a typical request might go through these steps:
1) Compare the request URI against a list of restricted directories, to make sure that the user has permission to access the specified directory.
2) Translate the request URI into a real file name, taking "index" files into account when the file name refers to a directory.
3) Given the file name's extension, determine the MIME type of the file.
4) From the MIME type, dispatch the request to the appropriate handler.
So the DAF/DAS pipelines come into the picture when there is a request. In atg_bootstrap.war, web.xml has the information about server startup.
When the server starts, NucleusServlet.java gets loaded in the app server. This class initializes Nucleus and other components, then adds all of them to the Nucleus namespace. When a web application is accessed (DynAdmin, CRS, MotopriseJSP), Nucleus routes the flow to either the DAF or the DAS pipeline. If the requested page is JHTML, the DAS pipeline processes the request: it is routed to the DynamoProxyServlet class, which does the further processing by calling a list of servlets. If it is a .jsp, the DAF pipeline handles the request through the PageFilter class. The reason for using a filter, not a servlet, to invoke the DAF pipeline is:
JSP pages and fragments are handled by the application server, meaning that JBoss, WebLogic, or WebSphere is responsible for compiling and executing the resulting page code, and the best way to hook into that process is with a filter. For JHTML pages it's a different story: the app server (not all app servers) can't parse and compile those pages, so a servlet is used to redirect the request down the DAS pipeline, where the page can be parsed and executed by the ATG page compilation mechanism.
In the case of the commerce pipeline:
The PipelineManager implements the commerce pipeline functionality by reading a pipeline definition file, commercepipeline.xml. When the application is deployed, Nucleus initializes the pricing engine, where OrderManager initializes the PipelineManager. The OrderManager.processOrder method invokes the pipeline chains in commercepipeline.xml. A pipeline chain has processors, which are simple Java classes doing small operations. This XML can be extended by adding a custom processor. In cases where a single chain needs to be invoked directly, call the runProcess method of PipelineManager, passing the chain ID.
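A minimal sketch of such a direct invocation (the chain ID "myChain", the parameter key, the order variable, and the getPipelineManager() accessor are all assumptions; in practice you'd inject /atg/commerce/PipelineManager into your component through a .properties file):

import java.util.HashMap;
import atg.service.pipeline.PipelineManager;
import atg.service.pipeline.PipelineResult;
import atg.service.pipeline.RunProcessException;

// somewhere inside one of your own Nucleus components:
PipelineManager pm = getPipelineManager();  // injected component property
HashMap params = new HashMap();
params.put("Order", order);                 // whatever your processors expect in pParam
try {
    PipelineResult result = pm.runProcess("myChain", params);
    if (result.hasErrors()) {
        // inspect result.getErrorKeys() / result.getError(key)
    }
} catch (RunProcessException rpe) {
    // log and handle the failure
}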
Extending the DAF/DAS pipelines and extending the commerce pipeline are not the same.
For DAF/DAS, we can create our own custom servlets and put them into the pipeline:
extend your own servlet class from either PipelineableServletImpl or InsertableServletImpl
and override the service method, depending on what you want to do. Further details are widely available on the internet :)
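For example, a skeleton pipeline servlet (a sketch; the class name is made up, and with InsertableServletImpl you splice it into the pipeline via its insertAfterServlet property in a .properties file):

import java.io.IOException;
import javax.servlet.ServletException;
import atg.servlet.DynamoHttpServletRequest;
import atg.servlet.DynamoHttpServletResponse;
import atg.servlet.pipeline.InsertableServletImpl;

public class MyAuditServlet extends InsertableServletImpl {
  public void service(DynamoHttpServletRequest pRequest,
                      DynamoHttpServletResponse pResponse)
      throws IOException, ServletException {
    // per-request work goes here
    if (isLoggingDebug()) {
      logDebug("Handling " + pRequest.getRequestURI());
    }
    // hand the request to the next servlet in the pipeline
    passRequest(pRequest, pResponse);
  }
}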
And coming to the commerce pipeline:
The commerce pipeline is defined in an XML file located at /B2CCommerce/config/atg/commerce/commercepipeline.xml, and the PipelineManager is responsible for loading the pipeline definition XML and initializing the pipeline chains. Write your processor class: a custom processor should be an implementation of the PipelineProcessor interface,
overriding its runProcess method; you also have to create the respective .properties file for your processor component. A minimal processor sketch (the names and the return code are assumptions):
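import atg.nucleus.GenericService;
import atg.service.pipeline.PipelineProcessor;
import atg.service.pipeline.PipelineResult;

// Hypothetical processor: one small unit of work per processor.
public class MyProcessor extends GenericService implements PipelineProcessor {
  public static final int SUCCESS = 1;

  // pParam is the object handed to PipelineManager.runProcess
  public int runProcess(Object pParam, PipelineResult pResult) throws Exception {
    // do the work here; add errors to pResult on failure
    return SUCCESS;  // must match a <transition returnvalue="..."> in the XML
  }

  // advertise the return codes this processor can produce
  public int[] getRetCodes() {
    return new int[] { SUCCESS };
  }
}

Then add the new link in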
B2CCommerce/config/atg/commerce/commercepipeline.xml
<pipelinechain name=" lastExistingchain" transaction="TX_REQUIRED" headlink=" lastExistinglink">
……..
<transition returnvalue="1" link=" sampleDemoLink"/>
</pipelinelink>
<pipelinelink name="sampleDemoLink" transaction="TX_REQUIRED">
<processor jndi="demo/atg/order/processor/MyProcessor"/>
</pipelinelink>
</pipelinechain>
Finally, restart the ATG server.
Coming to your other question, whether we can create our own pipeline manager:
The answer is yes.
Just create an /atg/registry/PipelineRegistry.properties file in your local config folder. PipelineRegistry is a service where all pipeline managers are registered; it has a property called pipelineManagers, and you just append your pipeline manager component to that property.
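For example (the component path is a placeholder):

# localconfig/atg/registry/PipelineRegistry.properties
pipelineManagers+=/mycompany/commerce/MyPipelineManager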
If you want to use the existing commerce PipelineManager class but with a different set of processors executing one after the other, create a definition XML file which looks something like this:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE pipelinemanager
PUBLIC "-//Art Technology Group, Inc.//DTD Dynamo Pipeline Manager//EN"
'http://www.atg.com/dtds/pipelinemanager/pipelinemanager_1.0.dtd'>
<pipelinemanager>
<!-- This chain updates (saves) an Order to the repository -->
<pipelinechain name="updateOrder" transaction="TX_REQUIRED" headlink="updateOrderObject">
<pipelinelink name="updateOrderObject" transaction="TX_MANDATORY">
<processor jndi="/atg/commerce/order/processor/SaveOrderObject"/>
<transition returnvalue="1" link="updateCommerceItemObjects"/>
</pipelinelink>
<pipelinelink name="updateCommerceItemObjects" transaction="TX_MANDATORY">
<processor jndi="/atg/commerce/order/processor/SaveCommerceItemObjects"/>
<transition returnvalue="1" link="updateShippingGroupObjects"/>
</pipelinelink>
<pipelinelink name="updateShippingGroupObjects" transaction="TX_MANDATORY">
<processor jndi="/atg/commerce/order/processor/SaveShippingGroupObjects"/>
<transition returnvalue="1" link="updateHandlingInstructionObjects"/>
</pipelinelink>
<pipelinelink name="updateHandlingInstructionObjects" transaction="TX_MANDATORY">
.......
.......
<pipelinechain name="rejectQuote" transaction="TX_REQUIRED" headlink="quoteRejection">
<pipelinelink name="quoteRejection" transaction="TX_MANDATORY">
<processor jndi="/atg/commerce/order/processor/RejectQuote"/>
</pipelinelink>
</pipelinechain>
<!-- This pipeline chain should be called when a requested quote is to be completed -->
<pipelinechain name="completeQuote" transaction="TX_REQUIRED" headlink="completeQuoteRequest">
<pipelinelink name="completeQuoteRequest" transaction="TX_MANDATORY">
<!-- this is a dummy processor that should be extended to save quote details -->
<processor jndi="/atg/commerce/order/processor/CompleteQuoteRequest"/>
</pipelinelink>
</pipelinechain>
</pipelinemanager>
Here you can mention your custom processors.
As you have registered your new pipeline manager in the PipelineRegistry, it gets initialized automatically. So when you do any operation in a JSP that relates to your pipeline, all the processing happens in the background.
There are two entirely separate things in ATG commonly known as pipelines.
They are not at all related to one another.
1. The Servlet Pipeline
This is a chain of servlets through which all requests pass before hitting your custom code (JSP page, Form Handler, Droplet or anything else). The purpose of this pipeline is to decorate the incoming request to provide the request with the context that the ATG framework needs to, for example, associate the request to a Dynamo Session and to load a User Profile. The DAS module defines the basic servlet pipeline and various modules add additional servlets into it.
You modify this pipeline by changing the nextServlet or related properties of existing components, and by creating Nucleus components that are instances of classes extended from PipelineableServlet.
You would choose to customise this if you wanted to perform an action or make a decision on each incoming request, somewhat similar to what you would use Filters for in standard J2EE web applications or Interceptors in a Spring MVC application.
You can see what is defined in your servlet pipeline by looking at the /atg/dynamo/servlet/dafpipeline/DynamoHandler component in the Dynamo Admin interface.
2. Processor Chains
A processor chain is a way of executing discrete processes and linking them together based on the outcome (resulting status code) of each process. There is a component in the Nucleus called the PipelineManager which holds the configuration of each processor chain defined in the system and manages the execution of these chains.
Processor chains are used by the Commerce module to manage, for example, the creation of an order to ensure referential integrity between all the constituent parts. This is sometimes called The Commerce Pipeline.
Processor chains are conceptually more related to scenarios, and completely unrelated to the servlet pipeline.
You modify a processor chain by creating or editing the XML file that defines the chain. You create new processors by creating a Nucleus component from a class that implements the PipelineProcessor interface.
You can see which chains are defined by looking at the PipelineManager component in the Dynamo admin interface. There is also a graphical Pipeline Editor in the ACC.
You would modify an existing pipeline chain if you want to add or remove a step in it.
You would create a new chain if you wanted to split a long and complex process - usually repository operations - into multiple discrete steps to be sequenced together by an externalised process flow.
I'm trying to build a REST service in a Sitecore root. My application start looks like this:
void Application_Start(object sender, EventArgs e)
{
    RouteTable.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = System.Web.Http.RouteParameter.Optional });
}
And my URL looks like this:
http://{mydomain}/api/books
I have the correct controller and all that.
But Sitecore keeps redirecting me to the 404 page. I've added the path to the IgnoreUrlPrefixes node in the web.config, but to no avail. If I had to guess, I'd think that Sitecore's handler is redirecting before my code gets the chance to execute, but I really don't know.
Does anybody have any idea what might be wrong?
Your assessment is correct. You need a processor in the httpRequestBegin pipeline to abort Sitecore's processing. See the SystemWebRoutingResolver in this answer:
Sitecore and ASP.net MVC
It's also described in this article:
http://www.sitecore.net/Community/Technical-Blogs/John-West-Sitecore-Blog/Posts/2010/10/Sitecore-MVC-Crash-Course.aspx
But I'll include the code here as well. :)
using System.Web;
using System.Web.Routing;

public class SystemWebRoutingResolver : Sitecore.Pipelines.HttpRequest.HttpRequestProcessor
{
    public override void Process(Sitecore.Pipelines.HttpRequest.HttpRequestArgs args)
    {
        // If ASP.NET routing has a route for this URL, stop Sitecore's
        // item resolution and let the routed handler take over.
        RouteData routeData = RouteTable.Routes.GetRouteData(new HttpContextWrapper(args.Context));
        if (routeData != null)
        {
            args.AbortPipeline();
        }
    }
}
Then in your httpRequestBegin configuration:
<processor type="My.SystemWebRoutingResolver, My.Classes" />
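If you would rather patch it in through an include file than edit web.config directly, something like this should work (the patch:before placement is an assumption about where you want the resolver to run; before Sitecore's ItemResolver is typical):

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <pipelines>
      <httpRequestBegin>
        <processor type="My.SystemWebRoutingResolver, My.Classes"
                   patch:before="processor[@type='Sitecore.Pipelines.HttpRequest.ItemResolver, Sitecore.Kernel']"/>
      </httpRequestBegin>
    </pipelines>
  </sitecore>
</configuration>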
You might want to have a look at the Sitecore Web API.
It's pretty much the same as what you are building.
Another option, which I've used to good effect, is to use the content tree, the "star" item, and a sublayout/layout combination dedicated to this purpose:
[siteroot]/API/*/*/*/*/*/*/*/*/*
The above path allows you to have anywhere between 1 and 9 segments; if you need more than that, you probably need to rethink your process, IMO. This also retains all of the Sitecore context. When Sitecore is unable to find an item in a folder, it attempts to look for the catch-all star item and, if present, renders that item instead of returning a 404.
There are a few ways to go about doing the restful methods and the sublayout (or sublayouts if you want to segregate them by depth to simplify parsing).
You can choose to follow the general "standard" and use GET, PUT, and POST calls to interact with these items, but then you can't use Sitecore caching without custom back-end caching code. Alternately, you can split your API into three different trees:
[siteroot]/API/GET/*/*/*/*/*/*/*/*/*
[siteroot]/API/PUT/*/*/*/*/*/*/*/*/*
[siteroot]/API/POST/*/*/*/*/*/*/*/*/*
This allows caching the GET requests (since GET requests should only retrieve data, not update it). Be sure to use the proper caching scheme: essentially it should cache on every permutation of the data, user, etc., if you intend to use this in any of those contexts.
If you are going to create multiple sublayouts, I recommend creating a base class that handles the general methods for GET, PUT, and POST, and then using it as the base for your sublayouts.
In your sublayouts, you simply get the Request object, get the path (and the query string if you're using queries), split it, and perform your switch-case logic just as you would with standard routing. For PUT, read the request body with Request.BinaryRead() (or from Request.InputStream). For POST, use the Request.Form object to get all of the form elements and iterate through them to process the information provided. It may be easiest to put all of your form data into a single JSON object, encapsulated as a string (so .NET sees it as a string and therefore one single property); then you only have one element in the post to deserialize, depending on the POST path the user specified.
Complicated? Yes. Works? Yes. Recommended? Well... if you're in a shared environment (multiple sites) and you don't want this processing happening for EVERY site in the pipeline processor, then this solution works. If you have access to using MVC with Sitecore or have no issues altering the pipeline processor, then that is likely more efficient.
One benefit to the content-based method is that the context lifecycle is exactly the same as for a standard Sitecore page (logins, etc.), so you've got all the same controls as any other item would provide at that point in the lifecycle. The negative is that you have to deal with the entire page lifecycle load before it gets to your code; the pipeline processor can skip a lot of Sitecore's processing and just get the data you need directly, making it faster.
You need to have a pipeline initializer for routing.
It will be something like:
using System.Web.Http;
using System.Web.Routing;
using Sitecore.Pipelines;

public class Initializer
{
    // Runs once at startup from Sitecore's <initialize> pipeline.
    public void Process(PipelineArgs args)
    {
        RouteCollection routes = RouteTable.Routes;
        routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{action}/{id}",
            defaults: new { id = RouteParameter.Optional });
    }
}
On config file you will have :
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
<sitecore>
<pipelines>
<initialize>
<processor type="_YourNameSpace.Initializer,_YourAssembly" />
</initialize>
</pipelines>
</sitecore>
</configuration>
Happy coding