Delta imports in Oracle ATG Commerce

Is there an out-of-the-box way to import daily product/SKU delta data into ATG Content Administration (CA) for subsequent deployment to a production target? How is this normally done? It seems like it would be a requirement on every project.
I know about the Repository Loader scripts, but I'm not sure they fit the bill, as I only want deltas.

As of ATG 10.2.0.2, no. APIs exist for programmatically creating projects and for querying the current data; however, custom code needs to be written to tie them together.
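A minimal sketch of the querying half, using the standard ATG RQL repository API. The lastModifiedDate property is an assumption; the out-of-the-box catalog may not track it, so substitute whatever audit property your item descriptors actually define:

    import java.sql.Timestamp;

    import atg.nucleus.GenericService;
    import atg.repository.Repository;
    import atg.repository.RepositoryException;
    import atg.repository.RepositoryItem;
    import atg.repository.RepositoryView;
    import atg.repository.rql.RqlStatement;

    /**
     * Finds products modified since the last import run.
     */
    public class DeltaProductFinder extends GenericService {

        private Repository mProductCatalog; // wire to /atg/commerce/catalog/ProductCatalog

        public RepositoryItem[] findChangedSince(Timestamp pLastRun) throws RepositoryException {
            RepositoryView view = mProductCatalog.getView("product");
            // Assumed audit property; adjust to match your item descriptor.
            RqlStatement stmt = RqlStatement.parseRqlStatement("lastModifiedDate > ?0");
            RepositoryItem[] changed = stmt.executeQuery(view, new Object[] { pLastRun });
            return changed != null ? changed : new RepositoryItem[0];
        }

        public void setProductCatalog(Repository pRepository) { mProductCatalog = pRepository; }
        public Repository getProductCatalog() { return mProductCatalog; }
    }

The project-creation half would then wrap the changed assets in a CA project via the project APIs; that glue is the custom code referred to above.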

Related

Fetch all metadata of Salesforce

I've been trying to implement a way to download all the changes made by a particular user in Salesforce using a PowerShell script and to create a package from them. The changes could be anything, added or modified: Apex classes, profiles, Accounts, etc., filtered by last-modified user, component ID, timestamp, and so on. Below is the URL for the API that exposes this metadata, but the documentation does not explain how to do this from a script.
https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_listmetadata.htm
Does anyone know how I can implement this?
Regards,
Kramer
Salesforce orgs other than scratch orgs do not currently provide source tracking, the feature that makes it possible to pinpoint user changes in metadata and extract only those changes. Source tracking is consumed by an SFDX/Metadata API client, like Salesforce DX or CumulusCI (disclaimer: I'm on the CumulusCI team).
I would not try to implement a Metadata API client in PowerShell; instead, harness one of the existing tools to do so.
Salesforce orgs other than scratch orgs don't provide source tracking at present. To identify user changes, you can either:
Attempt to extract all metadata and diff it against your version control, which is considerably harder than it sounds and is implemented by a variety of commercial DevOps tools for Salesforce (Gearset, Copado, etc.).
Have the user manually add components to a Change Set or Unmanaged Package, and use a Metadata API client as above to retrieve the contents of that package. (Little-known fact: a Change Set can be retrieved as a package; see the sketch after this answer.)
To emphasize: DevOps on Salesforce does not work like it does on other platforms. Working with the Metadata API requires a fair amount of time investment and specialization. Harness the existing work of the Salesforce community where you can, but be aware that the task you are laying out may be rather more involved than you think, and it's not necessarily something you can just throw together from off-the-shelf components.
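For illustration, a rough Java sketch of the second option, using Salesforce's WSC-generated Metadata API client. The connection setup is omitted, the change set name is a placeholder, and real code should back off and time out while polling:

    import java.io.FileOutputStream;

    import com.sforce.soap.metadata.AsyncResult;
    import com.sforce.soap.metadata.MetadataConnection;
    import com.sforce.soap.metadata.RetrieveRequest;
    import com.sforce.soap.metadata.RetrieveResult;
    import com.sforce.soap.metadata.RetrieveStatus;

    public class ChangeSetRetriever {

        // conn is assumed to be an already-authenticated MetadataConnection
        public static void retrieveChangeSet(MetadataConnection conn) throws Exception {
            RetrieveRequest request = new RetrieveRequest();
            request.setApiVersion(52.0);
            // A Change Set name works here just like an unmanaged package name.
            request.setPackageNames(new String[] { "My Change Set" });

            AsyncResult asyncResult = conn.retrieve(request);

            // Poll until the retrieve finishes.
            RetrieveResult result;
            do {
                Thread.sleep(5000);
                result = conn.checkRetrieveStatus(asyncResult.getId(), true);
            } while (!result.isDone());

            if (result.getStatus() == RetrieveStatus.Failed) {
                throw new RuntimeException(result.getErrorMessage());
            }

            // The payload is a zip of the package contents.
            try (FileOutputStream out = new FileOutputStream("changeset.zip")) {
                out.write(result.getZipFile());
            }
        }
    }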

Real time access to attributes in ATG/Endeca integration

We are working with the ATG/Endeca integration, version 11.2. One of the client's requirements is near-real-time access to some of the attributes in the Product Catalog. However, this isn't possible as things stand, since a baseline has to be run before the Endeca index is updated.
Is the only solution to query ATG directly, using droplets, for the critical attributes, or is there another way to accomplish this? We are pushing the product catalog to CAS using ProductCatalogSimpleIndexingAdmin (no Forge is used).
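For reference, the droplet fallback the question mentions might look roughly like this. It is only a sketch; the component path and the property name are illustrative, not part of the product:

    import java.io.IOException;
    import javax.servlet.ServletException;

    import atg.repository.Repository;
    import atg.repository.RepositoryException;
    import atg.repository.RepositoryItem;
    import atg.servlet.DynamoHttpServletRequest;
    import atg.servlet.DynamoHttpServletResponse;
    import atg.servlet.DynamoServlet;

    /**
     * Reads a volatile attribute straight from the ProductCatalog
     * repository, bypassing the Endeca index.
     */
    public class LiveAttributeDroplet extends DynamoServlet {

        private Repository mProductCatalog; // wire to /atg/commerce/catalog/ProductCatalog
        private String mPropertyName = "someVolatileAttribute"; // illustrative property

        public void service(DynamoHttpServletRequest pRequest,
                            DynamoHttpServletResponse pResponse)
                throws ServletException, IOException {
            String productId = pRequest.getParameter("productId");
            try {
                RepositoryItem product = mProductCatalog.getItem(productId, "product");
                if (product != null) {
                    pRequest.setParameter("liveValue", product.getPropertyValue(mPropertyName));
                    pRequest.serviceParameter("output", pRequest, pResponse);
                }
            } catch (RepositoryException re) {
                if (isLoggingError()) logError(re);
            }
        }

        public void setProductCatalog(Repository pRepository) { mProductCatalog = pRepository; }
        public Repository getProductCatalog() { return mProductCatalog; }
        public void setPropertyName(String pName) { mPropertyName = pName; }
        public String getPropertyName() { return mPropertyName; }
    }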

How to create/insert a product programmatically in WebSphere Commerce 7 (WCS7)

I'm developing an ecommerce site based on WebSphere Commerce 7 (WCS7). I need to import products from an external supplier, which exposes a web service. I've already implemented a Controller Command that performs all the operations needed to extract the products from the remote service, and I have them available as custom Java classes.
I'm a little confused about the approach I should follow here. I've defined the attributes needed in my scenario and used the Data Load utility to import them into the DB. What should I do next? I expect to be able to create WCS products programmatically from my Controller Command, but I don't know how to use the attributes I've defined in a programmatic insert.
Can someone point me in the right direction on how to perform this kind of operation? I went through the documentation, but, given that I'm quite new to the WCS environment, I don't know how to proceed according to current best practices.
It is possible to create a new catalog entry programmatically if you copy what is done in the LOBTools, though I have not done this myself. I have always added new products via the data loads, and when we did need to add products from an external service, I just wrote the information out to a file and loaded it along with our other products. The reason was to keep the catalog in sync with the product management system.
Have a look at the various DataBean classes in WCS, like CatalogEntryDataBean.
See here for WCS data beans: Link
And here for "activation" of a DataBean: Link
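To make the DataBean suggestion concrete, a minimal sketch of activating CatalogEntryDataBean (the catalog entry ID is a placeholder, and DataBeanManager.activate needs a request in scope). Note that DataBeans are read-side helpers; for the insert itself you would still follow the LOBTools or dataload route above:

    import javax.servlet.http.HttpServletRequest;

    import com.ibm.commerce.beans.DataBeanManager;
    import com.ibm.commerce.catalog.beans.CatalogEntryDataBean;

    public class CatalogEntryLookup {

        // Reads a catalog entry by ID; "12345" is a placeholder.
        public static String lookupPartNumber(HttpServletRequest request) throws Exception {
            CatalogEntryDataBean bean = new CatalogEntryDataBean();
            bean.setCatalogEntryID("12345");
            DataBeanManager.activate(bean, request); // populates the bean from the DB
            return bean.getPartNumber();
        }
    }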

Custom export/import for Alfresco

The obvious question: is there any solution for exporting Alfresco contents that match a custom condition, for example files whose creation date falls within a given date range?
The goals of this solution are:
1- to keep the volume of data in each export/import action to a minimum
2- to avoid duplicate records on import during my weekly or monthly export/import runs against the backup Alfresco server
Thanks a lot for any kind of help.
One idea is to use a library like OpenCMIS (Java) or cmislib (Python), both available from the Apache Chemistry project, and then use a CMIS query to restrict the data you want to export to a certain date range. If you want examples of CMIS queries, including ones that use date ranges, take a look at this Java example.
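A short OpenCMIS sketch of that idea against Alfresco's CMIS 1.1 AtomPub binding (the URL, credentials, and date range are placeholders):

    import java.util.HashMap;
    import java.util.Map;

    import org.apache.chemistry.opencmis.client.api.QueryResult;
    import org.apache.chemistry.opencmis.client.api.Session;
    import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
    import org.apache.chemistry.opencmis.commons.SessionParameter;
    import org.apache.chemistry.opencmis.commons.enums.BindingType;

    public class DateRangeExport {
        public static void main(String[] args) {
            Map<String, String> params = new HashMap<>();
            params.put(SessionParameter.USER, "admin");
            params.put(SessionParameter.PASSWORD, "admin");
            params.put(SessionParameter.ATOMPUB_URL,
                    "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom");
            params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());
            params.put(SessionParameter.REPOSITORY_ID, "-default-");

            Session session = SessionFactoryImpl.newInstance().createSession(params);

            // CMIS QL: documents created within the given range.
            String query = "SELECT cmis:objectId, cmis:name FROM cmis:document "
                    + "WHERE cmis:creationDate >= TIMESTAMP '2023-01-01T00:00:00.000Z' "
                    + "AND cmis:creationDate <  TIMESTAMP '2023-02-01T00:00:00.000Z'";

            for (QueryResult hit : session.query(query, false)) {
                System.out.println(hit.getPropertyValueByQueryName("cmis:objectId")
                        + " " + hit.getPropertyValueByQueryName("cmis:name"));
                // fetch each object and stream its content out here
            }
        }
    }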
Another idea would be to use CMIS change tokens. With this approach, you ask Alfresco what has changed since the last time your code ran, Alfresco responds with a set of changes, and you iterate over those changes and process them accordingly. The CMIS & Apache Chemistry in Action book has a change token example that uses Python to run a polling sync server between two CMIS repositories. The source code lives here.
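A rough sketch of the change-token approach with OpenCMIS, reusing a Session like the one above. Persisting the token between runs is up to you; the field here stands in for that:

    import org.apache.chemistry.opencmis.client.api.ChangeEvent;
    import org.apache.chemistry.opencmis.client.api.ChangeEvents;
    import org.apache.chemistry.opencmis.client.api.Session;

    public class ChangePoller {

        private String latestToken; // null on the very first run; persist between runs

        public void pollChanges(Session session) {
            ChangeEvents events = session.getContentChanges(latestToken, true, 1000);
            for (ChangeEvent event : events.getChangeEvents()) {
                // change types: CREATED, UPDATED, SECURITY, DELETED
                System.out.println(event.getChangeType() + " " + event.getObjectId());
            }
            latestToken = events.getLatestChangeLogToken(); // save this for the next run
        }
    }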
Both of these options use CMIS. If you would rather have a native Alfresco option, you could write a custom action that runs on a schedule to perform the export. Or you could use the File Transfer Service to write files to a file system on a schedule.
If what you are really trying to do is back up your repository, don't use any of these options. Instead, follow standard practice for backing up the repo, which is to dump the database and back up the content store.
Maybe you can use Alfresco Replication Jobs to export your contents into a different repository.
In addition, you can export the contents to a file system using the FSTR feature.
Replication jobs use the Alfresco Transfer Service, which can be customised to transfer only certain kinds of content.

ASP.NET MVC 2.0 / EF 4.0 site deployment and maintenance

My small team used the ASP.NET MVC 2.0 / Entity Framework 4.0 (model-first approach) / Windows Server 2008 R2 / SQL Server 2008 R2 stack in our web site project. We've already completed development and have reached the deployment stage. Here we are faced with a problem: OK, we'll use the VS2010 features for the initial server/DB deployment, but what do we do in the future? Some of our models will obviously be modified after publishing in order to satisfy new requirements, and of course the production DB will contain user data, articles, etc. Is there any approach for applying model modifications to the server DB without dropping it and converting the data from the old instance to the new one?
So far we have found only the DAC/DACPAC approach for updating the server DB, but we don't know how to tie automatic EF model generation to DAC.
Maybe another solution exists? Is there a standard way to resolve this kind of situation? Any advice?
Thanks
I'd be interested to know if you have found a solution to this yet?
Have you tried simply generating a database from your EF model and using a schema comparison tool such as SQL Compare to deploy the changes from the EF-generated database to your target production server?