I know that you can publish individual page components, such as DCRs and graphics, through a TeamSite workflow, but does anyone know if it's possible to publish page metadata?
If your question is about the ability to deploy metadata to runtime LiveSite servers: yes, you can.
How?
You need a LiveSite Content Services (LSCS runtime) instance installed on the runtime server; it stores the deployed metadata and content.
A prerequisite is to index the metadata on the TeamSite authoring (search) server.
Then use the out-of-the-box PLC (Publish LiveSite Content) workflow to deploy both static content (to the LSDS server) and metadata (to the LSCS store/runtime), based on the configurations you have defined.
Once the metadata is deployed to LSCS, you can query the metadata/EA to build dynamic content experiences (e.g., return all content for the year 2014), aggregating and segregating content based on your business needs.
There is indeed an author-submit-with-metadata workflow out of the box in your /TeamSite/local/config/wft/solutions folder.
I've created a process to send an email to the user on order confirmation.
The problem is that on the DEV environment everything works well, but when I deployed to the UAT server
I got an exception during task execution ("Media not found (requested media location: hf0/h27/8861015965726.bin)").
Any ideas what could be happening?
What causes this issue and how can it be resolved?
hybris creates emails using Velocity templates. Those Velocity templates are stored as medias on the hybris servers. A hybris media consists of two parts: an entry in the respective table in the database and a file on the hard drive. The database entry stores metadata about the media, while the file stores the actual content.
What hybris is telling you is that the file on the hard drive is missing: the database entry points to a file that does not exist. There could be a lot of reasons why that file is missing:
It was deleted during deployment.
It wasn't created during deployment.
The hybris server has no access/access rights to that directory.
In a clustered environment the file could have been stored on another node and is not accessible on the current node.
The media could be the email itself, as Johannes stated, but it can also be part of the email (an image set from the CMS cockpit, for example).
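The "DB entry plus file" split described above can be checked directly on disk. Here is a minimal, self-contained sketch; the media root path is a hypothetical example (the real hybris media folder depends on your data directory and storage configuration), and only the location string is taken from the error message in the question:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class MediaFileCheck {
    /** Resolves a media "location" (as stored in the DB row) against the media root. */
    static Path resolve(Path mediaRoot, String location) {
        return mediaRoot.resolve(location);
    }

    public static void main(String[] args) {
        // hypothetical media root; on a real server this lives under the hybris data directory
        Path mediaRoot = Paths.get("/opt/hybris/data/media/sys_master");
        // location taken from the exception in the question
        String location = "hf0/h27/8861015965726.bin";
        Path file = resolve(mediaRoot, location);
        System.out.println("Expected file: " + file);
        System.out.println("Exists on this node: " + Files.exists(file));
    }
}
```

If the file is missing on the node that renders the email, you have found the broken half of the media; in a cluster, repeat the check on each node.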
To fix this issue you have to master your ImpEx flows.
First, be sure that the ImpEx files contain all the data needed to create the email properly.
Then know exactly what is imported when you deploy and update your system:
Be sure that mandatory files are imported during initialization.
Be sure that data which can be managed by webmasters is not reset by ImpEx during an update.
If data is created during an update (because initialization is already done), be sure it won't be replayed after each subsequent update.
As the media file is not found, you can:
1. Go to hMC -> Multimedia -> Media and open the search panel.
2. In the "search additional attributes" dropdown, select "PK of file".
3. Search using "8861015965726" as the PK of the file.
Then you can find out which file is missing, and import an ImpEx or upload the file via hMC to fix the problem.
As posted earlier in a thread about syncing data from on-premises MySQL to Azure SQL (referring to this article), I found that the Lookup component for watermark detection is only available for SQL Server.
So I tried a workaround: in the Copy data flow task, pick the data greater than the last stored watermark from MySQL.
Issue:
I am able to validate the package successfully, but not able to publish it.
Question:
In the Copy data flow task I'm using the query below to get data from MySQL greater than the available watermark.
Can't we use a query like this on other relational sources such as MySQL?
select * from #{item().TABLE_NAME} where #{item().WaterMark_Column} > '#{activity('LookupOldWaterMark').output.firstRow.WatermarkValue}'
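At runtime the expression above resolves to a plain SQL statement sent to the source. As an illustrative sketch of that substitution (the table name, watermark column, and watermark value below are sample values standing in for `item().TABLE_NAME`, `item().WaterMark_Column`, and the lookup output, not taken from a real pipeline):

```java
public class WatermarkQuery {
    /** Builds the incremental-load query the ADF expression would resolve to. */
    static String build(String table, String watermarkColumn, String lastWatermark) {
        return String.format("select * from %s where %s > '%s'",
                table, watermarkColumn, lastWatermark);
    }

    public static void main(String[] args) {
        // sample values for illustration only
        System.out.println(build("orders", "updated_at", "2019-01-01 00:00:00"));
    }
}
```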
(Screenshots: CopyTask SQL query preview; validation succeeds; publish fails with an error carrying no details; debug run succeeds.)
Update, after following the steps mentioned by Franky:
Azure SQL linked service error (resolved by reconfiguring the connection / editing credentials in the connection tab).
Source query went blank (resolved by re-selecting the source type and rewriting the query).
Could you verify whether you have access to create a template deployment in the Azure portal?
1) Export the ARM template: in the top-right of the ADFv2 portal, click ARM Template -> Export ARM Template, extract the zip file, and copy the content of the "arm_template.json" file.
2) Create an ARM template deployment: go to https://portal.azure.com/#create/Microsoft.Template and log in with the same credentials you use in the ADFv2 portal (you can also reach this page from the Azure portal by clicking "Create a resource" and searching for "Template deployment"). Click "Build your own template in editor", paste the ARM template from the previous step into the editor, and save.
3) Deploy the template: click "Existing resource group" and select the same resource group your Data Factory is in. Fill out the missing parameters (for this test it doesn't really matter whether the values are valid); the factory name should already be there. Agree to the terms and click Purchase.
4) Verify that the deployment succeeded. If not, let me know the error; it might be an access issue, which would explain why your publish fails. (The ADF team is working on providing a better error for this issue.)
Did any of the objects publish into your Data Factory?
I am using Sitecore 7.2 and have installed "Web Forms for Marketers 2.4 rev. 141008" successfully.
I then created a new form with some fields (single-line, telephone, email). When I submit the form it throws the error "We experienced a technical difficulty while processing your request. Your data may not have been correctly saved."
For Sitecore 7.2:
In App_Config\Include\Sitecore.Forms.config you need to configure a forms database (<formsDataProvider ...>...).
The installer places a SQL Server database backup (Sitecore_WebForms.bak) in the (webroot)/Data folder; you can restore it in SQL Server and use it.
For your CM you can also use web services (for a Create Item action, run on master); this can be set in the connectionStrings.
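As a rough sketch, after restoring the backup the connection string entry might look like the following; the entry name, server, and credentials here are all assumptions for illustration, so check the WFFM documentation for your exact version:

```xml
<!-- hypothetical entry in ConnectionStrings.config; adjust name, server, credentials -->
<add name="wfm"
     connectionString="user id=sa;password=YOUR_PASSWORD;Data Source=.\SQLEXPRESS;Database=Sitecore_WebForms" />
```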
The problem:
When we create a workflow on UAT/Stage environment we need to import it to Production environment. Then we need to change the environment URLs (Web Service calls and such) and email addresses.
Is it possible to store URLs and emails in some global configuration from which the Nintex workflow would pick them up, so that whenever we redeploy the workflow to production we wouldn't need to go into each step and edit its settings?
Got an email from Nintex Support:
Dear Jakub,
Thank you for your e-mail.
The only suggestion I have is to use workflow constants. You can configure workflow constants in Central Administration > Application Management > Manage Workflow Constants.
In your workflow you use a lookup reference which points to the appropriate workflow constant. As long as the workflow constant name is the same in both environments, the workflow will pick it up, while the constant itself contains the URL relevant to that environment.
Hope this helps.
Kind Regards,
Simon Muntz
Software Support Engineer - Team Lead
Nintex Workflow for Everyone® | nintex.com
Alternatively, use a list on your site as a configuration settings table. Set up a list of name/value pairs, then at runtime your workflow can query the list by name and retrieve the required value.
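The pattern behind that list-based approach is a simple name/value lookup. A minimal sketch of the idea, with the SharePoint list simulated by an in-memory map (the entry names and URLs below are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class ConfigList {
    private final Map<String, String> entries = new HashMap<>();

    void put(String name, String value) { entries.put(name, value); }

    /** Looks up a setting by name, as the workflow would query the list at runtime. */
    String get(String name) {
        String value = entries.get(name);
        if (value == null)
            throw new IllegalArgumentException("No config entry named: " + name);
        return value;
    }

    public static void main(String[] args) {
        ConfigList config = new ConfigList();
        // same entry names in every environment, environment-specific values
        config.put("OrderServiceUrl", "https://prod.example.com/orders.asmx");
        config.put("NotifyEmail", "ops@example.com");
        System.out.println(config.get("OrderServiceUrl"));
    }
}
```

Because the workflow only references the entry names, redeploying it to another environment needs no per-step edits; only the list values differ.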
In JBoss 5.1 the Profile Service does what the Deployment Service did in JBoss 4.x. In JBoss 4.x I was using the Deployment Service to create a datasource "on the fly", and I was wondering whether I can do the same thing using the Profile Service (since the Deployment Service no longer exists in JBoss 5.x).
Does anyone know a practical guide on using the ProfileService?
Thank you,
Regards.
I don't know of any guide, but I can share my experience using the Profile Service and a few links to JBoss wiki pages on this topic. I'd like to post more links, but the spam protection doesn't allow me to post more than two; you should easily find the other ProfileService pages in the wiki. Don't be surprised if you don't find much; there isn't more.
ProfileService ManagementView
http://community.jboss.org/wiki/ProfileServiceManagementView
ProfileService DeploymentTemplates
http://community.jboss.org/wiki/ProfileServiceDeploymentTemplates
There you'll find useful information about the ProfileService, but no detailed information is available in the JBoss wiki as far as I can tell.
In order to create datasources on the fly you can use the DeploymentTemplates (also used for creating message queues and topics). The last link provides information on how to use the templates, but not all the template names and their properties. You can list them programmatically, though:
// Get all templates and list their properties
for (String template : mgtView.getTemplateNames()) {
    System.out.println("=========================================");
    System.out.println("Listing properties for template: " + template);
    DeploymentTemplateInfo info = mgtView.getTemplate(template);
    for (String prop : info.getProperties().keySet())
        System.out.println("- " + prop);
}
In order to get the ManagementView (mgtView) from an external Java program you can use something similar to this:
// set security policy
System.setProperty("java.security.policy", "<path_to_policy_file>");
System.setSecurityManager(new RMISecurityManager());

// set initial context properties
Hashtable<String, String> env = new Hashtable<String, String>();
env.put("java.naming.factory.initial", "org.jnp.interfaces.NamingContextFactory");
env.put("java.naming.provider.url", "jnp://localhost:1099");
env.put("java.naming.factory.url.pkgs", "org.jboss.naming:org.jnp.interfaces");
Context ctx = new InitialContext(env);

// login to JBoss
SecurityClient client = SecurityClientFactory.getSecurityClient();
client.setSimple("admin", "admin");
client.login();

// get the ProfileService and its ManagementView
ProfileService ps = (ProfileService) ctx.lookup("ProfileService");
ManagementView mgtView = ps.getViewManager();
What you want to get is the Java Naming Context (InitialContext). For that you'll need a security policy (you can use the java.policy file located in JBOSS_HOME/server/SERVER_DIR/conf/), a security manager, and environment properties to get the context. The java.naming.provider.url specifies the location of the JBoss naming service (the default port is 1099).
Usually you would have to authenticate at this point, which is done with the SecurityClient.
Finally, you can use the context to grab the ProfileService.
At this point most of the annoying stuff is done and you can start playing around.
getViewManager() returns the ManagementView, with which you can create datasources on the fly, and getDeploymentManager() gives you the DeploymentManager, with which you can deploy, undeploy, start, and stop applications and other deployments.
The libraries you'll need to do that are located in
JBOSS_HOME/client
JBOSS_HOME/lib
JBOSS_HOME/common/lib
I've read several times that including the jbossall-client.jar from the client directory should be enough, but that's actually not true. As far as I can tell you need libraries from all three directories (at least, I couldn't get it to work without referencing all of them). I haven't figured out exactly which jars you need, though.
IMPORTANT: The ProfileService in JBoss 5 Community Edition has some bugs which were fixed in JBoss 6. I'd suggest either using a newer JBoss version or the Enterprise Edition.