Facebook campaign structure changes - facebook

I'm using bulk import in Power Editor to create campaigns. How can I know what my Excel file should look like when the campaign structure changes, as described in this news post: http://www.jonloomer.com/2014/08/19/new-facebook-campaign-structure/

There will be two extra columns, "Targeting Level" and "Pricing Level", which determine whether those settings live at the ad or ad set level. If you don't include these columns, Power Editor will do its best to guess the level correctly: if you're updating an existing ad/ad set, it will keep targeting/pricing where it is; if you're creating a new ad/ad set, it will set targeting/pricing at the ad set level. This applies if you already have access to the migration.
A good practice is to export the data after your ad account is migrated and use that spreadsheet for upload.
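Purely as an illustration, the header row of such a bulk sheet might look like the sketch below. Only "Targeting Level" and "Pricing Level" come from the announcement; the other column names are placeholders, and an export from a migrated account is the authoritative reference:

```
Campaign Name, Ad Set Name, Ad Name, Targeting Level, Pricing Level, ...
```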


Creating configuration pane for Azure DevOps extension

We have created a few extensions for Azure DevOps (0); they are pipeline extensions.
We are trying to create new extensions that can react to changes in work items in Azure Boards. The APIs for interacting with work item changes are fairly straightforward, but we are struggling with the configuration of the extension.
Essentially we need to allow users to configure the extension at two levels:
1) At "Organization level"
It should be possible for a user (Project Administrator) to configure parameters such as "external system URL" etc. An example of this could be something like the mockup below:
2) At "Project level"
For each project in Azure DevOps, an admin should be able to configure parameters like "Enable/disable extension" or "External UID" etc. An example of this could be something like the mockup below:
When the extension reacts to "Work item saved", it will query the parameters at both levels to figure out what to do.
My problem is: where the heck do I save this information? I could add a number of "custom fields" to the template in use, but since fields can only be added to work item types, that is really not ideal.
Where can I save this information through the APIs?
PS: The source code for our extensions is available as OSS (Apache license) here:
(0) https://bitbucket.org/projectum/
Thank you :-)
It turns out that Azure DevOps has a way to store data for extensions. It can store data at both Project Collection scope and User scope.
I think I will be able to use this to store the data I need. All I need now is to figure out where to put the UI that the user or admin will use to maintain this data.
https://learn.microsoft.com/en-us/azure/devops/extend/develop/data-storage?view=vsts
:-)
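Based on that data-storage service, here is a minimal sketch. The VSS calls are from the documented vss-web-extension-sdk, but the keys and settings shapes are made up, and keying by project id is just one workaround since the service has no built-in per-project scope:

```typescript
// Illustrative sketch: persist "organization level" and "project level"
// settings with the extension data service. Keys/shapes are invented.
VSS.getService(VSS.ServiceIds.ExtensionData).then((dataService: IExtensionDataService) => {
    // The default scope is the Project Collection - fits organization-wide settings
    dataService.setValue("orgSettings", { externalSystemUrl: "https://external.example" });

    // No built-in project scope exists, so one workaround is to key by project id
    const projectId = VSS.getWebContext().project.id;
    dataService.setValue("projectSettings/" + projectId, { enabled: true, externalUid: "uid-123" });

    // Values are read back the same way
    dataService.getValue("projectSettings/" + projectId).then((s) => console.log(s));
});
```

The settings UI itself would then live in a contributed hub or admin page that calls this service.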
For #1, it looks somewhat like what you can configure under a pipeline's service connections. That is per project, though, not at the organization level.
This might be easier to manage outside of an extension and instead just use a service hook to call some middle tier that accomplishes what you want.
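To illustrate the service-hook route, here is a minimal sketch of such a middle tier. The endpoint path and port are arbitrary, and the payload fields follow the documented "workitem.updated" event shape:

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Azure DevOps service hook configured to POST "work item updated" events here
app.post("/hooks/workitem-updated", (req, res) => {
    const event = req.body;
    if (event.eventType === "workitem.updated") {
        // Look up your own org/project configuration, then act on the change
        console.log(`Work item ${event.resource.workItemId} changed`);
    }
    res.sendStatus(200);
});

app.listen(3000);
```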

Kentico Import Tool inconsistent/buggy when updating documents

I've had a number of problems using the provided Kentico Import Toolkit, namely when using the "Import new and overwrite existing pages" option to update my existing/already-imported pages. I'm using a custom SQL query to import, and I've saved a profile for each import I've needed (the client has an article-based site, so there are a few tables of similar information) to keep each import as consistent as possible.
Here's the problems I've encountered thus far (in no particular order):
the tool tries to guess which fields from the query correlate to the fields of the page type in Kentico, which is a nice idea but seems poorly implemented. If I'm not very careful and don't reload the profile every time I import, I've had instances where fields changed inexplicably during test imports because the tool thought it knew which field I wanted
this is more of a problem when importing/reimporting multiple times in a session and choosing to go back and load the same profile (without reloading)
the NodeAlias field is seemingly required only on update/reimport rather than on initial import. I'm sure there's internal cleaning of the document's title to generate a NodeAlias, and it is generated fine when importing documents while NOT providing the NodeAlias. After importing the items initially and wishing to update, however, the NodeAlias is seemingly required, as you'll get errors asking for it to be included. This implies to me that matching is done on the NodeAlias along with the given ID field, which should be fine in theory but isn't specifically mentioned anywhere in the tool as best I can tell.
I've had instances where reimporting items will change/strip their NodeAliasPath. I've gotten around this by specifically setting the NodeAliasPath (which only shows after selecting "Show Advanced Columns"), but like NodeAlias before it, I'd think the tool should be smart enough to keep the existing path for updated items when one isn't specifically given.
it seems very odd that in order to match on ID for previous items you have to provide the name of the new column instead of the old one. My example: the client was using a field named 'id', and the new one is 'OriginalID', to clearly differentiate it from the Kentico-derived ID fields. To match the items I have to use 'OriginalID' rather than 'id' (see the sketch after this list).
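For illustration, one way to make the names line up is to alias the legacy column in the import query itself. The table and the other column names here are made up:

```sql
-- Expose the legacy 'id' under the new field name so the toolkit
-- matches previously imported items on OriginalID.
SELECT a.id AS OriginalID,
       a.Title,
       a.ArticleText
FROM   LegacyArticles a
```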
A couple of notes/niceties or potential updates along with the above:
it would be nice if there were some way to select whether the page should be published or not through a single query. Currently, having the "Automatically publish pages under workflow" toggle checked seems to always publish the items. I have an instance where the client has old documents in the provided DB dump that they don't want visible on the site but want preserved in the DB in case they change their mind later. Currently I have to perform 2 imports, one for the unpublished and a second for the published items, to accommodate this, which is quite cumbersome
I'll likely edit/add to this as I get responses. This isn't really a specific problem (I managed a workaround to the NodeAliasPath stripping problem, which initially inspired this post) but more me asking whether these are bugs, whether I'm not using the software as intended, etc.
You've stated all the problems you're having/experiencing and possible reasons why they are happening, but you didn't ask a particular question. If you suspect they are bugs, I'd go directly to Kentico Support and report the issues there, since these behaviors have been part of the KIT for as long as I have worked with it.

use one visualization with multiple filters

Let's say I have a visualization that is displaying production line output over time. There are 6 production lines and I only want to display one production line on the visualization at a time. I can add a Production Line filter to accommodate this.
I want to create a dashboard view for each production line. So I want to create 6 dashboard views, each with the same visualization filtered to a different value. However, I do not know how to do this without creating a copy of the visualization and hard-coding the Production Line filter for each.
Reasons for doing this:
I want to publish the specific dashboards and be able to embed the view into a SharePoint site (one that is Production Line specific) without requiring the user to filter the view each time.
If I make a change to the visualization, I want it to carry over to all dashboard views for all Production Lines; I do not want to make the same change to 6 visualizations.
Is this possible?
You can't do it with a dashboard. Try doing it with stories though. Create one story point for each production line. You can drop a single copy of your viz onto multiple story points, each with filters independent of each other, and any changes to the worksheet will be reflected in all six story points.
Yes, it is possible. You could use a story as Sam M says, with 6 views of your dashboard, each with a different filter setting.
Another choice that works in the embedded-viz use case is to create a single dashboard and apply a filter to all the worksheets on that dashboard. When using Tableau Server, you can supply the filter value as a query parameter in the URL (see the documentation).
After you test it out and are happy with the result, you can remove the filter control from the dashboard so it is no longer visible. You can still control the filter setting via the URL (or via the JavaScript API).
This allows you to adjust the viz that users see in your embedded view without making them manually select the filter setting, and also without creating six nearly identical dashboards.
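A sketch of the embedded case with the JavaScript API (v2), where the server URL, sheet path, and field/value are all placeholders:

```typescript
// One dashboard, six embeds: pass a different filter value per embed.
// In Tableau's JS API v2, extra options entries act as field/value filters.
declare const tableau: any; // loaded from the Tableau Server's /javascripts/api

const containerDiv = document.getElementById("vizContainer");
const url = "https://my-server/views/ProductionWorkbook/OutputDashboard";

const viz = new tableau.Viz(containerDiv, url, {
    hideTabs: true,
    "Production Line": "Line 3"  // the filter this particular embed is locked to
});
```

The pure-URL variant is the same idea: append something like ?Production%20Line=Line%203 to the view URL used in each embed.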

is it possible to show the contact number of the currently logged-in user when an artifact is created in Tuleap

We are using Tuleap 7.0 for project management. We have a new requirement to show the contact number of the currently logged-in user when an artifact is submitted, either in a separate field or in the same field (submitted by). Is it possible to show this, and what kind of code change is needed to retrieve the logged-in user's information when an artifact is created? Any guidance would be appreciated.
That's not easy. There are two main steps: one surrounding the user; the other surrounding the artifact.
First you would need to edit PFUser.class.php to add the property to the user object. You would also probably need to create a forgeupgrade script (database upgrade) so that you could save and modify the property. Obviously the UI and scripts that manage the CRUD of a user would also need changing.
Next, you would need to create a new type of class that extends Tracker_FormElement_Field, make that type available in the Tracker Field administration, and set its value to the user's contact number.
Unfortunately, any of these modifications, if not done carefully, can have serious side effects. If you want to write your code and have it reviewed by the community, you can go through gerrit.tuleap.net and read the developer guide.

is it possible to get an XPages build number?

I do all development in a single application. When a new version is ready, I create a template and give it a version number. This way I can store a history of all previous versions.
The development templates are used to push the new design to many applications via replace design.
Creating manual version numbers or template names is fine, but I am looking for a more automatic way of finding out which build the different applications inherit from.
When I visit the different applications, I would like to be able to see which build number each application inherits from. Is this possible?
A simple build timestamp could do, but is there a built-in build number that can be used and displayed on the XPage?
e.g. Build 2012092712345
Update:
Thank you for all your answers; many good suggestions, but it looks like they all require manual work.
The best solution would be a way to read (from SSJS) a timestamp from any file within the NSF that is always updated during a build. Is this possible?
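One hedged sketch of that idea in SSJS, assuming XPage design notes are picked up by a NotesNoteCollection via its MiscFormatElements selector (worth verifying against your Designer version):

```javascript
// SSJS sketch: find the most recently modified design note and use its
// last-modified date as a stand-in build stamp.
var nc:NotesNoteCollection = database.createNoteCollection(false);
nc.setSelectMiscFormatElements(true); // assumption: XPages land in this bucket
nc.buildCollection();

var latest:java.util.Date = null;
var id = nc.getFirstNoteID();
while (id != "") {
    var doc:NotesDocument = database.getDocumentByID(id);
    var mod = doc.getLastModified().toJavaDate();
    if (latest == null || mod.after(latest)) {
        latest = mod;
    }
    id = nc.getNextNoteID(id);
}
return latest; // display this on an XPage as the "build" timestamp
```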
In classic Notes, there was a method to add a shared field with a special name to the application. I cannot remember the details, but I have them somewhere on disk.
Then you can see the build number in the Design tab of the application properties. And you can of course display the value in your application as well.
But you have to fill the item manually on each build. Or use Teamstudio Build Manager; this tool adds the value automatically.
And I also guess that you can write some code that changes the value whenever you create a new build.
Another option would be to use source control with a versioning system like CVS/SVN; this has been possible since 8.5.3.
I think I know what you mean: you're pushing out a design and want to check through code which version each database has. I usually do this with a Build form. In this form I have computed fields with all the data I want to retrieve. Then I open the database with an agent, create a document, set the Form field to "BuildForm", and do a computeWithForm.
Now I can see all the information about this database.
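A short SSJS-style sketch of that pattern, where "BuildForm" and the field name are whatever you set up in your own Build form:

```javascript
// Create a throwaway document, let the Build form's computed fields
// evaluate, then read the values back.
var doc:NotesDocument = database.createDocument();
doc.replaceItemValue("Form", "BuildForm");
doc.computeWithForm(false, false); // runs the computed-field formulas
var buildInfo = doc.getItemValueString("BuildNumber"); // illustrative field name
```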
I once wrote a rudimentary build system for "classic" Notes and had the last part of the build pipeline create a form named _BUILDID_, putting the build ID in the $Comment field.
The main reason to create a form instead of a shared field was that I could dynamically fetch the form using NotesDatabase.Forms and open up the desired field.
I sure hope there are simpler solutions nowadays... :-)