IBM Unica Campaign External CSV File Import - ibm-cloud

I would like to import an external CSV file into the Mail List box.
Is there any way to import an external CSV file into the Unica Mail List box? Thank you.

In the edit mode of your flowchart, select the Admin drop-down and choose Tables.
Map your .csv data as a base record table from your data source.
Make sure that the audience level is the same as your Mail List's so that your data will show as an option.
Then you should be able to run your Mail List process using your data.

Related

Azure Data Factory DataFlow Source CSV File Header Keep Changing

I am trying to load a CSV file from the source blob storage with the "first row as header" option selected, but across multiple debug trigger runs the header keeps changing, so I am unable to insert the data into the target SQL DB.
Kindly suggest how to handle this scenario. I expect either to configure a static header at the source, or to rename the existing columns on the ADF side.
Thanks
In the Source settings, "Allow schema drift" needs to be ticked.
"Allow schema drift" should be turned on in the sink as well.
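In data flow script terms, the two settings look roughly like this (a sketch of the relevant properties only, not a complete flow; the stream names `source1` and `sink1` are placeholders):

```
source(allowSchemaDrift: true,
    validateSchema: false) ~> source1
source1 sink(allowSchemaDrift: true,
    validateSchema: false) ~> sink1
```

With schema drift allowed on both ends, columns that appear, disappear, or change order between runs are passed through instead of failing against a fixed projection.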

Data Flow Partition by Column Value Not Writing Unique Column Values to Each Folder

I am reading a SQL DB as the source and it outputs the following table.
My intention is to use a data flow to save each unique type into a data lake folder partition, ideally named after that type.
I somehow managed to create the individual folders, but my data flow saves the entire table, with all types, into each of the folders.
My data flow: Source → Window → Sink
Any ideas?
I created the same CSV source and it works well; please refer to my example.
Window settings:
Sink settings: choose the file name option like this.
Note: please don't set Optimize again on the sink side.
The output folder schema we get:
For now, Data Factory Data Flow doesn't support customizing the output file name.
HTH.
You can also try "Name folder as column data" using the OpType column instead of using partitioning. This is a property in the Sink settings.

Re - Purpose Stored Procedure

OK, so I want to import sales orders from a flat file or Excel via an FTP connection. I have a copy of the sales import SP that our ERP uses, but I want to repurpose it to read a file directly from a folder location rather than importing it through the ERP UI. I'm not sure how to go about doing this.
I have read about flat files and bulk inserts, but I don't think this is the correct route. I have the existing SP available.
I would like to find a way to repurpose this SP; I have not included it here as it's a bit large.
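One common pattern, pending a rework of the SP itself, is a small script that reads the dropped file from the folder and invokes the existing import SP once per row. The sketch below only builds the EXEC statements; the procedure name `dbo.usp_ImportSalesOrder` and the column names are placeholders, not your actual SP. In practice you would execute the statements through pyodbc or sqlcmd.

```python
import csv
import io

def build_exec_statements(csv_text, sp_name="dbo.usp_ImportSalesOrder"):
    """Build one EXEC call per CSV row, mapping each CSV header
    to a same-named SP parameter. The SP name and parameters are
    placeholders -- substitute your real import SP's signature."""
    rows = csv.DictReader(io.StringIO(csv_text))
    statements = []
    for row in rows:
        params = ", ".join(f"@{col} = '{val}'" for col, val in row.items())
        statements.append(f"EXEC {sp_name} {params};")
    return statements

sample = "OrderNo,Customer,Qty\n1001,ACME,5\n1002,Widgets,3\n"
for stmt in build_exec_statements(sample):
    print(stmt)
```

Note that real input would need proper quoting/escaping (or, better, parameterized calls via pyodbc) rather than string interpolation.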

Is there a REST api of DashDB to export the content of a database in json format?

I need to re-create a database from one DashDB Bluemix service into another, and I need to automate this procedure in bash scripts.
The best I can think of is a DashDB REST API that allows me to export the content of the entire database in JSON format (or any other format you can think of), and a corresponding API that allows me to re-import the content into a different database on the same service or on a different service, possibly in a different Bluemix space. Thanks.
I assume you want to do a one-time move and this is not about continuous replication. In that case, simply sign up on http://datascience.ibm.com, navigate to DataWorks, select "Load Data" from the navigation panel (open it by clicking top left), and then select Cloud Database as the source type.
DataWorks load data from dashDB to dashDB
If, however, you would still prefer to write your own app or script that does the data movement, and you want a REST API to export JSON data, then I recommend writing a simple R script that reads the data from a table (using ibmdbR) and writes it to stdout, deploying the script into dashDB (POST /home), and running the R script from your app/script by calling the /rscript endpoint: https://developer.ibm.com/clouddataservices/wp-content/themes/projectnext-clouddata/dashDB/#/
For Db2 on Cloud and Db2 Warehouse on Cloud, there is a REST API available that allows you to export data from a table in CSV format (up to 100,000 rows) and then load the data back. It requires a few requests, such as:
POST /auth/tokens
GET /sql_query_export
POST /home_content/{path}
POST /load_jobs
GET /load_jobs/{id}
I've implemented a client npm module for this API, db2-rest-client, and you can export a statement result to a JSON file as follows:
export DB_USERID='<USERID>';export DB_PASSWORD='<PASSWORD>';export DB_URI='https://<SOURCE_HOSTNAME>/dbapi/v3'
db2-rest-client query --query="SELECT * FROM SRC_SCHEMA.SRC_TABLE" > test.json
Then you can transform that data into a .csv file and use the load job:
export DB_USERID='<USERID>';export DB_PASSWORD='<PASSWORD>';export DB_URI='https://<TARGET_HOSTNAME>/dbapi/v3'
db2-rest-client load --file=transformed.csv --table='DEST_TABLE' --schema='DEST_SCHEMA'
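The "transform that data into a .csv file" step can be scripted too. A minimal sketch, assuming the query output is a JSON array of row objects (check the actual shape your db2-rest-client version emits):

```python
import csv
import io
import json

def json_rows_to_csv(json_text):
    """Flatten a JSON array of row objects into CSV text
    suitable for a load job. Assumes every row object has
    the same keys, in a stable order."""
    rows = json.loads(json_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

sample = '[{"ID": 1, "NAME": "a"}, {"ID": 2, "NAME": "b"}]'
print(json_rows_to_csv(sample))
```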

How to import users in CRM 2011 with source GUID

We have three organization tenants, Dev, Test, and Live, all hosted on premise (CRM 2011 [5.0.9690.4376] [DB 5.0.9690.4376]).
Because of the way dialogs use GUIDs to reference records in lookups, we aim to keep the GUIDs of static records the same across all three tenants.
While all other entities are working fine, I am failing to import users while also maintaining their GUIDs. I am using Export/Import to get the data from the master tenant (Dev) into the Test and Live tenants. It is very similar to what the Configuration Migration tool does in CRM 2013.
The issue I am facing is that for all other entities I can see the GUID field and hence map it during the import wizard, but no such field shows up for the SystemUser entity when running the import wizard. For example, with Account, I export an account, amend the CSV file, and import it into the target tenant. When I do this, I map the AccountId (in the target) to the Account of the source, and as a result this account's AccountId will be the same in both source and target.
At this point I am about to give up, but that will mean all dialogs that use a user lookup will fail.
Thank you for your help,
Try the following steps. I would strongly recommend trying this on an old, out-of-use tenant before trying it on a live system. I am not sure whether this is supported by MS, but it works for me. (Also, you will have to manually assign BUs and roles after the import.)
1. Create an Advanced Find. Include all required fields for the SystemUser record. Add criteria that select the list of users you would like to move across.
2. Export.
3. Save the file as CSV (this will show the first few hidden columns in Excel).
4. Rename the primary-key field (in this case, User) and remove all other "(Do Not Modify)" fields.
5. Import the file and map this User column (with the GUID) to the User field in CRM.
6. Import the file and check the GUIDs in both tenants.
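The column cleanup in step 4 can be scripted instead of done by hand in Excel. A sketch, assuming the typical "(Do Not Modify) …" header names that an Advanced Find export produces; adjust the names to match your actual export:

```python
import csv
import io

def prepare_user_csv(csv_text, pk_header="(Do Not Modify) User", new_name="User"):
    """Keep the hidden primary-key GUID column (renamed), drop the
    other '(Do Not Modify)' helper columns, and pass data rows through."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header = rows[0]
    keep = [i for i, h in enumerate(header)
            if h == pk_header or not h.startswith("(Do Not Modify)")]
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow([new_name if header[i] == pk_header else header[i]
                     for i in keep])
    for row in rows[1:]:
        writer.writerow([row[i] for i in keep])
    return out.getvalue()

sample = ('"(Do Not Modify) User","(Do Not Modify) Row Checksum",'
          '"Full Name","Domain Logon Name"\n'
          '"11111111-aaaa-bbbb-cccc-000000000000","x","James Wood","mydomain\\james.wood"\n')
print(prepare_user_csv(sample))
```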
Good luck.
My only suggestion is that you could try writing a small console application that connects to both your source and destination organisations.
Using that, you can duplicate the user records from the source to the destination, preserving the IDs in the process.
I can't say 100% that it'll work, but I can't immediately think of a reason why it wouldn't. This assumes that none of the users you're copying already exist in your target environments.
I prefer to resolve these issues by creating custom workflow activities. For example, you could create a custom workflow activity that returns a user record for an input domain name given as a string.
This means your dialogs contain only shared configuration values, e.g. mydomain\james.wood, which are used to dynamically find the record you need. Your dialog is then linked to a specific record, but without having to encode the source GUID.