I'm trying to load a CSV file into a table and receiving the following error:
The steps I am taking are: launch the dashDB instance that I have, click Load, then Load from Desktop, and browse for the CSV. I have tried both creating a new table from the CSV and adding to an existing table, and both result in the error. I have also created a dummy CSV with just one record in it, and it fails with the same error. When creating the table from the load, the table gets created but no data is loaded.
The error you are getting is a known bug in the integration between the Bluemix dashboard and the dashDB one.
The team is working on this issue, but a fully working workaround is available.
In order to complete the data load from your file, you should:
access your application section on the Bluemix dashboard
click on the 'Environment Variables' section and select the 'VCAP_SERVICES' button
retrieve the 'https_url' of the dashDB service, and also note the related username and password values
paste the https_url value into your browser; it will open the dashDB dashboard directly, and you can log in using the username and password found in the VCAP_SERVICES section
click on the 'Load' section in the left menu of the dashDB dashboard
from this section you can upload your file without any problem.
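If you prefer to pull those values out programmatically instead of copying them from the dashboard, a minimal Python sketch along these lines should work when run inside the bound application (the "dashDB" service key name is an assumption; check how the service is labeled in your own VCAP_SERVICES):

```python
import json
import os

# VCAP_SERVICES is a JSON document injected into the app environment.
# The dashDB entry is assumed to live under a "dashDB" key.
vcap = json.loads(os.environ["VCAP_SERVICES"])
credentials = vcap["dashDB"][0]["credentials"]

https_url = credentials["https_url"]   # direct URL of the dashDB dashboard
username = credentials["username"]     # login user for the dashboard
password = credentials["password"]     # login password

print("Open this URL in your browser:", https_url)
print("Log in with:", username, password)
```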
I have an issue: is it possible for Logic Apps to trigger a Databricks notebook in its pipeline?
I want to automate processing of files downloaded from SharePoint and trigger a Databricks notebook.
I have started building the custom Logic App connector following this guide: https://medium.com/#poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27
Unfortunately, I am struggling to create the JSON OpenAPI file (image below).
1. Created a Databricks resource in Azure.
2. Created a Logic Apps Custom Connector resource with the name Databricksconnector.
3. Once created, clicked on the edit option as shown below.
4. Followed the document https://medium.com/#poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27 by Pooja Shinde and performed the steps below.
5. Connector creation: kept everything at the default values in this section.
6. In General Information, selected the scheme as HTTPS and, for the host, gave the Databricks URL as https://databricksinstancename.azuredatabricks.net.
7. In Security, selected the authentication type as Basic, as shown below.
8. In the Definition tab, added a new action by clicking on New action.
9. In the General tab, provided the Summary, Description and Operation ID, and set Visibility to none.
10. In the Request tab, clicked on Import from sample.
11. I am retrieving job details by passing job_id, and the sample details are:
    Request type: GET
    URL: api/2.0/jobs/get?job_id=*******
    Headers: Content-Type: application/json
12. Clicked on Import and the request is updated as below.
13. Next, clicked on Update connector and the connector is saved.
14. The connector saved successfully as shown below.
15. We can view all the provided details in JSON format by clicking on the swagger editor (a sketch of what that JSON roughly looks like follows after the reference link below).
16. Now created a logic app with a Recurrence trigger.
17. Able to see the created custom connector as shown below.
18. Created a connection for Azure Databricks as per the document https://medium.com/#poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27
19. Added the parameter "job_id" and gave its value as shown below.
20. The logic app ran successfully and I am able to get the job details.
Reference: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/2.0/jobs?source=recommendations#--runs-list
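As mentioned in step 15, the swagger editor shows the connector definition as JSON. The sketch below is only an illustration of that general shape, written as a Python dict (the host, title and operation ID are placeholders, not values from the original screenshots), followed by a small requests call that exercises the same Jobs API endpoint directly with a personal access token:

```python
import json
import requests

# Rough shape of the OpenAPI 2.0 (swagger) definition the custom connector
# generates for the single "get job" action. All names and hosts are placeholders.
swagger = {
    "swagger": "2.0",
    "info": {"title": "Databricksconnector", "version": "1.0"},
    "host": "databricksinstancename.azuredatabricks.net",
    "basePath": "/",
    "schemes": ["https"],
    "paths": {
        "/api/2.0/jobs/get": {
            "get": {
                "operationId": "GetJobDetails",
                "summary": "Get job details",
                "description": "Retrieve a Databricks job by its job_id",
                "parameters": [
                    {"name": "job_id", "in": "query", "required": True, "type": "integer"}
                ],
                "responses": {"200": {"description": "Job details"}},
            }
        }
    },
}
print(json.dumps(swagger, indent=2))

# Sanity check of the same endpoint outside Logic Apps, using a personal
# access token (replace the workspace URL, token and job id with your own).
resp = requests.get(
    "https://databricksinstancename.azuredatabricks.net/api/2.0/jobs/get",
    params={"job_id": 123},
    headers={"Authorization": "Bearer <personal-access-token>"},
)
print(resp.status_code, resp.json())
```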
I am a new user following this tutorial provided by IBM.
I am up to this step:
For Cloudant Instance, select Input your own credentials and fill in the following fields with the credentials information captured for your cloudant service: Username, Password, Host and Database = guestbook and click Add and then Save.
After following the instruction to 'Add', I am returned to the sequence view (with the list of actions representing the sequence I'm working on).
Expected: The newly created public action with binding should appear in the list.
Instead: The newly created public action is not in the list. There is no evidence of it having been created at all. There is no option to 'save'.
Am I doing something wrong? This seems like an enormous bug.
Attempted solutions (unsuccessful):
Log out and back in.
Create new Cloudant service credentials.
Enter service credentials manually vs via dropdown.
Create action in a named package rather than default package.
Create new Cloudant service credentials, selecting a specific service ID.
PS:
Attempted to create a support ticket but needed to upgrade the account by adding a credit card. Filled in the card information. Card rejected: "Error: Could not place order. Unable to verify the credit card. Declined due to Risk management". I use this card successfully all the time.
In the Actions UI, I selected a sequence, added an action to the sequence, 'reset' the sequence to discard changes, began to add yet another action, cancelled that new action, returned to the sequence view, and the previously created action that I had discarded was there. I.e., it seems like some backend/database propagation issue on IBM's end?
The steps have been updated in https://cloud.ibm.com/docs/tutorials?topic=solution-tutorials-serverless-api-webapp#sequence-of-actions-to-save-the-guestbook-entry.
To create the new Cloudant binding:
Set Name to binding-for-guestbook.
Set Instance to Input your own credentials.
Set Username, Password, Host and IAM API Key from the values found in the Cloudant credentials for-guestbook created earlier.
Set Database to guestbook.
Set whiskoverwriteLabel to true.
Save
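To sanity-check the credentials you are entering in the binding, a quick request against the guestbook database should return its metadata. This is only a sketch using legacy username/password authentication; the host and credentials below are placeholders for the values in your own service credentials:

```python
import requests

host = "<your-cloudant-host>"   # "host" value from the for-guestbook credentials
username = "<username>"
password = "<password>"

# GET /{db} returns database metadata (doc_count, etc.) when the
# credentials and database name are correct.
resp = requests.get(f"https://{host}/guestbook", auth=(username, password))
print(resp.status_code)
print(resp.json())
```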
I am trying to create a few visualizations from BigQuery data on Google Data Studio.
Following are the simulation steps:
Log in to the BigQuery console
After running a query in the new BigQuery console, click the option "EXPLORE IN DATA STUDIO".
Create any chart or table
Click "SAVE" on the top right corner
This will generate an error:
User Configuration Error: This data source was improperly configured. Error ID: fb2ea24f
Page with default table after clicking "EXPLORE IN DATA STUDIO":
Table disappears after clicking "SAVE":
Error details shown on pop-up after clicking "See Details":
I am new to Azure databases. I am using Cosmos DB (MongoDB API) and have connected to it via the connection string URL. When I store data through the collection, the application does not show any exception; in fact, the logs show a success status, but the data is not visible in Data Explorer. When I query for the complete data, I am able to retrieve it as well.
The only problem is that it is not visible in the MongoDB console, nor in Data Explorer.
data explorer image
To see the data, click on the Documents node of the testTable database in the Data Explorer (records are called Documents in DocumentDB).
Here's an example (screenshot):
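Separately, if you want to confirm from outside the portal that the documents were really written, a small pymongo sketch like the one below can list the collections and count the documents. The connection string, database and collection names are placeholders to replace with the values from your own Cosmos DB account:

```python
from pymongo import MongoClient

# Connection string copied from the Azure portal (Connection String blade).
client = MongoClient(
    "mongodb://<account>:<primary-key>@<account>.documents.azure.com:10255/?ssl=true"
)

db = client["testTable"]                  # database shown in Data Explorer
print(db.list_collection_names())         # collections inside the database

collection = db["<your-collection>"]
print(collection.count_documents({}))     # number of stored documents
print(collection.find_one())              # peek at one document
```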
I'm testing the Backand platform and got this error:
http://localhost:8100/?ionicplatform=ios&error=%7B%22message%22:%22Cannot%20insert%20duplicate%20key%20row%20in%20object%20%27dbo.durados_UserSocial%27%20with%20unique%20index%20%27IX_durados_UserSocial_UserId_Provider%27.%20The%20duplicate%20key%20value%20is%20(387068,%20facebook).%0D%0AThe%20statement%20has%20been%20terminated.%22,%22provider%22:%22facebook%22%7D#/tabs/dashboard
which decodes to: "Cannot insert duplicate key row in object 'dbo.durados_UserSocial' with unique index 'IX_durados_UserSocial_UserId_Provider'. The duplicate key value is (387068, facebook). The statement has been terminated."
This is a project started from http://market.ionic.io/starters/ionic-backand-with-social according to the Getting Started mobile guide.
I couldn't find any match on Google.
I think it could be because I'm logging in with the same account I used to create the app in Backand; however, shouldn't it be possible to just log in instead of trying to register (or whatever it's trying to do)?
This worked with the project's default keys for Facebook.