Importing data into Cloud SQL instance error Invalid Credentials - google-cloud-sql

I'm trying to import my database from a bucket, but I'm getting an error about invalid credentials. This is a Second Generation Cloud SQL instance. After reviewing the Operations tab of my instance, it looks like the import is still running, and I'm not sure whether the process is actually working. Does anybody have an idea how to solve this problem?
Regards
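One way to check whether the import operation is actually still running, rather than stuck, is to list the instance's recent operations through the Cloud SQL Admin API. A rough sketch in Python, assuming the google-api-python-client and google-auth packages; the instance name is a placeholder:

    # Sketch: list recent operations on the instance to see whether the import
    # is still RUNNING or has finished with an error.
    # The instance name below is a placeholder.
    import google.auth
    from googleapiclient import discovery

    credentials, project = google.auth.default()
    sqladmin = discovery.build("sqladmin", "v1beta4", credentials=credentials)

    resp = sqladmin.operations().list(project=project, instance="my-instance").execute()
    for op in resp.get("items", []):
        print(op.get("operationType"), op.get("status"), op.get("error", ""))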

Related

Cannot create a batch pipeline to get data from ZohoCRM with HTTP plugin 1.2.1 to BigQuery. Returns "Spark Program 'phase-1' failed"

This is my first post here. I'm new to Data Fusion and have little to no coding skills.
I want to get data from Zoho CRM into BigQuery, with each Zoho CRM module (e.g. Accounts, Contacts...) becoming a separate table in BigQuery.
To connect to Zoho CRM I obtained a code, token, refresh token and everything else needed, as described here: https://www.zoho.com/crm/developer/docs/api/v2/get-records.html. Then I ran a successful Get Records request via Postman (as described there), and it returned the records from the Zoho CRM Accounts module as JSON.
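For reference, that request boils down to something like this Python sketch of the same Get Records call; the access token is a placeholder:

    # Rough equivalent of the Postman "Get Records" call against the Accounts module.
    # The access token below is a placeholder obtained via the OAuth flow described
    # in the Zoho docs linked above.
    import requests

    ACCESS_TOKEN = "1000.xxxx.yyyy"  # placeholder
    resp = requests.get(
        "https://www.zohoapis.com/crm/v2/Accounts",
        headers={"Authorization": f"Zoho-oauthtoken {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    for record in resp.json().get("data", []):
        print(record.get("id"), record.get("Account_Name"))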
I thought it would all be fine and set the parameters in Data Fusion (DataFusion_settings_1 and DataFusion_settings_2); it validated fine. Then I previewed and ran the pipeline without deploying it. It failed with the following info from the logs (logs_screenshot). I tried manually entering a few fields in the schema when the format was JSON, and I tried changing the format to CSV; neither worked. I also tried switching Verify HTTPS Trust Certificates on and off. It did not help.
I'd be really thankful for some help. Thanks.
Update, 2020-12-03
I got in touch with a Google Cloud account manager, who took my question to their engineers, and here is the info:
The HTTP plugin can be used to "fetch Atom or RSS feeds regularly, or to fetch the status of an external system"; it does not seem to be designed for APIs.
At the moment, a more suitable tool for data collected via APIs is Dataflow: https://cloud.google.com/dataflow
"Google Cloud Dataflow is used as the primary ETL mechanism, extracting the data from the API Endpoints specified by the customer, which is then transformed into the required format and pushed into BigQuery, Cloud Storage and Pub/Sub."
https://www.onixnet.com/insights/gcp-101-an-introduction-to-google-cloud-platform
So in the coming weeks I'll be looking at Dataflow.
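For anyone heading the same way, that approach roughly amounts to a small Apache Beam pipeline that calls the API and writes the rows to BigQuery. A bare-bones Python sketch, not a working Zoho integration; the endpoint, token, table name and schema are all placeholders:

    # Minimal Apache Beam sketch of the "API -> BigQuery" pattern described above.
    # Endpoint, token, table and schema are placeholders; error handling, paging
    # and OAuth refresh are left out.
    import apache_beam as beam
    import requests

    def fetch_accounts(_):
        resp = requests.get(
            "https://www.zohoapis.com/crm/v2/Accounts",  # placeholder endpoint
            headers={"Authorization": "Zoho-oauthtoken <access-token>"},
        )
        for rec in resp.json().get("data", []):
            yield {"id": rec.get("id"), "account_name": rec.get("Account_Name")}

    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "Start" >> beam.Create([None])
            | "Fetch" >> beam.FlatMap(fetch_accounts)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:crm.accounts",  # placeholder table
                schema="id:STRING,account_name:STRING",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )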
Can you please attach the complete logs of the preview run? Make sure to redact any PII. Also, what version of CDF are you using? Is the CDF instance private or public?
Thanks and Regards,
Sagar
Did you end up using Dataflow?
I am also experiencing the same issue with the HTTP plugin, but my temporary workaround was to use Cloud Scheduler to periodically trigger a Cloud Function that fetches my data from the API and exports it as JSON to GCS, which can then be accessed by Data Fusion (see the sketch below).
My solution is of course non-ideal, so I am still looking for a way to use the Data Fusion HTTP plugin. I was able to make it work for sample data from public API endpoints, but for a reason still unknown to me I can't get it to work with my actual API.
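A rough outline of that workaround as an HTTP-triggered Cloud Function in Python; the API endpoint, token, bucket and object names are placeholders:

    # Sketch of the Cloud Scheduler -> Cloud Function -> GCS workaround described
    # above: fetch from the API and drop the response as JSON into a bucket that
    # Data Fusion can then read.
    import json
    import requests
    from google.cloud import storage

    def export_to_gcs(request):
        resp = requests.get(
            "https://example.com/api/records",  # placeholder API endpoint
            headers={"Authorization": "Bearer <token>"},
        )
        resp.raise_for_status()

        bucket = storage.Client().bucket("my-staging-bucket")  # placeholder bucket
        blob = bucket.blob("exports/records.json")
        blob.upload_from_string(
            json.dumps(resp.json()), content_type="application/json"
        )
        return "ok"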

How to retrieve streaming data from a data service and use it in a Pentaho CDE dashboard?

I'm trying to display incoming streaming data in a Pentaho dashboard. The incoming data are simple strings, which I would just like to display on the dashboard for now.
I created a Kettle transformation in which I bound a data service to the last step (MQTT-Producer).
Within Spoon, I tested the service and it seems to work fine.
After uploading the Kettle file, the service showed up in the service list (http://localhost:9090/pentaho/kettle/listServices).
Working with the dashboard editor, I use 'streaming over dataservices' from the 'DATASERVICES Queries' as my data source.
At this point I didn't seem to have any success and was just trying out different panel options and data service properties.
I was following those tutorials:
https://help.pentaho.com/Documentation/8.2/Products/Data_Integration/Data_Services
https://help.pentaho.com/Documentation/8.2/Products/CTools/Create_Streaming_Service_Dashboard
What is it that I'm doing wrong?
Any help is appreciated.
cheers
Update:
I changed the incoming streaming data to be two doubles.
After some more playing around, I did manage to connect to the data service using an external tool, and I did see the expected values in the database. My dashboard, however, still shows this error message:
Error processing component (ccclinechart)
The same kind of error occurs when I try to view the sample real-time dashboard; it can't process the chartComponent. Maybe I need to reconfigure something else?
Found the mistake.
Something went wrong with the ports. After switching back to the default (8080), it worked just fine.
There might be a way to adjust your port settings to the problem instead, but the easiest way to deal with this sort of thing is to switch back to the default settings.
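A quick way to double-check which port the server is actually answering on is to hit the service list endpoint mentioned above. A small Python sketch, assuming the Pentaho server accepts basic auth; host, port and credentials are placeholders for your own login:

    # Quick check that the data service list is reachable on the expected port.
    # Host, port and credentials are placeholders for your Pentaho server login.
    import requests

    resp = requests.get(
        "http://localhost:8080/pentaho/kettle/listServices",
        auth=("admin", "password"),
    )
    print(resp.status_code)
    print(resp.text)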

AngularDart + MongoDB

I just tried AngularDart. I want to use MongoDB as the database, and I'm using the package mongo_dart. This is my code:
main.dart
As you can see, I want to retrieve data from the Mongo database "contact-db", collection "contact-collection", and then display it on the console, but I get this error:
Error in the Chrome console:
"dart_sdk.js:4835 Uncaught core.UnsupportedError.new {Symbol(UnsupportedError.message): "Socket constructor", Symbol(_error): Error
at Object.dart.throw (http://localhost:8080/packages/$sdk/dev_compiler/amd/dart_sdk.js:483…"
What I want to ask is:
Does this error come from the package, and if so, is there a solution for it?
Does this error come from my code, and if so, what would the fix be?
Is there another way I can use AngularDart with MongoDB as the database?
Thank you in advance.
First of all, MongoDB can only be accessed from a server-side application. That said, you would need to create two applications: one for the client side, written in AngularDart, and one for the server side, using shelf perhaps.
Right now the only databases that allow connecting directly from the client side are Firebase and Firestore.

Azure Batch support for Data Lake Store Linked Service

I am using a Data Factory pipeline with a custom activity (configured to run on Azure Batch) that has a Data Lake Store input dataset and output dataset. The Data Lake Store linked service uses service-to-service auth (service principal) and works fine in a Copy activity created through the Copy Wizard. But when it is used with a custom activity that tries to check whether a file is present in the data lake, the activity fails with the error "Authorization is required". When I use an Azure Blob Store as the input and output datasets, the same custom activity works fine.
It seems like an issue with Azure Batch (the compute node) not being able to authorize against Data Lake Store. Please help if you have solved the above-mentioned problem.
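For context, the kind of existence check the custom activity performs can be sketched like this with the azure-datalake-store Python SDK (the real activity code may well be .NET running on the Batch node); the tenant/client IDs, secret, store name and path are all placeholders:

    # Illustrative only: a service-principal-authenticated "does this file exist"
    # check against Data Lake Store, equivalent to what the custom activity tries
    # to do. Tenant/client IDs, secret, store name and path are placeholders.
    from azure.datalake.store import core, lib

    token = lib.auth(
        tenant_id="<tenant-id>",
        client_id="<app-id>",
        client_secret="<app-secret>",
        resource="https://datalake.azure.net/",
    )
    adls = core.AzureDLFileSystem(token, store_name="mydatalakestore")
    print(adls.exists("/input/2017/04/data.csv"))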
I had this exact same issue about 3 weeks ago. I feel your pain!
This is a Microsoft bug!
After much trial and error and redeployments I raised a support ticket with Microsoft who confirmed that service principal authentication for data lake store currently only works with copy activities. Not with custom activities.
This is the official response I got on Monday 10th April.
The issue happen because of a bug that custom activity’s connector schema doesn’t match the latest published connector schema. Actually, we notice the issue on custom activity and have plan to fix & deploy to prod in next 2 weeks.
Be aware that if you change your linked service back to using a session token etc., you'll also need to redeploy the pipelines that contain the custom activities. Otherwise you'll get another error, something like the following...
Access is forbidden, please check credentials and try again. Code: 'AuthenticationFailed' Message: 'Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Hope this helps.

Mirth Connect error: Database Write Successful but no data found in SQL Server database

I have a channel in Mirth Connect which reads HL7 messages, extracts the relevant information, and writes it to a SQL Server database. It is showing some unusual behaviour: in the Mirth Connect message log it shows "SUCCESS: Database write success", but no data is found in the database. It works fine and writes data most of the time, but sometimes it does this. Normally, if there is an error writing data (executing the JavaScript), it shows the error details in Mirth Connect and I understand that; but how can it show "Write success" and yet there is no data in the database?
Can anyone shed some light on this? Anyone experienced this?
Thanks.
It happened to me.
The solution lies with the user that accesses the database from Mirth. Grant the sysadmin server role and public to that user:
Log in to the database (SSMS).
Go to Security -> Logins -> select your user -> right-click -> Properties -> Server Roles. Set it as both public and sysadmin.
Click OK, and then restart the MSSQLSERVER service from SQL Server Configuration Manager. This is required; otherwise the changes will not take effect.
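If you'd rather script it than click through SSMS, the GUI steps above boil down to a single T-SQL statement. A rough pyodbc sketch; the server, login and password are placeholders, and logins are members of public by default:

    # Script equivalent of the SSMS steps above: add the Mirth login to the
    # sysadmin server role. Server name, login and credentials are placeholders;
    # the service restart mentioned above still applies afterwards.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;UID=sa;PWD=<password>",
        autocommit=True,
    )
    conn.cursor().execute("ALTER SERVER ROLE [sysadmin] ADD MEMBER [mirth_user];")
    conn.close()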