I have a Tableau Prep flow which uses a Python script. When I run the flow from Tableau Prep, it works.
However, I am using the command line alongside Task Scheduler to automate the flow. When running the automated flow, I encounter the error below:
I have already configured the connection to TabPy in Tableau Prep itself. Previously this used to work, but I am now getting this error. Should the TabPy configuration be added to the JSON file used for credentials? If so, in what format should it be?
Got it to work. I just had to include the following in the JSON file:
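The answer's actual JSON was not preserved here. As a hedged sketch only: Tableau's documentation for running flows with script steps from the command line describes an "extensions" array in the credentials file, so a TabPy entry might look something like the following (host and port values are placeholders for your own TabPy server):

```json
{
  "extensions": [
    {
      "extensionName": "TABPY",
      "regular": {
        "host": "localhost",
        "port": "9004"
      }
    }
  ]
}
```

Check the exact field names against the Tableau Prep command-line documentation for your version before relying on this.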
I've got a two-part question. I've been asked to have an email sent whenever an error occurs when running a Notebook in Azure Synapse. The Notebook is going to be run in a pipeline, so I asked them why they didn't just have the email sent from the pipeline, since they know how to do it from there. They gave me a long answer which I didn't understand, so I guess it has to be done inside Synapse.
So anyway, do you have any suggestions as to how to do this? I thought of maybe doing a Logic App or Function and having the Logic App or Function called via the PySpark application.
Another way I am trying is to do it inside the PySpark application itself. Anyway, I have the following code:
import smtplib

# Connect to Outlook's SMTP endpoint on the submission port, then upgrade to TLS
smtp = smtplib.SMTP("smtp-mail.outlook.com", port=587)
smtp.ehlo()
smtp.starttls()
So anyway, when it gets to smtp.starttls(), I get an error message
SMTPNotSupportedError: STARTTLS extension not supported by server.
I only get this error when trying to do it in PySpark in Synapse. When I do it in Python in Visual Studio on my other machine and run it there, I don't get the error, and I can send the email. Do you have any suggestions? Thanks.
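A likely explanation (not confirmed by the error alone) is that outbound SMTP traffic is restricted from the managed Spark pool, so the server's STARTTLS capability never reaches the client. The Logic App idea mentioned above sidesteps SMTP entirely: the notebook POSTs to an HTTP-triggered Logic App, and the Logic App sends the email. A minimal sketch, assuming a hypothetical trigger URL and an illustrative payload schema of your choosing:

```python
import json
import urllib.request


def build_error_payload(notebook_name, error_message):
    # Keys are illustrative; match them to your Logic App trigger's request schema.
    return {"notebook": notebook_name, "error": error_message}


def notify_logic_app(logic_app_url, payload):
    # POST the payload to an HTTP-triggered Logic App, which handles sending the email.
    req = urllib.request.Request(
        logic_app_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# In the notebook, wrap the work in try/except and notify on failure:
try:
    raise ValueError("simulated notebook failure")  # stand-in for the real notebook logic
except Exception as exc:
    payload = build_error_payload("my_notebook", str(exc))
    # notify_logic_app("https://<your-logic-app-trigger-url>", payload)
```

This keeps the alerting logic inside the PySpark application, as asked, while avoiding blocked SMTP ports.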
Hi, I've been trying to execute a custom activity in ADF which receives a CSV file from container (A); after further transformation on the dataset, the transformed DataFrame is stored into another CSV file in the same container (A).
I've written the transformation logic in python and have it stored in the same container (A).
The error arises when I execute the pipeline; it returns the error *can't find the specified file*.
Nothing is wrong with the connections. Is anything wrong with the Batch account or pools?
Can anyone tell me where to place the Python script?
Install Azure Batch Explorer and make sure to choose the proper configuration for the virtual machine (dsvm-windows), which ensures Python is already in place on the virtual machine where your code runs.
This video explains the steps
https://youtu.be/_3_eiHX3RKE
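On the original question of where to place the script: in an ADF Custom activity, the script typically sits in a blob container referenced by the activity's resource linked service, and the files under folderPath are downloaded into the Batch node's working directory before the command runs. A hedged sketch of the relevant activity JSON (all names and paths below are placeholders, not values from the question):

```json
{
  "name": "TransformCsv",
  "type": "Custom",
  "typeProperties": {
    "command": "python transform.py",
    "resourceLinkedService": {
      "referenceName": "AzureBlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "folderPath": "container-a/scripts"
  },
  "linkedServiceName": {
    "referenceName": "AzureBatchLS",
    "type": "LinkedServiceReference"
  }
}
```

If the script is not under the folderPath that the resource linked service points to, the Batch node will not find it, which matches the "can't find the specified file" error.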
When I run a flow in Tableau Server, it fails with the following error message:
Unfortunately this error is not helpful in understanding the actual cause of the problem.
Is there a way to see the actual underlying error? Or how am I supposed to debug this?
The flow runs fine in my Tableau Prep.
(EDIT: I previously stated here that I used a different data source to test in Prep, but this is no longer true.)
Arguably that error log does give you a hint about the issue: it lies in the Output step. Since you can run the flow locally in Tableau Prep, this is most likely a permissions error when Tableau Server goes to publish the output.
Can the credentials for your flows be embedded on the server? This affects whether the output will be accessible. Are all flows run using a service account? Make sure that service account has access to the output location.
If these troubleshooting steps don't work, check the server logs. For this you'll need to use the Tableau Server command line to see whether there is a more detailed response: if you have the access, run tsm maintenance ziplogs to zip the log files, then investigate.
I'm trying to automate the deployment of the solution my team is working on through TFS Build server. One of the steps which executes a PowerShell script on the target machine fails with the following error:
Microsoft ODBC Driver 11 for SQL Server : Login failed for user 'sa'..
The PowerShell script I'm trying to execute does in fact connect to multiple databases using the sa credentials. When I try to execute the same script passing it the exact same arguments by hand (i.e: executing the script from the target machine VM itself) it works like a charm. But when it is being executed as part of the build steps it fails with the aforementioned error.
Is there a way to further debug the issue? It would be great if there is a way to output trace statements from the script so I could have some insight on what is actually going on.
Usually all related errors should show up in the TFS build log. To narrow down the issue, try connecting to the TFS build agent with the credentials used by the build service and run the PowerShell script manually.
Executing the script with your own account will not reproduce the issue. Problems like this are usually permission-related: the build service account lacks the required permission. Try adding it to the Administrators group or a SQL Server administrator role and run the build again.
Hi, I am new to Azure and trying to run a script job on my cluster. Yesterday I was able to do a MapReduce streaming job successfully; however, today I am stuck trying to do a Hive job. In PowerShell ISE, when I type the command use-azurehdinsightcluster 'nnn', I get "specified method not supported".
Also, when I try a script file, I get the error "start-azure job not supported"...
I have my mysettings.publishsettings file imported and in place. I have azure.psd1.
I am connected to Azure, etc.
One thing confuses me: for the account name there are three account names: xxxxx#hotmail.com, one which is similar to my storage account name, and one weird pay-as-you-go.
I have tried all of them.
I am totally confused. Can someone please help me?
Please upgrade to the latest Azure PowerShell from http://azure.microsoft.com/en-us/documentation/articles/install-configure-powershell/#Install and try again.