How do I schedule reports in the Pentaho User Console?
Also, how can I email scheduled reports to a particular address from the Pentaho User Console?
You will not be able to add email functionality in the Pentaho User Console; for that you have to use Pentaho Data Integration (PDI). What I suggest is: run the report from PDI, schedule it as a job, and add a Mail step so the job sends the report to the specified email address.
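The job described above would then be run on a timer outside PDI, e.g. from cron via PDI's Kitchen launcher. A minimal dry-run sketch; the install path and job file name are assumptions, and the script only prints the cron line rather than installing it:

```shell
# Build a cron entry that runs a PDI job nightly at 06:00 with Kitchen.
# The job itself (report_email.kjb) is assumed to contain the report
# step and a Mail step; both paths below are placeholders.
PDI_HOME=${PDI_HOME:-/opt/pentaho/data-integration}
JOB=${JOB:-/opt/pentaho/jobs/report_email.kjb}

CMD="$PDI_HOME/kitchen.sh -file=$JOB -level=Basic"
echo "0 6 * * * $CMD"
```

Add the printed line to the crontab of the user that owns the PDI installation.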
We are trying to connect to Microsoft SQL Server installed on an Azure VM (IaaS) from DataStage using an API.
Currently we use the JDBC connector to connect to SQL Server (IaaS) with a service account and its password. On the new server, however, we have to reset the password in Azure every three months, and the same service account is used by other applications.
Each reset means creating a change request to reflect the new password in the DataStage PROD environment. We are also getting a separate service account to use in DataStage.
To avoid the password-reset and lockout issues, we plan to use an API to fetch the password when connecting to the DB.
The API-based DB connection already works in Alteryx. Can you please let us know whether this is possible in DataStage 11.7.1.2, and how to do it? If an API connection is not possible, please suggest any other feasible solution for this problem.
I assume you know how to fetch the password via command line interface from your cloud service.
Store the password as datastage environment variable which is then used in the job.
Use a shell script to update the password. In the script, check first if the password has changed. If it did, run the dsadmin -envset command to set the environment variable to a new value. You might need to encrypt the new value using the encrypt command located in .../ASBNode/bin. Call the script every time before running the parallel job.
You should test whether a change to the environment variable is picked up by the job when the script and the job are run by the same sequence; it might not work if the parameter is passed through by the sequence.
Please read the IBM docs about the commands I mentioned.
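The steps above can be sketched as a wrapper script. This is a dry-run sketch, not a tested implementation: `fetch_password` is a placeholder for your cloud CLI call, the project name, variable name, and install paths are assumptions, and the `dsadmin` command is printed rather than executed so you can review it first:

```shell
#!/bin/sh
# Refresh a DataStage project environment variable when the vaulted
# password changes; call this before running the parallel job.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
PROJECT=${PROJECT:-MYPROJ}
CACHE=${CACHE:-$(mktemp)}   # last-seen password, to detect changes

fetch_password() {
  # Placeholder for your cloud call, e.g. "az keyvault secret show ...".
  echo "s3cret-from-vault"
}

new_pwd=$(fetch_password)
old_pwd=""
[ -f "$CACHE" ] && old_pwd=$(cat "$CACHE")

if [ "$new_pwd" != "$old_pwd" ]; then
  # Encrypt with the Information Server encrypt utility if available,
  # then set the project-level variable (printed here, not executed).
  enc=$(/opt/IBM/InformationServer/ASBNode/bin/encrypt.sh "$new_pwd" 2>/dev/null || echo "$new_pwd")
  echo "$DSHOME/bin/dsadmin -envset DB_PASSWORD -value $enc $PROJECT"
  printf '%s' "$new_pwd" > "$CACHE"
fi
```

Check the exact `encrypt` and `dsadmin -envset` syntax against the IBM documentation for your release before wiring this into a sequence.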
I have a PowerBI report with 2 tables sourced from separate web API calls to a similar service, let's call it MyService. The service returns json. The report refreshes perfectly within PBI Desktop. The report publishes to PowerBI.com where it runs without a hitch. On-demand refresh of the report's dataset in the PBI Service works perfectly too.
I can't schedule a refresh for this report because the option is greyed out. There is an error reported in the Data source credentials section where one of the two Web sources reports it cannot connect to MyService. The error is... "Your data source can't be refreshed because the credentials are invalid. Please update your credentials and try again."
Attempting to edit the credentials for the failing connection results in a 500 Internal Server Error.
This error is unexpected because I understood that on-demand and scheduled refreshes use the same data source(s). There is nowhere to specify different data sources/credentials for on-demand vs. scheduled refreshes, so I assume they share the same sets of credentials.
Is something weird going on or does my understanding of the innards of the PBI Service need realignment?
Resolved via workaround...
PROBLEM
Dataset credentials cannot be updated without causing a 500 error, and invalid credentials disable the scheduled refresh options. When publishing a .pbix file from PBI Desktop to the PBI Service, the publish may fail to update the dataset connection in the service, leaving it in an invalid state. Refreshes cannot be scheduled while any dataset connection is invalid.
WORKAROUND
Open the same .pbix file via PBI Service (i.e. PowerBI.com --> GetData), and the dataset connection will be updated. Credentials can now be set without error, thus allowing scheduled refreshes to be set.
For scheduled refresh to work without a gateway, your data sources must be online (cloud or SharePoint); otherwise you require a gateway. Ensure every data source in the .pbix file is cloud- or SharePoint-hosted. If the file mixes offline files with online sources, scheduling alone will not work and you need to set up a gateway.
I have published a report to the Power BI Service using the PowerShell cmdlets, but after publishing I still have to provide credentials manually in the Power BI Service before the report can be viewed.
Right now I am using the "Datasets - Update Datasources In Group" API to configure the connection details, but even with this I have to change the credentials manually in the Power BI Service.
NOTE: I am using DirectQuery to get the data for my reports.
This is the error I am getting:
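For reference, credentials (as opposed to connection details) are normally set through a different REST call, the Gateways UpdateDatasource endpoint. A dry-run sketch under that assumption; the gateway/datasource IDs, token, and username/password values below are all placeholders, and the request is printed rather than sent:

```shell
# Sketch: set datasource credentials via the Power BI REST API
# (Gateways - UpdateDatasource) instead of the service UI.
GATEWAY_ID="<gatewayId>"        # placeholder
DATASOURCE_ID="<datasourceId>"  # placeholder
TOKEN="<aad-access-token>"      # placeholder

# credentialDetails payload for Basic credentials; the inner
# "credentials" field is itself a JSON-encoded string.
BODY='{"credentialDetails":{"credentialType":"Basic","credentials":"{\"credentialData\":[{\"name\":\"username\",\"value\":\"svc_user\"},{\"name\":\"password\",\"value\":\"svc_pwd\"}]}","encryptedConnection":"Encrypted","encryptionAlgorithm":"None","privacyLevel":"None"}}'

# Print the request instead of sending it (dry run):
echo curl -X PATCH \
  "https://api.powerbi.com/v1.0/myorg/gateways/$GATEWAY_ID/datasources/$DATASOURCE_ID" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d "$BODY"
```

For a cloud (non-gateway) source, the same endpoint is used with the bound gateway and datasource IDs returned by the Get Datasources call.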
I have published an extract to Tableau Server which I need to refresh every 10 minutes. I am able to do so when a single connection is involved, using the Tableau command line:
tableau refreshextract --server https://online.tableau.com --username user1 --password pass --project project --datasource data_123 --source-username connection1 --source-password connection1
How can I refresh the extract if it was generated from two or more connections?
I have tried the above command, but it is not working:
Error: There is no active connection to the data source
Set up a schedule on the server. Sign in to the server, go to the Schedules page, and click New Schedule. Set the recurrence to be every 10 minutes.
Enable scheduled extract refreshes and failure emails. As a server or site administrator, you can enable schedules, as well as email notification when extract refreshes fail.
Select Settings, and then go to the General page.
Under Email Notification, select Send email to data source and workbook owners when scheduled refreshes fail.
Under Embedded Credentials, select both check boxes to allow publishers to embed credentials and schedule extract refreshes.
Publish a workbook with an extract. In Tableau Desktop, select Server > Publish Workbook. Sign in to the server if you're not already. In the Publish Workbook to Tableau Server dialog box, click Schedules & Authentication. Under Extract Schedule, select the schedule from the list.
See the Tableau Quick Start at https://onlinehelp.tableau.com/current/server/en-us/qs_refresh_extracts.html
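As an alternative to the server-side schedule, the refresh can be driven from cron with tabcmd, which refreshes the published datasource with whatever credentials were embedded at publish time (covering multi-connection extracts). A dry-run sketch; server, site, and datasource names are the question's placeholders, and the cron line is printed rather than installed:

```shell
# Sketch: trigger an extract refresh every 10 minutes with tabcmd.
SERVER="https://online.tableau.com"
SITE="mysite"        # placeholder site name
DS="data_123"

LOGIN="tabcmd login -s $SERVER -t $SITE -u user1 -p pass"
REFRESH="tabcmd refreshextracts --datasource $DS --synchronous"
echo "*/10 * * * * $LOGIN && $REFRESH"
```

Note that credentials for each underlying connection must be embedded when the datasource is published; tabcmd itself only triggers the refresh.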
I have a few SSRS 2008 reports whose databases are CRM databases. I have created a group of 10 users, each with different permissions (a user can see data only from those databases he has access to under CRM-side security).
When a user tries to view the reports from his own machine, he gets this error every time:
An error has occurred during report processing. (rsProcessingAborted)
Cannot create a connection to data source 'DB_NAME'. (rsErrorOpeningConnection)
Cannot open database "CRM_Database" requested by the login. The login failed. Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'.
I am using Windows authentication. Within the server the reports work fine; outside the server we get this error. I got a few suggestions that it is a double-hop issue. Possible solutions:
Use stored credentials. (In my case I can't, because every user has access to a different database; a user can select in the report any database he has access to and will get data only for that database.)
Kerberos configuration. (I don't know how to do that with Windows 7 and SQL 2008 R2.)
Help would be appreciated.
"NT AUTHORITY\ANONYMOUS LOGON" is the built-in IIS account on your report server. The reports are being executed under this account, which serves the page to the user.
Update your data source to use "Connect using: Credentials supplied by the user running the report" and check "Use as Windows credentials when connecting to the data source" (Kerberos); this works if each user's AD account has the appropriate DB permissions on the SQL Server. Windows integrated security also works if you are on the domain.
Since you need to pass each user's own account to the DB for authentication, credentials stored securely on the server (stored credentials) will not work for the scenario you describe, as every user would hit the database with the same credentials.
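For the Kerberos route, the usual prerequisite for fixing the double-hop is registering SPNs for the SQL Server and report server service accounts and then enabling delegation. A sketch of the commands, printed rather than executed; all host and account names below are placeholders, and setspn must be run with domain-admin rights:

```shell
# Sketch: SPN registrations needed before Kerberos delegation
# (the standard fix for the ANONYMOUS LOGON double-hop error).
SQL_HOST="sqlhost.corp.local"   # placeholder SQL Server host
SQL_SVC='CORP\sqlservice'       # placeholder SQL service account
RS_HOST="rshost.corp.local"     # placeholder report server host
RS_SVC='CORP\rsservice'         # placeholder SSRS service account

echo "setspn -S MSSQLSvc/$SQL_HOST:1433 $SQL_SVC"
echo "setspn -S HTTP/$RS_HOST $RS_SVC"
echo "rem Then enable 'Trust this user for delegation' on $RS_SVC in Active Directory."
```

After the SPNs exist, constrained delegation is configured on the SSRS service account in Active Directory Users and Computers.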