Convert Power BI Desktop dataset to push dataset - PowerShell

I have a report and dataset created in Power BI Desktop and published to Power BI Online. I need to update the rows of this dataset from an API. I tried the REST API, but rows can only be pushed to a push dataset, so I'm looking for a way to convert my normal dataset. Would it be possible to convert the existing dataset, or at least copy its structure, to make creating this push dataset easier?
I tried converting it with the Windows PowerShell module PowerBIPS, but that module has been throwing an error since last year and apparently no longer works.
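There is no REST operation that converts a published Desktop dataset in place, so as a workaround the push dataset has to be created from scratch with the same schema. Below is a minimal PowerShell sketch of that creation step using the Power BI REST API's push-dataset endpoint; the dataset name, table and columns are hypothetical and would need to be replaced with your real model's structure, and the access token is assumed to have been acquired separately (e.g. with the MicrosoftPowerBIMgmt module).

# Rough sketch (not a conversion): create a new push dataset whose table schema
# mirrors the original model. Dataset/table/column names below are hypothetical.
# Assumes an Azure AD access token for the Power BI API has already been obtained.
$accessToken = "<your-access-token>"
$headers = @{ Authorization = "Bearer $accessToken" }

$datasetDefinition = @{
    name        = "SalesPush"          # hypothetical dataset name
    defaultMode = "Push"
    tables      = @(
        @{
            name    = "Sales"          # hypothetical table
            columns = @(
                @{ name = "Date";   dataType = "DateTime" },
                @{ name = "Region"; dataType = "String" },
                @{ name = "Amount"; dataType = "Double" }
            )
        }
    )
} | ConvertTo-Json -Depth 5

# POST /datasets creates the push dataset in "My workspace";
# use /groups/{groupId}/datasets to target a specific workspace instead.
Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/datasets" `
    -Headers $headers -ContentType "application/json" `
    -Body $datasetDefinition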

Related

Exporting AnyLogic dataset log files as outputs in Cloud model

I want to upload my AnyLogic model to AnyLogic Cloud through the run configuration process, and for outputs I want to add several dataset log files. I was wondering how I can do that?
Just add the datasets themselves as model outputs; graphs for them will then be included in the Cloud experiments' dashboards.
The AnyLogic Cloud doesn't currently support Excel outputs from your models (just Excel inputs), so you can't use AnyLogic database dataset logs exported to Excel. But you can now (since the most recent version) download the outputs from all your model runs in JSON format (which would include the dataset data).
Otherwise you have to do it 'Cloud-natively' instead; e.g., write the relevant output data to some Cloud-based database (like an Amazon database). That obviously requires the Java knowledge to do so (and an AnyLogic Cloud subscription, or a Private Cloud installation, to be able to access external resources).

Do I need storage (of some sort) when pulling data in Azure Data Factory?

*Data newbie here*
Currently, to run analytics reports on data pulled from Dynamics 365, I use Power BI.
The issue with this is that Power BI is quite slow at processing large data. I carry out a number of transformation steps (e.g. merge, join, deleting or renaming columns, etc.), so when I try to run a query in Power BI with those steps, it takes a long time to complete.
So, as a solution, I decided to make use of Azure Data Factory (ADF). The plan is to use ADF to pull the data from CRM (i.e. Dynamics 365), perform the transformations and publish the data. Then I'll use Power BI for visual analytics.
My question is:
What Azure service will I need in addition to Data Factory? Will I need to store the data I pulled from CRM somewhere, like Azure Data Lake or Blob storage? Or can I do the transformation on the fly, right after the data is ingested?
Initially, I thought I could use the 'copy' activity to ingest data from CRM and start playing with the data. But the copy activity requires me to provide a sink (a destination for the data, which has to be storage of some sort).
I also thought I could make use of the 'lookup' activity. I tried to use it, but I get errors (and no exception message is produced).
I have scoured the internet for a similar process (i.e. Dynamics 365 -> Data Factory -> Power BI), but I've not been able to find one.
Most of the processes I've seen, however, utilise some sort of data storage right after data ingestion.
All responses are welcome, even if you believe I am going about this the wrong way.
Thanks.
A few things here:
The copy activity just moves data from a source to a sink. It doesn't modify it on the fly.
The lookup activity is just there to look up some attributes to use later in the same pipeline.
ADF cannot publish a dataset to Power BI (although it may be able to push to a streaming dataset).
Your approach is correct, but you need that last step of transforming the data. You have a lot of options here, but since you are already familiar with Power BI you can use Wrangling Dataflows, which allow you to take a file from the data lake, apply some Power Query and save a new file in the lake. You can also use Mapping Dataflows, Databricks, or any other data transformation tool.
Lastly, you can pull the files from the data lake with Power BI to build your report on the data in this new file.
Of course, as always in Azure, there are a lot of ways to solve problems or architect services; this is the one I consider simplest for you.
Hope this helped!
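To make the sequencing concrete, here is a small, hedged sketch (PowerShell with the Az.DataFactory module) of triggering such an ADF pipeline and waiting for it to finish before Power BI reads the output file from the lake. The resource group, factory and pipeline names are placeholders, not anything defined in this thread.

# Sketch: kick off the ADF pipeline that copies/transforms the CRM data, then
# poll until the run finishes. Requires the Az.DataFactory module and an
# authenticated session (Connect-AzAccount). All names are placeholders.
$rg       = "my-resource-group"     # hypothetical
$factory  = "my-data-factory"       # hypothetical
$pipeline = "CopyDynamicsToLake"    # hypothetical

$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName $rg `
             -DataFactoryName $factory -PipelineName $pipeline

do {
    Start-Sleep -Seconds 30
    $run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName $rg `
               -DataFactoryName $factory -PipelineRunId $runId
    Write-Host "Pipeline status: $($run.Status)"
} while ($run.Status -in @("InProgress", "Queued"))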

How to publish blended data sources?

I have a workbook with blended data sources and calculated fields referencing these data sources.
I get an error alert when publishing to Tableau Server.
How do I publish the data sources?
I am using Tableau 2019.
Create an extract of the blended data source and publish that extract to the server. The extract will include all the calculated fields that are in the current workbook.
Generally, for this type of use case, I suggest publishing your data extracts separately. They can contain your calculated fields. Publishing the calculated fields is actually more performant anyway, as many of them become materialised.
You can connect to both server data sources in desktop, and perform the data blending within the Tableau workbook using the published data sources. You can create additional calculated fields, including those from data blends, within the workbook. You're then able to publish the workbook and those calculated fields will remain at the workbook level, not the data source.
Other users connecting to the data sources will only see the calculations published into the extract, not the calculations published with the workbook.
Hope that makes sense.

Possible to modify or delete rows from a table in a BigQuery dataset with a Cloud Data Fusion pipeline?

I have a requirement to build a Data Studio dashboard using data from a BigQuery dataset.
I have imported my data into BQ using Data Fusion from an on-premises MS SQL Server. The requirement is that I have to delete the last 5 days of records and import new, updated records for the same time range on top of the records in the BQ dataset.
So far I have been able to do all of this with the pipeline, but when I run the pipeline it simply appends the data to the BQ table again and I end up with duplicate data.
I am looking for a way to manipulate the data in BQ before it receives new data from the pipeline. Is there anything available in Data Fusion that can help with this?
Regards
We recently added this functionality to the google-cloud plugins. You can check the changes here - Google-Cloud-Plugin PR#140. You can either wait for the newer version of the google-cloud plugins to be released, or you can build it locally and install the plugin in the Data Fusion instance you are testing this on.
Hope this helps.
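Until a plugin build with that capability is installed, one possible interim workaround (outside Data Fusion, so treat this as a sketch rather than anything the plugin provides) is to clear the overlapping 5-day window in BigQuery before the pipeline appends, for example with a DELETE issued through the bq command-line tool. The project, dataset, table and column names below are hypothetical.

# Sketch of a pre-load cleanup step: remove the last 5 days of rows from the
# target table so the re-imported records don't duplicate. Requires the Google
# Cloud SDK (bq CLI) to be installed and authenticated; names are hypothetical.
$query = @'
DELETE FROM `my_project.my_dataset.sales_records`
WHERE record_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 5 DAY)
'@

bq query --use_legacy_sql=false $query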

How to import a CSV file as a dataset to Power BI using the REST API?

I want to automate the import process in Power BI, but I can't find a way to publish a CSV file as a dataset.
I'm using a C# solution for this.
Is there a way to do that?
You can't directly import CSV files into a published dataset in the Power BI Service. The AddRowsAPIEnabled property of datasets published from Power BI Desktop is false, i.e. this API is disabled for them. Currently the only way to enable it is to create a push dataset programmatically using the API (or create a streaming dataset from the site). In that case you will be able to push rows to it (read the CSV file and push batches of rows, either using C# or some other language, even PowerShell), and you will be able to create reports using this dataset. However, there are a lot of limitations, and you should take care of cleaning up the dataset (to avoid reaching the limit of 5 million rows; you can't delete "some" of the rows, only truncate the whole dataset), or make it basicFIFO, which lowers the limit to the most recent 200k rows.
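As a rough illustration of the PowerShell route mentioned above (a sketch under assumptions, not a complete solution): assuming the push dataset and its table already exist and the CSV columns line up with the table schema, reading the file and pushing it in batches could look roughly like this. The dataset ID, table name and file path are placeholders.

# Sketch: read a CSV and push its rows to an existing push dataset in batches.
# Dataset ID, table name, file path and columns are assumptions. Note that
# Import-Csv yields string values, which may need casting to match column types.
$accessToken = "<your-access-token>"
$headers     = @{ Authorization = "Bearer $accessToken" }
$datasetId   = "<push-dataset-id>"
$tableName   = "Sales"                              # hypothetical table
$uri = "https://api.powerbi.com/v1.0/myorg/datasets/$datasetId/tables/$tableName/rows"

$rows      = Import-Csv -Path "C:\data\sales.csv"   # hypothetical file
$batchSize = 5000                                   # stay under the per-request row limit

for ($i = 0; $i -lt $rows.Count; $i += $batchSize) {
    $batch = $rows[$i..([Math]::Min($i + $batchSize, $rows.Count) - 1)]
    $body  = @{ rows = @($batch) } | ConvertTo-Json -Depth 3
    Invoke-RestMethod -Method Post -Uri $uri -Headers $headers `
        -ContentType "application/json" -Body $body
}

# The API can only truncate the table, not delete selected rows:
# Invoke-RestMethod -Method Delete -Uri $uri -Headers $headers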
However, a better solution would be to automate the import of these CSV files into some database and make the report read the data from there. For example, import the files into Azure SQL Database or Databricks and use that as the data source for your report. You can then schedule the refresh of this dataset (in case you use Import mode) or use DirectQuery.
After a Power BI update, it is now possible to import the dataset without importing the whole report.
So what I do is import the new dataset and update the parameters that I set up for the CSV file source (stored in the data lake).
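For reference, here is a hedged sketch of driving that same flow from PowerShell with the MicrosoftPowerBIMgmt module. The workspace/dataset IDs, parameter name and file URL are placeholders, and it assumes the parameter is updatable through the API.

# Sketch: point the published dataset's parameter at the new CSV in the lake,
# then trigger a refresh so the report picks it up. IDs, the parameter name and
# the file URL are placeholders.
Connect-PowerBIServiceAccount        # interactive sign-in

$groupId   = "<workspace-id>"
$datasetId = "<dataset-id>"

$body = @{
    updateDetails = @(
        @{ name = "CsvFilePath"; newValue = "https://mylake.dfs.core.windows.net/container/sales.csv" }
    )
} | ConvertTo-Json -Depth 3

Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$groupId/datasets/$datasetId/Default.UpdateParameters" `
    -Body $body

Invoke-PowerBIRestMethod -Method Post `
    -Url "groups/$groupId/datasets/$datasetId/refreshes"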