How to import data into a PowerApps for Teams table lookup column?

I have a table set up in PowerApps for Teams. It has a lookup column referring to another table, and the column works as intended. I have external data that I wish to populate this table with. When using the Dataflow created by the Import function in the UI, I cannot put data into this one column; it is not available for import. All the other data does get pushed into the table as desired, and even one of the Choice columns receives its data.
How can I enable this lookup column to show up and receive data? The solutions I have found are for the full PowerApps suite, for which I don't have a license. They used alternate keys, but I cannot find a way to enable alternate keys in this stripped-down version.
I have used data from an Excel file, both in OneDrive and external.
I have used data from a SharePoint list, with the same result.
I have attempted to use an alternate key on the receiving table, but it doesn't seem to be an option in PowerApps for Teams.
I have formatted the data to be imported to match the lookup options of the receiving table.

Related

Multiple table import from html page

I am a beginner with databases. I decided on Postgres, as I've learned it can be used with Python.
I have reports that I receive as HTML files; each file has multiple (100+) tables when parsed with pandas' DataFrame functions. There is no unique ID common to all tables, and each table has unique columns.
Is it possible to import all the tables and merge them into a single table with ALL the columns, with each report becoming a single row in this new table, using a built-in PostgreSQL feature? Or do I have to develop a data pipeline in Python and add them manually?
I hope my question is clear enough.
Thank you.
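There is no built-in PostgreSQL feature that merges heterogeneous HTML tables into one wide row; the usual approach is to flatten each report in Python before loading. A minimal sketch of that flattening step, assuming the tables have already been parsed (e.g. with pandas' `read_html`) and reduced to plain dicts for illustration (table names here are hypothetical):

```python
# Sketch: flatten one report's tables into a single wide row.
# Assumes each parsed table has been reduced to a dict of column -> value;
# table names and values below are hypothetical.

def flatten_report(tables: dict[str, dict[str, object]]) -> dict[str, object]:
    """Merge {table_name: {column: value}} into one wide row,
    prefixing each column with its table name to avoid collisions."""
    row = {}
    for table_name, columns in tables.items():
        for col, value in columns.items():
            row[f"{table_name}.{col}"] = value
    return row

report = {
    "summary": {"date": "2024-01-01", "total": 42},
    "details": {"total": 7, "note": "ok"},
}
wide_row = flatten_report(report)
# wide_row has keys "summary.date", "summary.total", "details.total", "details.note"
```

The resulting dict (one per report) can then be inserted as a single row, e.g. into a jsonb column, which sidesteps the problem of every report having a different column set.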

Datastore entities imported into BigQuery tables without an id column, only __key__.id

We are using the Datastore export feature for exporting Datastore entities and importing them into a BigQuery table. It works perfectly fine.
The only problem is that the entities imported into BigQuery tables do not have an 'id' column, only '__key__.id'.
For now we are using a hack (which is unnecessary overhead): the 'id' column is generated by the following query, written to a target table.
select __key__.id as id, * from projectId.datasetId.ImportedTable
Actually, this transfer has to be automated via cron jobs, and there is no scope for such a hack there, so I wanted to check whether there is any way for the imported table itself to expose an 'id' column.
Can the way we generate entities and choose their primary key strategy play any role here?
As per the documentation, when importing data into BigQuery from Datastore, BigQuery creates a RECORD type for each entity's unique key. As this is automated, the way it is done can't be modified.
A good possibility may be updating the entities to add an extra property and copying the ID into it; that way the column will be named 'id' instead of '__key__.id' once the import to BigQuery is done.
As this feature is not yet included in the export service between Datastore and BigQuery, you can create a Feature Request in Google Cloud Platform's public issue tracker.
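A sketch of that copy-the-ID workaround. The core step is shown as a pure function; the google-cloud-datastore client calls are indicated only in comments, since the kind name and credentials are assumptions about your setup:

```python
# Sketch of the suggested workaround: duplicate each entity's key id into
# a regular property so it surfaces as a plain "id" column in BigQuery.

def copy_key_id(properties: dict, key_id: int) -> dict:
    """Return the entity's properties with the key id copied
    into an explicit 'id' property."""
    updated = dict(properties)
    updated["id"] = key_id
    return updated

# With the real client it would look roughly like this (untested sketch;
# "MyKind" is a placeholder for your entity kind):
#   from google.cloud import datastore
#   client = datastore.Client()
#   for entity in client.query(kind="MyKind").fetch():
#       entity.update(copy_key_id(dict(entity), entity.key.id))
#       client.put(entity)

example = copy_key_id({"name": "widget"}, 12345)
# example == {"name": "widget", "id": 12345}
```

After a backfill like this, newly created entities would also need the property set at write time so the cron-driven exports stay consistent.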

Data integration, multiple databases, unique incremental SOR_id using Talend

I'm trying to integrate multiple databases using Talend and, in turn, have an SOR_id in each table for auditing purposes. Is it possible to map multiple source tables simultaneously to a destination table with an SOR_id that is auto-incremented? Would I get incremental values for each source table's rows?
I have approached this another way, as shown in the image, so that my SOR_id can be accounted for.
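Conceptually, what a Talend sequence (e.g. a `Numeric.sequence` expression in a tMap) does here is assign one monotonically increasing surrogate id across rows arriving from several sources. A minimal Python sketch of that idea, with hypothetical table contents:

```python
from itertools import count

# Sketch: assign one shared, monotonically increasing SOR_id across rows
# coming from multiple source tables. Table names and rows are hypothetical.

def assign_sor_ids(*sources):
    """Yield rows from all sources with a single incremental SOR_id sequence."""
    seq = count(start=1)
    for source in sources:
        for row in source:
            yield {"SOR_id": next(seq), **row}

table_a = [{"name": "alpha"}, {"name": "beta"}]
table_b = [{"name": "gamma"}]
rows = list(assign_sor_ids(table_a, table_b))
# SOR_ids come out as 1, 2, 3 across both sources
```

The key design point is that the counter lives outside the per-source loop, so the ids are unique across all sources rather than restarting at 1 for each table.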

Programmatically replace data table or data field in Tableau

In my company we have 1K+ Tableau workbooks, all using the same Vertica data source via a multiple-table connection or custom SQL. We often end up in a situation where reports stop working because the underlying data source changed: a table was renamed, a field was removed, etc.
My question is how we can proactively react to these changes.
Can we correct the source code of the Tableau workbooks to batch-replace deprecated query parts?
Or can we monitor which data tables are used in a workbook, with or without parsing the workbook's source code, to create an alerting system?
Thanks
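One starting point: .twb workbooks are XML (and .twbx files are zip archives containing a .twb), so table references can be inventoried with an ordinary XML parser. A sketch that collects the `table` attributes of `<relation>` elements; the element and attribute names below match what .twb files typically contain, but verify them against your own workbooks:

```python
import xml.etree.ElementTree as ET

# Sketch: inventory the tables a Tableau workbook references by parsing
# its XML. Element/attribute names are based on typical .twb structure.

def tables_in_workbook(twb_xml: str) -> set[str]:
    """Collect table names referenced by <relation> elements."""
    root = ET.fromstring(twb_xml)
    return {
        rel.get("table")
        for rel in root.iter("relation")
        if rel.get("table")
    }

sample = """
<workbook>
  <datasources>
    <datasource name="vertica">
      <connection dbname="warehouse">
        <relation name="sales" table="[public].[sales]" type="table"/>
      </connection>
    </datasource>
  </datasources>
</workbook>
"""
print(tables_in_workbook(sample))
```

Running this inventory over all workbooks and diffing it against the current Vertica catalog would give the alerting system asked about; batch rewrites could use the same parse step, though editing the XML directly is unsupported by Tableau and should be tested on copies.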

What applications do you use for data entry and retrieval via ODBC?

What apps or tools do you use for data entry into your database? I'm trying to improve our existing (cumbersome) system, which uses a PHP web-based interface for entering data one ... item ... at ... a ... time.
My current solution is to use a spreadsheet. It works well for human-readable text and numbers, but not for the foreign keys used to join to other tables' rows.
Imagine that I want a row of data to include what city someone lives in. The column holding this is id_city, which is keyed to the "city" table, which has two columns: id (serial) and name (text).
I envision extending the spreadsheet's capabilities with a dropdown menu in every row of the id_city column that would let the user select a city (displaying the city names as text) while actually storing the chosen city's id. This way, the spreadsheet would:
(1) show a great deal of data on each screen and
(2) could be exported as a csv file and thrown to our existing scripts that manually insert rows into the database.
I have been playing around with MS Excel and Access, as well as OpenOffice's suite, but have not found something that gives me the functionality I mention above.
Other items on my wish-list:
(1) dynamically fetch the names of the cities the user can select.
(2) allow the user to push the data directly into the backend (not via external files/scripts).
(3) if any of the data changes in the backend, let the user refresh what's on screen to reflect the recent changes.
Do you know how I could improve the process of data entry? What tools do you use? I use PostgreSQL for the backend and have access to MS Office, OpenOffice, as well as web based solutions. I would love a solution that is flexible, powerful, and doesn't require much time to develop or deploy (I know, dream on...)
I know that pgAdmin3 has similar functionality, but from what I have seen, it is more of an administrative tool rather than something for users to use.
As j_random_hacker noted, I've used MS Access for years (since Access 97) to connect to an ODBC Data Source.
You can do this via linking to external tables: (in Access 2010:)
New -> Blank Database
External Data -> ODBC Database -> Link to Data Source
Machine Data Source -> New -> System Data Source -> Select Driver (Oracle, or whatever) -> Finish
Enter a new name for your DSN, then all of the connection parameters, then click OK
Select the newly created DSN and hit OK.
You can do so much once Access sees your external table as a linked table, including sorting, filtering, etc. There's one caveat: as far as I can tell, ALL operations happen on the client side unless you're using a pass-through query. That's fine if you're looking at a table with 3000 records. With 2,000,000 records, that hurts. To be clear, all data in the table comes down to the workstation, for all tables being joined, and the join happens client-side, NOT server-side.
There are usually standalone tools for basic database management - e.g., for Oracle and MySQL a free tool called SQL Developer suffices for basic database data entry.
For more complex types (especially involving clobs) I can usually knock an application together in Java+SWT in a day if we already have the model and DAOs available on the Java side. Yeah, you have to put some effort in, but if it will be used regularly in the future then it is probably worth it.
In your case (well, the case where you have bulk imports of data) knocking up some Perl that reads from the CSV and does the city id lookup would be trivial to implement. Maybe a waste for a one-off thing? Depends on the amount of data to import.
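That CSV-plus-lookup idea is just as trivial in Python. A sketch using the asker's column names (`city`, `id_city`); the `city_ids` dict stands in for a `SELECT id, name FROM city` query against the backend:

```python
import csv
import io

# Sketch: resolve human-readable city names in a CSV export to their
# id_city foreign-key values before insertion. city_ids is a stand-in
# for the result of "SELECT id, name FROM city".

city_ids = {"Springfield": 1, "Shelbyville": 2}

def resolve_city_ids(csv_text: str) -> list[dict]:
    """Replace the 'city' column in each row with its 'id_city' value."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        city = row.pop("city")
        row["id_city"] = city_ids[city]
        rows.append(row)
    return rows

data = "name,city\nHomer,Springfield\nNed,Springfield\n"
resolved = resolve_city_ids(data)
# resolved[0] == {"name": "Homer", "id_city": 1}
```

The resolved rows can then be handed to the existing insert scripts, or inserted directly with a Postgres driver, which covers the "not via external files" wish as well.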
I would be surprised if MS Access can't do what you're looking for -- this is basically the exact use case for it. Namely, quickly throwing together a nice UI for a simple CRUD DB application that a spreadsheet doesn't quite stretch to.
This is an answer, technically, but not a recommendation:
I've used Excel and SSIS for importing simple data entry files into MS SQL, but it's not adequate - there's very little ability to control the data, and SSIS is so very touchy, especially when working with Excel.
MS Access does not work well with some non-Microsoft databases. There is an open-source equivalent called Apache OpenOffice Base you may want to try.