Postgres/PGAdmin exporting empty CSV

I've generated a large table (1.1 million rows) in Postgres/PGAdmin that I'd like to export to CSV. When I click the "Save results to file (F8)" button, I get the "Downloading Results..." spinning wheel, and then the window letting me name the CSV and save it where I want on my computer. But once that CSV is on my computer it's empty. I've tried restarting PGAdmin and my computer but it's still happening.
Does anyone know why this is happening or how to fix it? I would just copy/paste the table into a text file, but I think it's too large.
I couldn't find any prior questions about this.
Thanks.

I ran into the same problem. My solution was to create a table from your intended SELECT statement.
Example:
CREATE TABLE query_results AS (
    SELECT * FROM XXXX
);
Then, export the table to CSV.
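If pgAdmin still produces an empty file, a minimal alternative (just a sketch, assuming you can connect with psql; the file lands in the directory you started psql from unless you give a full path) is to export straight from psql with \copy, which writes the CSV on the client side:
\copy query_results TO 'query_results.csv' WITH (FORMAT csv, HEADER)
The same works with a query instead of a table name, e.g. \copy (SELECT * FROM query_results LIMIT 10) TO 'sample.csv' WITH (FORMAT csv, HEADER)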

Related

How to copy data from a CSV to an Azure SQL Server table?

I have a dataset based on a CSV file. This exposes data as follows:
Name,Age
John,23
I have an Azure SQL Server instance with a table named: [People]
This has columns
Name, Age
I am using the Copy Data activity and trying to copy data from the CSV dataset into the Azure table.
There is no option to indicate the table name as a source; instead, I have a space to input a stored procedure name.
How does this work? Where do I put the target table name in the image below?
You should definitely have a table name to write to; if you don't have a table, something is wrong with your setup. Make sure you have a table to write to and that the field names in your table match the fields in the CSV file, then follow the steps outlined in the walkthrough linked below. There are several steps to click through, but they are all pretty intuitive, so follow them one by one and you should be fine.
http://normalian.hatenablog.com/entry/2017/09/04/233320
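For reference, a minimal [People] table matching the sample CSV above might look like the sketch below (the column types and lengths are assumptions; adjust them to your data):
CREATE TABLE [People] (
    [Name] NVARCHAR(100) NOT NULL,
    [Age]  INT NOT NULL
);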
You can add records into the SQL Database table directly, without stored procedures, by configuring the table value on the Sink Dataset rather than on the Copy Activity, which is what is happening in your setup.
Have a look at the screenshot below, which shows the Table field within my dataset.

Why does my Tableau tooltip get changed when I change my data source?

I am moving from a CSV to PostgreSQL for my Tableau workbook. Both have the same field names and the exact same data types. However, when I change my data source, the tooltip gets random text that breaks the filter in the tooltip viz.
I tried replacing the CSV file with the same CSV file and the same thing happened, so I think this is a Tableau issue and not a database issue.
<Sheet name="Tooltip: Level 2 Site Scores" maxwidth="300" maxheight="300" filter="<Site>,<[federated.02mez2l0u2i0o018sk45f0skmrv7].[none:level_2:nk]>"> (This is what happens)
<Sheet name="Tooltip: Level 2 Site Scores" maxwidth="300" maxheight="300" filter="<Level 2>,<Site>"> (This is what I want)
The 'Level 2' field gets messed up for some reason
If you open a Tableau desktop file in a text editor, you'll see that it is XML. As an example, I have a file with the following line; Tableau assigns a unique ID to each calculated field I create, which here is "Calculation_104990228446113793".
<column-instance column='[Calculation_104990228446113793]' derivation='None' name='[none:Calculation_104990228446113793:nk]' pivot='key' type='nominal' />
The same can be seen for data source references.
<datasource caption='my_data_source' inline='true' name='federated.0dbu8r50hqicaj1fm4f2b1r4o814' version='10.5'>
So when you swap a data source, the unique IDs change, which causes the behavior you're seeing. I'm not sure whether your issue is a bug; you could report it. But this is what is happening in your case.

phpMyAdmin -- import one table into an existing database

I have a new table I want to add to an existing db, the structure of which I exported to a file table.sql.
table.sql has 75 columns, so naturally I would rather find a way to copy/import the structure into the existing db than create a new table and manually define each of the 75 fields.
Is there a way to import this table structure into my database mydb (which is populated with data)? There has to be -- this is computing. I am staring at phpMyAdmin and can't figure out how to do this.
Any suggestions would be welcome.
Thanks in advance.
Okay, I figured out how to do it. I just opened the exported .sql file, copied/pasted the "CREATE TABLE..." statement into the database's SQL window in phpMyAdmin, and it worked -- I now have that new 75-column table.
But I'm still a bit mystified about why phpMyAdmin gave a success message when I tried to import the .sql file but did not display the allegedly imported table.
There is an Import tab in phpMyAdmin.
Open the database you would like to import into, then click on the "Import" tab and upload your file.
You can learn more here: http://www.techrepublic.com/blog/smb-technologist/import-and-export-databases-using-phpmyadmin/
An alternate way would be to copy the MySQL statement(s) in your table.sql file, open the "SQL" tab, paste them into the box, and then run the query.
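For example, the statement you paste would be the CREATE TABLE block from table.sql, roughly along the lines of the trimmed sketch below (the table and column names here are made up; your file will list all 75 columns):
CREATE TABLE `my_new_table` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `name` VARCHAR(255) DEFAULT NULL,
  -- ...the remaining columns from table.sql...
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;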
I hope this helps!
It's easy. Choose your database, then in the Import tab click "Choose File". Select your file, then press Go. It worked for me; you can find the new table in your database.

Kentico Import Toolkit 8.1

I am currently using the Kentico Import Toolkit to create documents in the tree.
At this point I have imported around 100 documents using the toolkit, and they are all in the correct place in the tree. The issue is that since I imported these documents, my spreadsheet has been updated and extra fields and data have been added. How do I import this extra data into the currently existing documents? Bear in mind that I don't want other fields or data to be affected, as some of the documents have since been updated by content editors in CMS Desk with content that isn't in the spreadsheet.
The Import Toolkit is not the right tool for this task. Even if you select "Import new and overwrite existing pages", it will overwrite most of your columns: it only preserves system and ID columns from the existing documents, and all other columns get overwritten.
You can either write a piece of custom code or try the following:
Open SSMS and navigate to the coupled table of your page type (something like CONTENT_MyDocType). This is where your custom columns are stored.
Right click -> Edit top 200 rows
Click "Show SQL Pane"
Adjust the columns, ORDER BY and WHERE clause to match your Excel file, then re-run the query (see the sketch after these steps)
Select the desired rows in your Excel file and copy them to the clipboard
Paste the data into SSMS
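For instance, the query in the SQL pane might end up looking roughly like this; the column names are only illustrative, and the coupled table name is the example from step 1:
SELECT TOP (200) MyDocTypeID, Title, ExtraField1, ExtraField2
FROM CONTENT_MyDocType
WHERE ExtraField1 IS NULL
ORDER BY MyDocTypeID;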
rocky is right: the Import Toolkit is meant for importing complete objects, not partial or continuous updates.
You could map the fields that you know have not changed in the spreadsheet to a SQL query that selects the value from the target database.
To achieve this, insert #<target> at the beginning of the SQL SELECT statement you map the field to.
It will be rather laborious though, and it also requires certain knowledge about the nature of the spreadsheet changes.
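Very roughly, such a mapping might look like the line below. This is only a sketch: the coupled table name is borrowed from the answer above, and the column names and the key used to match the existing row are purely illustrative (the actual columns depend on your page type):
#<target> SELECT MyUnchangedColumn FROM CONTENT_MyDocType WHERE MyDocTypeID = 123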

Generating all 'data dictionary' reports under each 'object' in postgres

I have a database with about 50 something tables. I would like to run the report "Data Dictionary" on each table.
Ideally, I would like them all to be in one report: for example, in PgAdminIII, if I right-click and select "Tables", I would get a report of all the 'objects' with a data dictionary report under each one.
Is there an automatic way of doing this, or a plugin that I can install for Postgres? Or is there something analogous to this?
If I understand correctly, you're referring to the ability to right-click on a table in PgAdminIII and select Reports > Data Dictionary report?
I'm not aware of any way to do that from PgAdminIII. You could look into using a different tool such as SchemaSpy. Another option (as alluded to by #kgrittn) is to use psql \d with the \H flag to generate HTML output. My solution (since SchemaSpy didn't do what I needed, and I needed the same output for both Postgres and Oracle) was to roll my own using Perl, DBD::Pg and Template::Toolkit.
Update: Added GitHub link.
I wrote a fairly simple Postgres data dictionary generator in Python that spans all schemas and tables within a specified database. If it doesn't have exactly what you want it would be fairly easy to modify.
https://github.com/kylejmcintyre/pypostgreports
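If a single rough listing is enough, a plain query against information_schema can also serve as a minimal data dictionary across all tables. This is just a sketch; the schema filter is an assumption you may want to adjust:
SELECT c.table_schema, c.table_name, c.column_name, c.data_type, c.is_nullable, c.column_default
FROM information_schema.columns c
JOIN information_schema.tables t
  ON t.table_schema = c.table_schema AND t.table_name = c.table_name
WHERE t.table_type = 'BASE TABLE'
  AND c.table_schema NOT IN ('pg_catalog', 'information_schema')
ORDER BY c.table_schema, c.table_name, c.ordinal_position;
Run from psql with \H toggled on (as mentioned above), the result comes out as an HTML table.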