How to list all tables in sqlite.swift - swift4

Apparently I'm missing something here.
Given that you want to connect to an unknown SQLite database with multiple tables, is there a way to call something like .tables on
let db = try Connection("path/to/db.sqlite3")
to retrieve a list of its tables and then create references to them?
I tried (amongst other things)
let statement = try db.prepare("select * from sqlite_master where type='table'")
let tables = try statement.run()
which returned a list of bindings that I have no idea how to deal with.
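For reference, here is a minimal sketch of one way to deal with those bindings, assuming the standard SQLite.swift API: iterating a prepared statement yields one [Binding?] array per row, so selecting just the name column makes the cast straightforward.

import SQLite

let db = try Connection("path/to/db.sqlite3")

// Each iterated row is [Binding?] in column order; select only the name
// column so row[0] is the table's name.
var tableNames = [String]()
for row in try db.prepare("SELECT name FROM sqlite_master WHERE type = 'table'") {
    if let name = row[0] as? String {
        tableNames.append(name)
    }
}

// From here, Table(tableNames[0]) would give a query reference to work with.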

Related

How to use pyodbc to migrate tables from MS Access to Postgres?

I need to migrate tables from MS Access to Postgres. I'd like to use pyodbc to do this, as it allows me to connect to the Access database using Python and query the data.
The problem I have is that I'm not exactly sure how to programmatically create a table with the same schema, other than by building a SQL statement with string formatting. pyodbc provides the ability to list all of the fields, field types and field lengths, so I can create a long SQL statement with all of the relevant information; however, how can I do this for a bunch of tables? Would I need to build SQL string statements for each table?
import pyodbc

access_conn_str = (r'DRIVER={Microsoft Access Driver (*.mdb, *.accdb)}; '
                   r'DBQ=C:\Users\bob\access_database.accdb;')
access_conn = pyodbc.connect(access_conn_str)
access_cursor = access_conn.cursor()

postgres_conn_str = ("DRIVER={PostgreSQL Unicode};"
                     "DATABASE=access_database;"
                     "UID=user;"
                     "PWD=password;"
                     "SERVER=localhost;"
                     "PORT=5433;")
postgres_conn = pyodbc.connect(postgres_conn_str)
postgres_cursor = postgres_conn.cursor()

table_dict = {}
row_dict = {}
for row in access_cursor.columns(table='table1'):
    row_dict[row.column_name] = [row.type_name, row.column_size]
table_dict['table1'] = row_dict

for table, values in table_dict.items():
    print(f"Creating table for {table}")
    access_cursor.execute(f'SELECT * FROM {table}')
    result = access_cursor.fetchall()
    postgres_cursor.execute(f'''CREATE TABLE {table} (Do I just put a bunch of string formatting in here?);''')
    postgres_cursor.executemany(f'INSERT INTO {table} (Do I just put a bunch of string formatting) VALUES (string formatting?)', result)
    postgres_conn.commit()
As you can see, with pyodbc I'm not exactly sure how to build the SQL statements. I know I could build a long string by hand, but if I were doing a bunch of different tables with different fields, etc., that would not be realistic. Is there a better, easier way to create the tables and insert rows based off of the schema of the Access database?
I ultimately ended up using a combination of pyodbc and pywin32. pywin32 is "basically a very thin wrapper of python that allows us to interact with COM objects and automate Windows applications with python" (quoted from the second link below).
I was able to programmatically interact with Access and export the tables directly to Postgres with DoCmd.TransferDatabase:
https://learn.microsoft.com/en-us/office/vba/api/access.docmd.transferdatabase
https://pbpython.com/windows-com.html
import win32com.client
import pyodbc
import logging
from pathlib import Path

# access_database_location, pg_user, pg_pwd and pg_port are assumed to be
# defined elsewhere in the script
conn_str = (r'DRIVER={Microsoft Access Driver (*.mdb, *.accdb)}; '
            rf'DBQ={access_database_location};')
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()

a = win32com.client.Dispatch("Access.Application")
a.OpenCurrentDatabase(access_database_location)

# Collect the names of all regular tables in the Access database
table_list = []
for table_info in cursor.tables(tableType='TABLE'):
    table_list.append(table_info.table_name)

acExport = 1  # acDataTransferType: export
acTable = 0   # acObjectType: table
db_name = Path(access_database_location).stem.lower()

for table in table_list:
    logging.info(f"Exporting: {table}")
    a.DoCmd.TransferDatabase(
        acExport,
        "ODBC Database",
        "ODBC;DRIVER={PostgreSQL Unicode};"
        f"DATABASE={db_name};"
        f"UID={pg_user};"
        f"PWD={pg_pwd};"
        "SERVER=localhost;"
        f"PORT={pg_port};",
        acTable,
        f"{table}",
        f"{table.lower()}_export_from_access")
    logging.info(f"Finished Export of Table: {table}")
    logging.info("Creating empty table in EGDB based off of this")
This approach seems to be working for me. I like how the creation of the tables/fields, as well as the insertion of data, is all handled automatically (which was the original problem I was having with pyodbc).
If anyone has better approaches I'm open to suggestions.
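As an aside for anyone who still wants the pure-pyodbc route: a rough sketch of generating the DDL and inserts from what cursor.columns() reports might look like the following. The Access-to-Postgres type map is a simplified assumption rather than a complete mapping, and copy_table is a hypothetical helper.

import pyodbc

# Simplified, assumed mapping from Access ODBC type names to Postgres types;
# a real migration would need to cover more types (COUNTER, CURRENCY, etc.).
TYPE_MAP = {
    "VARCHAR": "varchar",
    "LONGCHAR": "text",
    "INTEGER": "integer",
    "DOUBLE": "double precision",
    "DATETIME": "timestamp",
    "BIT": "boolean",
}

def copy_table(access_cursor, postgres_cursor, table):
    # Build the column definitions from the schema pyodbc reports
    columns = list(access_cursor.columns(table=table))
    col_defs = []
    for col in columns:
        pg_type = TYPE_MAP.get(col.type_name.upper(), "text")
        if pg_type == "varchar":
            pg_type = f"varchar({col.column_size})"
        col_defs.append(f'"{col.column_name}" {pg_type}')
    postgres_cursor.execute(f'CREATE TABLE "{table}" ({", ".join(col_defs)})')

    # Copy the rows with a parameterized INSERT instead of string formatting
    access_cursor.execute(f'SELECT * FROM [{table}]')
    rows = access_cursor.fetchall()
    if rows:
        placeholders = ", ".join("?" for _ in columns)
        postgres_cursor.executemany(
            f'INSERT INTO "{table}" VALUES ({placeholders})', rows)

# Usage, reusing the cursors and table_list from the snippets above:
# for t in table_list:
#     copy_table(access_cursor, postgres_cursor, t)
# postgres_conn.commit()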

Is there a SQLite for Swift method for more complex ORDER BY statements?

I have a query similar to the following that I'd like to perform on an SQLite database:
SELECT name
FROM table
WHERE name LIKE '%John%'
ORDER BY (CASE WHEN name = 'John' THEN 1
               WHEN name LIKE 'John%' THEN 2
               ELSE 3 END),
         name
LIMIT 10
I'd like to use SQLite for Swift to chain the query together, but I'm stumped as to how to (or if it is even possible to) use the .order method.
let name = "John"
let filter = "%" + name + "%"
table.select(nameCOL).filter(nameCOL.like(filter)).order(nameCOL)
Gets me
SELECT name
FROM table
WHERE name LIKE '%John%'
ORDER BY name
Any ideas on how to extend the query to get the more advanced sorting, where names that start with John come first, followed by names with John in them?
I saw the SQLite portion of the solution here: SQLite LIKE & ORDER BY Match query
Now I'd just like to implement it using SQLite for Swift.
It seems it may be too restrictive for that, given the limited examples; does anyone else have experience with more complicated ORDER BY clauses?
Thanks very much.
SQLite.swift can handle that pretty cleanly by passing two order expressions:
let name = "John"
let filter = "%" + name + "%"
table.select(nameCol).filter(nameCol.like(filter))
     .order(nameCol.like("\(name)%").desc, nameCol)
The order expressions are applied in the order they are listed, the first being primary.
The filter will reduce the results to only those with "John" in them. SQLite.swift can do many complex things; I thought I would need a great deal of raw SQL when I ported hundreds of complex queries over to it, but I have yet to use raw SQL.
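And for the full ranking from the question (exact match first, then prefix matches, then everything else) plus the LIMIT, a possible sketch along the same lines, untested and with the table and column names assumed from the question:

import SQLite

// Boolean expressions sort false-before-true, so .desc puts matching rows
// first; the two boolean keys reproduce the CASE ranking from the question.
let table = Table("table")
let nameCol = Expression<String>("name")
let name = "John"

let query = table
    .select(nameCol)
    .filter(nameCol.like("%\(name)%"))
    .order((nameCol == name).desc,         // exact matches first
           nameCol.like("\(name)%").desc,  // then prefix matches
           nameCol)                        // then alphabetically
    .limit(10)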

Using bookshelf.js to query JSON column in a PostgreSQL database

I have a column named email_subscription_prefs in the users table of my Postgres DB, which stores some information in JSON format. It has an array_length of 1.
Sample Data:
[{"marketing":true,"transactional":true,"collaboration":true,"legal":true,"account":true}]
Issue:
I am trying to use the bookshelf.js ORM to query and search all records in this table based on the value of the marketing key, specifically when its value is true.
Here is an edited snippet of my code showing how I'm trying to implement this query using bookshelf.js:
return new User()
  .query(function(qb) {
    qb.where(function() {
      this.where('domicile', 'USA').orWhere('domicile', null)
    })
    qb.whereRaw('cast(email_subscription_prefs->>? as boolean) = ?', ['marketing', true])
    qb.limit(100)
  })
Can someone tell me what I'm doing wrong in the qb.whereRaw statement where I'm trying to query the JSON column email_subscription_prefs?
The code returns nothing, although there are several thousand records in the users table.
Thanks in advance.
You seem to have an array of objects in your sample data instead of a single JSON object:
[
  {
    "marketing": true,
    "transactional": true,
    "collaboration": true,
    "legal": true,
    "account": true
  }
]
So it looks like you are trying to reference email_subscription_prefs->>'marketing', which is not found in the array.
To fetch the marketing attribute of the first item in the array, you should do:
email_subscription_prefs->0->>'marketing'
If that is not the problem, then you would need to add some example data from your DB to be able to tell what the problem is. Your current description doesn't describe the queried table well enough.
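Putting that together with the snippet from the question, the corrected bookshelf.js call might look like this (a sketch; the User model and the surrounding query are assumed from the question):

return new User()
  .query(function(qb) {
    qb.where(function() {
      this.where('domicile', 'USA').orWhere('domicile', null)
    })
    // ->0 selects the first element of the JSON array, ->> then extracts
    // the attribute as text so it can be cast to boolean
    qb.whereRaw('cast(email_subscription_prefs->0->>? as boolean) = ?', ['marketing', true])
    qb.limit(100)
  })
  .fetchAll()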

FsSql Not working when Parameterizing Columns

Using F#, FsSql and Postgres.
So I'm using this function
open System

let getSqlParameter value =
    let uniqueKey = Guid.NewGuid().ToString("N")
    let key = sprintf "#%s" uniqueKey
    (key, Sql.Parameter.make(key, value))
to get me a parameter for anything I pass in dynamically,
which I then append to a query, and I get something like this:
select * from (select * from mytable) as innerQuery where #a29c575b69bb4629a9971dac2808b445 LIKE '%#9e3485fdf99249e5ad6adb6405f5f5ca%'
Then I take a collection of these and pass them off:
Sql.asyncExecReader connectionManager query parameters
The problem I'm having is that when I don't run this through my parameterization engine, it works fine. When I do, it doesn't work; it just returns empty sets.
The only thing I can think of is that the column names can't be parameterized. This is a problem because they're coming from the client. Is there a way to do this?
Okay, so the answer here is that you can't parameterize column names, as far as I can tell.
What I ended up doing was creating a whitelist of acceptable column names and then comparing what was coming in against that whitelist. If a column isn't on the list, I drop it.
A sub-optimal solution by far; I really wish there were a way to do this.
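For illustration, a minimal sketch of that whitelist approach (the column names here are placeholders):

// Only column names on the whitelist are allowed into the generated SQL;
// anything else coming from the client is silently dropped.
let allowedColumns = Set.ofList [ "first_name"; "last_name"; "email" ]

let sanitizeColumns (requested: string list) =
    requested |> List.filter (fun col -> Set.contains col allowedColumns)

// Example: the injection attempt is filtered out, leaving ["first_name"]
let safe = sanitizeColumns [ "first_name"; "name; drop table users; --" ]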

How to get query.whereKey(key, containedIn: [Array]) to work with relation.query

I am querying both the local datastore and the server for PFObjects. To save mobile data and network usage, the data is first looked up in the local datastore, and then whatever has not been found is looked up on the server.
The code to figure out which PFObjects have not been found yet is:
let response = objects as! [PFObject]
var responseObjectIds = [String]()
for x in response {
    responseObjectIds.append(x.objectId!)
}
query.whereKey("objectId", notContainedIn: responseObjectIds)
This seems to work fine with normal queries, but breaks down when trying to do the same thing with queries created from Relations.
I think I read somewhere that the whereKey method implementations are slightly different for Relation queries, but I don't think it is very well documented.
Any help improving the code or suggesting new solutions would be greatly appreciated.
I believe the query on a relational column will be expecting PFObjects, not strings (in this case, an array of strings).
You will need something like the following:
let pointers = response.map {
    PFObject(withoutDataWithClassName: "yourClassName", objectId: $0.objectId)
}
query.whereKey("objectId", notContainedIn: pointers)
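As a usage sketch, assuming a hypothetical "friends" relation on the current user and a related class named "Friend" (both placeholders), this mirrors the suggestion above rather than documented Parse behavior:

import Parse

// Query derived from the relation; "friends" and "Friend" are placeholder
// names, and response comes from the question's snippet.
let relationQuery = PFUser.current()!.relation(forKey: "friends").query()
let pointers = response.map {
    PFObject(withoutDataWithClassName: "Friend", objectId: $0.objectId)
}
relationQuery.whereKey("objectId", notContainedIn: pointers)
relationQuery.findObjectsInBackground { objects, error in
    // objects should now exclude everything already found in the local store
}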