Rust: Convert SQL types to Rust automatically using sqlx - postgresql

I'm new to Rust and am working on a generic database viewer app using sqlx. Now I've stumbled upon a problem. When running a simple query like
SELECT * FROM TABLE_NAME
I don't know the types of the rows and columns beforehand. So how should I parse the query results?
I've tried the following, taken from this post:
for r in row {
    let mut row_result: Vec<String> = Vec::new();
    for col in r.columns() {
        let value = r.try_get_raw(col.ordinal()).unwrap();
        let value = match value.is_null() {
            true => "NULL".to_string(),
            false => {
                let mat = value.as_str();
                match mat {
                    Ok(m) => m.to_string(),
                    Err(err) => {
                        dbg!(err);
                        "ERROR".to_string()
                    }
                }
            }
        };
        // println!("VALUE-- {:?}", value);
        row_result.push(value);
    }
    result.push(row_result);
}
The problem is that for some columns it returns values like this (for example, for the ID columns):
"\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0004"
and for others I'm getting the following error from the dbg! macro:
Utf8Error {
    valid_up_to: 2,
    error_len: Some(
        1,
    ),
}
Can anyone help me here?
BTW, I'm using Postgres, so all rows are of type PgRow.
Follow-up: I was able to get the column types from information_schema, but those come back as strings, and I couldn't find any way to map them to Rust types, e.g. INT8 -> i64, TEXT -> String, and so on.
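One way around this, sketched below, is to skip information_schema entirely and dispatch on the type name sqlx already attaches to each column (col.type_info().name() returns names like "INT8" or "TEXT" for Postgres), decoding with a concrete Rust type for each name. This is a minimal sketch, assuming a recent sqlx with the postgres feature enabled; the function name and the short list of handled type names are mine and would need extending (NUMERIC, TIMESTAMPTZ, UUID, ...) for a real schema.

use sqlx::postgres::PgRow;
use sqlx::{Column, Row, TypeInfo, ValueRef};

// Sketch: render one PgRow as strings by matching on each column's
// Postgres type name and decoding with the corresponding Rust type.
fn row_to_strings(row: &PgRow) -> Vec<String> {
    row.columns()
        .iter()
        .map(|col| {
            let i = col.ordinal();
            // Handle NULLs up front so the typed decodes below stay simple.
            if let Ok(raw) = row.try_get_raw(i) {
                if raw.is_null() {
                    return "NULL".to_string();
                }
            }
            match col.type_info().name() {
                "INT2" => row.try_get::<i16, _>(i).map(|v| v.to_string()),
                "INT4" => row.try_get::<i32, _>(i).map(|v| v.to_string()),
                "INT8" => row.try_get::<i64, _>(i).map(|v| v.to_string()),
                "FLOAT4" => row.try_get::<f32, _>(i).map(|v| v.to_string()),
                "FLOAT8" => row.try_get::<f64, _>(i).map(|v| v.to_string()),
                "BOOL" => row.try_get::<bool, _>(i).map(|v| v.to_string()),
                "TEXT" | "VARCHAR" => row.try_get::<String, _>(i),
                other => Ok(format!("<unhandled type: {other}>")),
            }
            .unwrap_or_else(|e| format!("ERROR: {e}"))
        })
        .collect()
}

This also lines up with the output above: try_get_raw hands back the raw Postgres binary encoding, so an INT8 value of 4 shows up as eight bytes ("\u0000...\u0004"), and non-text columns fail the UTF-8 check in as_str().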

Related

Problems working with rust and postgres data types

I'm trying to build a REST API with Rust and Postgres, but I can't make it work because of the interaction between the two.
The actual problem is that I have a jsonb column in Postgres, and when I return the data and try to put it in a struct it always gives an error. The same problem happens when I try to save the data.
These are the models. (The Option is only there because I'm testing things; it should return a value.)
#[derive(Debug, Serialize, Deserialize)]
pub struct CategoryView {
    pub id: i32,
    pub category_name: String,
    pub category_custom_fields: Option<serde_json::Value>,
}

#[derive(Debug, Serialize, Deserialize)]
pub struct CategoryPayload {
    pub category_name: String,
    pub category_custom_fields: Option<serde_json::Value>,
}
These are the Postgres queries:
fn find_all(conn: &mut DbPooled) -> Result<Vec<CategoryView>, DbError> {
    let mut query = "SELECT id, category_name, category_custom_fields FROM accounting.categories".to_owned();
    query.push_str(" WHERE user_id = $1");
    query.push_str(" AND is_deleted = false");
    let items = conn.query(&query, &[&unsafe { CURRENT_USER.to_owned() }])?;
    let items_view: Vec<CategoryView> = items
        .iter()
        .map(|h| CategoryView {
            id: h.get("id"),
            category_name: h.get("category_name"),
            category_custom_fields: h.get("category_custom_fields"),
        })
        .collect();
    Ok(items_view)
}
fn add(payload: &CategoryPayload, conn: &mut DbPooled) -> Result<CategoryView, DbError> {
    let mut query =
        "INSERT INTO accounting.categories (user_id, category_name, category_custom_fields, create_date, update_date)"
            .to_owned();
    query.push_str(" VALUES ($1, $2, $3, now(), now())");
    query.push_str(" RETURNING id");
    let item_id = conn
        .query_one(
            &query,
            &[
                &unsafe { CURRENT_USER.to_owned() },
                &payload.category_name,
                &payload.category_custom_fields,
            ],
        )?
        .get(0);
    let inserted_item = CategoryView {
        id: item_id,
        category_name: payload.category_name.to_string(),
        category_custom_fields: payload.category_custom_fields.clone(),
    };
    Ok(inserted_item)
}
The same happens with update, but I think the solution will be the same as for the add function.
The error is:
the trait bound `serde_json::Value: ToSql` is not satisfied
the following other types implement trait `ToSql`:
&'a T
&'a [T]
&'a [u8]
&'a str
Box<[T]>
Box<str>
Cow<'a, str>
HashMap<std::string::String, std::option::Option<std::string::String>, H>
and 17 others
required for `std::option::Option<serde_json::Value>` to implement `ToSql`
required for the cast from `std::option::Option<serde_json::Value>` to the object type `dyn ToSql + Sync`
From what I've read, serde_json::Value is the equivalent of jsonb, so I don't understand it.
I had a similar problem previously when trying to work with a decimal value in Postgres; I had to change it to an integer and store the value multiplied in the database. It is a money column, so if you help me with that too, I will change it back.
I was hoping someone could explain how to fix this and why it happens, so I can avoid having to ask for help with data types in the future.
The problem was in the dependencies.
It looks like some dependencies have features that add additional functionality.
I had installed the dependency without any features, so when I added the features it started to work without issues.
I only had to change from:
[dependencies]
postgres = "0.19.4"
to:
[dependencies]
postgres = { version = "0.19.4", features = ["with-chrono-0_4", "with-serde_json-1"] }
Chrono for dates and serde_json for jsonb.
I'll check the decimal problem, but I think it will be the same kind of solution.
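For reference, here is a minimal sketch of the kind of bind that compiles once with-serde_json-1 is enabled; the connection string, table, and column names below are made up for the example. For the money/decimal column, one commonly used route is the rust_decimal crate together with its Postgres integration feature, but that part is an assumption on my side, not something verified here.

use postgres::{Client, NoTls};
use serde_json::json;

fn main() -> Result<(), postgres::Error> {
    // Hypothetical connection string and table, purely to demonstrate the bind.
    let mut client = Client::connect("host=localhost user=postgres dbname=test", NoTls)?;

    // With features = ["with-serde_json-1"], serde_json::Value and
    // Option<serde_json::Value> implement ToSql/FromSql, so a jsonb bind works.
    let custom_fields: Option<serde_json::Value> = Some(json!({ "color": "red" }));

    let row = client.query_one(
        "INSERT INTO categories (category_name, category_custom_fields) \
         VALUES ($1, $2) RETURNING id, category_custom_fields",
        &[&"groceries", &custom_fields],
    )?;

    let id: i32 = row.get("id");
    let fields: Option<serde_json::Value> = row.get("category_custom_fields");
    println!("inserted {id}: {fields:?}");
    Ok(())
}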

Updating multiple rows using instr

I am trying to make the following call:
UPDATE MyTable SET path = ? WHERE instr(title, ?) AND start - ? < 60
However I have not been able to use instr with GRDB.
_ = try dbQueue?.write { db in
    try MyTable
        .filter(Column("start") > date - 60)
        .filter(title.contains(Column("title")))
        .updateAll(db,
            Column("path").set(to: path)
        )
}
How can I do this correctly? Could I also run a raw query instead? And if so, how can I fill in the ? placeholders with my variables?
GRDB does not ship with built-in support for the instr function. You can define it in your code:
func instr(_ lhs: some SQLExpressible, _ rhs: some SQLExpressible) -> SQLExpression {
    SQL("INSTR(\(lhs), \(rhs))").sqlExpression
}
// SELECT * FROM myTable WHERE instr(?, title)
let title: String = ...
let request = MyTable.filter(instr(title, Column("title")))
// UPDATE myTable SET path = ? WHERE instr(?, title)
let path: String = ...
try request.updateAll(db, Column("path").set(to: path))
See the "How do I print a request as SQL?" FAQ in order to check the SQL generated by GRDB.
Here is how I solved it with raw SQL, in case extending the framework is too complicated.
I chose this approach because I think it is easier to understand for someone who has to read the code and has no experience with GRDB or frameworks in general.
do {
    let dbQueue: DatabaseQueue? = try DatabaseQueue(path: "PATH_TO_DATABASE")
    try dbQueue?.write { db in
        try db.execute(
            sql: "UPDATE MyTable SET path = :path WHERE instr(title, :title)",
            arguments: ["path": path, "title": title]
        )
    }
} catch {
    print("UPDATE MyTable \(error)")
}

Swift - Reducing a Dictionary of Arrays to a single array of same type using map/reduce/flatmap

Given an input like so:
let m = ["one": ["1","2","3"],
"two":["4","5"]
]
how can I use map/reduce to produce the output like:
["1","2","3","4","5"]
I'm not very Swifty and am still learning, but I can't seem to figure out an efficient way to do this simple operation. My verbose approach would be like so:
var d = [String]()
for (key, value) in m {
    value.forEach { (s) in
        d.append(s)
    }
}
print(d)
I'm sure this can be a one-liner; could someone assist?
All you need is a single flatMap:
let result = m.flatMap { _, values in values }
Since dictionaries are unordered, sort the result if you need a stable order.

How to use a function or method on a Spark data frame column for transformation using Scala

I have created a function in Scala equivalent to Oracle's DECODE function. I want to use the function on Spark DataFrame columns. I have tried it, but I'm getting multiple issues with data type mismatches.
I do not want to create a UDF for each program. I want to create something generic and reuse it multiple times.
Function:
def ODECODE(column: Any, Param: Any*): Any = {
  var index = 0
  while (index < Param.length) {
    var P = Param(index)
    var Q = column
    if (P.equals(Q))
      return Param(index + 1)
    else index = index + 1
  }
  return Param(Param.length - 1)
}
I want to use it something like this:
Assuming "Emp" is a DataFrame containing data from an employee table with columns (First Name, Last Name, Grade):
Emp.select(ODECODE("grade", "A", 1, "B", 2, "C", 3, "FAIL")).show()
This is one example. The data type of the grade column can be String or Integer, so I have declared the decode function's parameters (above) as Any, but with DataFrames it does not perform the transformation; it gives data type mismatches.
I want to create individual functions/methods for some of the unsupported Oracle functions and reuse them wherever required in my transformations, so any suggestion to make this work is appreciated.
I know this is late, but I actually needed this and found your example. I was able to implement it with a few changes. I am no expert, though; there may be a better way of doing this.
import util.control.Breaks._

def ODECODE[T](column: String, params: T*): String = {
  try {
    var index = 0
    breakable {
      while (index < params.length) {
        val P = params(index)
        val Q = column
        if (P.equals(Q)) {
          break
        }
        index += 1
      }
    }
    params(index - 1).toString
  } catch {
    case ife: Exception =>
      ife.printStackTrace()
      "0"
  }
}

println(ODECODE("TEST", 0, "TEgST", 8, "***", 0))

How to insert a multidimensional array

I want to save data in the following format:
{"_ibj_id":"1","url_id":'1',"url":{"0":"http://0.com","1":"http:://1.com"}}
Here is my code:
type db_list struct {
    Url_id int
    Url    map[int]string
}

func list(table *mgo.Collection) {
    var doc *goquery.Document
    var e error
    for i := 1628644; i > 1628643; i-- {
        if doc, e = goquery.NewDocument("http://www.120ask.com/list/all/" + strconv.Itoa(i)); e != nil {
            panic(e.Error())
        }
        var save_list db_list
        save_list.Url_id = i
        save_list.Url = make(map[int]string)
        //fmt.Println("%s", doc.Text())
        doc.Find(".q-quename").Each(func(n int, s *goquery.Selection) {
            href, isTrue := s.Attr("href")
            if isTrue {
                save_list.Url[n] = href
                fmt.Printf("%d : %s\n", n, save_list.Url[n])
            }
        })
        fmt.Printf("%d\n", len(save_list.Url))
        // save to database
        table.Insert(save_list)
    }
}
What the database eventually saves is shown in the attached picture: that is the format of the stored data, with the URL values saved under numeric properties like 1.
You're probably after the JSON Unmarshal function in encoding/json.
{"_ibj_id":"1","url_id":'1',"url":{"0":"http://0.com","1":"http:://1.com"}} is technically invalid JSON due to the single quotes around the url_id value ('1' should be "1"), but other than that, it should map nicely to the following struct:
type item struct {
    ID    string   `json:"_ibj_id"`
    URLID string   `json:"url_id"`
    URLs  []string `json:"url"`
}
But you may need to experiment with the types. According to the docs for the Unmarshal function, it will use the following Go types for each JSON type:
bool, for JSON booleans
float64, for JSON numbers
string, for JSON strings
[]interface{}, for JSON arrays
map[string]interface{}, for JSON objects
nil for JSON null
I'd highly recommend reading Andrew Gerrand's blog post "JSON and Go".
It's unclear to me exactly what you are trying to do. One thing I notice, though, is that in your desired format the keys for Url are strings, whereas in your Go struct they are integers. You could try changing the type of Url to map[string]string and see if that solves your problem.