I am storing some tags as a jsonb datatype in PostgreSQL 13. This is the table DDL:
-- Drop table
-- DROP TABLE public.test;
CREATE TABLE public.test (
    id int8 NOT NULL GENERATED ALWAYS AS IDENTITY,
    tags jsonb NOT NULL,
    CONSTRAINT test_pkey PRIMARY KEY (id)
);
Now I want to query the jsonb column in Rust with diesel (diesel = { version = "1.4.7", features = ["postgres","serde_json"] }). This is main.rs:
#[macro_use]
extern crate diesel;
use diesel::pg::types::sql_types::Jsonb;
use diesel::{ExpressionMethods, QueryDsl, RunQueryDsl};
use rocket::serde::Deserialize;
use rocket::serde::Serialize;
use rust_wheel::config::db::config::establish_connection;
pub mod model;
fn main() {
    use crate::model::diesel::dict::dict_schema::test::dsl::*;
    let connection = rust_wheel::config::db::config::establish_connection();
    let predicate = crate::model::diesel::dict::dict_schema::test::tags.eq(serde_json::from_value("value".parse().unwrap()));
    let db_admin_user = test.filter(&predicate)
        .limit(1)
        .load::<Test>(&connection)
        .expect("query test failed");
}
#[derive(Queryable, Debug, Serialize, Deserialize, Default)]
pub struct Test {
    pub id: i64,
    pub tags: serde_json::Value,
}
and this is the schema file:
table! {
    test (id) {
        id -> Int8,
        tags -> Jsonb,
    }
}
When I compile the code, it shows an error like this:
error[E0277]: the trait bound `Result<_, serde_json::Error>: diesel::Expression` is not satisfied
--> src/main.rs:15:76
|
15 | let predicate = crate::model::diesel::dict::dict_schema::test::tags.eq(serde_json::from_value("value".parse().unwrap()));
| -- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `diesel::Expression` is not implemented for `Result<_, serde_json::Error>`
| |
| required by a bound introduced by this call
|
= note: required because of the requirements on the impl of `AsExpression<diesel::sql_types::Jsonb>` for `Result<_, serde_json::Error>`
What should I do to query the record using the jsonb data type?
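For reference, the note in the error is the key point: .eq() needs something that implements AsExpression<Jsonb>, and serde_json::from_value returns a Result rather than a serde_json::Value. A sketch of the same query built from a plain serde_json::Value (the json! content is an assumed placeholder for whatever tags are stored, and equality here compares the whole jsonb value):
use serde_json::json;

// Sketch only: pass a serde_json::Value directly to .eq(); diesel's serde_json
// feature implements AsExpression<Jsonb> for serde_json::Value, but not for the
// Result returned by serde_json::from_value.
let tag_value = json!({ "name": "value" }); // assumed shape of the stored tags
let db_admin_user = test
    .filter(tags.eq(tag_value))
    .limit(1)
    .load::<Test>(&connection)
    .expect("query test failed");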
Related
I'm creating my model using the Actix-Web framework and the sqlx library to make all SQL queries against PostgreSQL.
My problem is that when I query all the rows from a table, I get an error on the 'created_at' column.
The error I get is:
'optional feature time required for type TIMESTAMPTZ of column #4 ("created_at")'
My attempts have been to change the table creation and the model declaration to avoid this error, with no luck. When I got rid of "created_at" and "updated_at" the error went away, so I know it has to do with those field declarations in particular.
table creation:
CREATE TABLE IF NOT EXISTS fields (
    "id" uuid PRIMARY KEY,
    "name" varchar NOT NULL,
    "address" varchar NOT NULL,
    "created_at" TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
    "updated_at" TIMESTAMP WITH TIME ZONE DEFAULT NOW()
);
I have also tried using TIMESTAMPTZ and that didn't work either.
// field_model.rs
use serde::{Deserialize, Serialize};
use sqlx::FromRow;
use uuid::Uuid;
#[derive(Debug, FromRow, Deserialize, Serialize)]
#[allow(non_snake_case)]
pub struct FieldModel {
    pub id: Uuid,
    pub name: String,
    pub address: String,
    pub published: Option<bool>,
    #[serde(rename = "createdAt")]
    pub created_at: Option<chrono::DateTime<chrono::Utc>>,
    #[serde(rename = "updatedAt")]
    pub updated_at: Option<chrono::DateTime<chrono::Utc>>,
}
And this is my route handler for the GET /fields endpoint:
// field_route.rs
#[get("/api/games")]
pub async fn get_games(opts: web::Query<FilterOptions>, data: web::Data<AppState>) -> impl Responder {
    let query_result = sqlx::query_as!(
        FieldModel,
        "SELECT * FROM fields",
    )
    .fetch_all(&data.db)
    .await;
    if query_result.is_err() {
        let message = "Something bad happened while fetching all not items";
        return HttpResponse::InternalServerError()
            .json(json!({"status": "error", "message": message}));
    }
    let fields = query_result.unwrap();
    let json_response = serde_json::json!({
        "status": "success",
        "results": fields.len(),
        "fields": fields
    });
    HttpResponse::Ok().json(json_response)
}
This is my Cargo.toml in case you want to see the libraries.
[package]
name = "api_service"
version = "0.1.0"
edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
actix = "0.13.0"
actix-cors = "0.6.4"
actix-web = "4"
chrono = {version = "0.4.23", features = ["serde"]}
dotenv = "0.15.0"
env_logger = "0.10.0"
serde = { version = "1.0.145", features = ["derive"]}
serde_json = "1.0.86"
sqlx = {version = "0.6.2", features = ["runtime-async-std-native-tls", "postgres", "uuid"]}
uuid = { version = "1.2.2", features = ["serde", "v4"] }
Any help will be appreciated, thanks.
Turns out it was the sqlx library that was giving me trouble.
I just had to enable the time functionality that comes with chrono, but as a feature of the sqlx crate.
Cargo.toml dependency then looks like this:
sqlx = { version = "0.6.2", features = ["runtime-async-std-native-tls", "postgres", "uuid", "chrono"] }
After that the problem is solved and I can run my api properly.
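With that feature enabled, sqlx decodes TIMESTAMPTZ columns into chrono::DateTime<Utc>, so the model above maps cleanly. A sketch using the runtime query_as (the load_fields helper is hypothetical, and it assumes the fields table actually has a column for every FieldModel field, including published):
use sqlx::postgres::PgPool;

// Sketch only: with sqlx's "chrono" feature, TIMESTAMPTZ decodes into
// chrono::DateTime<Utc>, matching the Option<DateTime<Utc>> fields in FieldModel.
async fn load_fields(pool: &PgPool) -> Result<Vec<FieldModel>, sqlx::Error> {
    sqlx::query_as::<_, FieldModel>("SELECT * FROM fields")
        .fetch_all(pool)
        .await
}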
I'm working on a store management API with Rust, actix-web and diesel with Postgres, and I have an issue with the mapping of the result of my sql_query.
I have the following structs representing my tables:
#[derive(Identifiable, Queryable, Serialize, Deserialize, Debug, Clone, QueryableByName)]
#[diesel(table_name = stores)]
pub struct Store {
    pub id: i32,
    pub name: String,
    pub is_holiday: bool,
    pub created_at: NaiveDateTime,
}

#[derive(Identifiable, Queryable, Validate, Associations, Serialize, Deserialize, Debug, Clone, QueryableByName)]
#[diesel(table_name = products, belongs_to(Store))]
pub struct Product {
    pub id: i32,
    pub name: String,
    pub i18n_name: Option<String>,
    pub price: BigDecimal,
    pub description: Option<String>,
    pub i18n_description: Option<String>,
    pub created_at: NaiveDateTime,
    pub store_id: Option<i32>,
}
diesel::table! {
    products (id) {
        id -> Int4,
        name -> Varchar,
        i18n_name -> Nullable<Varchar>,
        price -> Numeric,
        description -> Nullable<Text>,
        i18n_description -> Nullable<Text>,
        created_at -> Timestamp,
        store_id -> Nullable<Int4>,
    }
}

diesel::table! {
    stores (id) {
        id -> Int4,
        name -> Varchar,
        is_holiday -> Bool,
        created_at -> Timestamp,
    }
}
and the following function (I'll just put the relevant code snippet because the function is rather long):
fn get_many() {
    ...
    let mut db_query_one = String::from("SELECT distinct p.id, p.name, p.i18n_name, p.description, p.i18n_description, p.price, p.store_id, p.created_at, s.id, s.created_at, s.is_holiday, s.name from products p left join stores s on s.id = p.store_id");
    let db_query_two = format!(" left join products_categories pc on pc.product_id = p.id WHERE p.name ILIKE $1 OR p.description ILIKE $2 ORDER BY p.{} LIMIT $3 OFFSET $4", order.stringify());
    db_query_one.push_str(&db_query_two);
    let res = sql_query(db_query_one)
        .bind::<Text, _>(search.get_name())
        .bind::<Text, _>(search.get_description())
        .bind::<Integer, _>(pagination.get_per_page())
        .bind::<Integer, _>((pagination.get_page() - 1) * pagination.get_per_page())
        .load::<(Product, Option<Store>)>(&mut conn);
    ...
}
I expect this function to return a vector of tuples of products and the store each product is attached to, if any.
I almost get that, except that when a product is attached to a store, the store's id, name and created_at fields get mapped to the products table's respective id, name and created_at columns. How do I fix that and make the stores table columns map to the Store struct?
I'm sorry if I missed anything important; I'm relatively new to Rust.
For diesel::sql_query, diesel loads fields by column name, which implies that each column name in the result must be unique. You can assign such a unique column name in your query by using expr AS your_name. You then need to apply that column name in your type via the #[diesel(column_name = "your_name")] attribute on the corresponding field as well.
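A minimal sketch of what that could look like, assuming diesel 2.x with the chrono feature (the s_* aliases and the ProductWithStore struct are hypothetical names, and explicit sql_type annotations are used so the aliased columns resolve without a table! definition):
use diesel::prelude::*;
use diesel::sql_types::{Bool, Integer, Nullable, Text, Timestamp};

// Hypothetical row type for the joined query: every selected column gets a unique
// name, and QueryableByName matches struct fields against those names.
#[derive(QueryableByName, Debug)]
pub struct ProductWithStore {
    #[diesel(sql_type = Integer)]
    pub id: i32,
    #[diesel(sql_type = Text)]
    pub name: String,
    #[diesel(sql_type = Nullable<Integer>)]
    pub store_id: Option<i32>,
    // Columns from stores, renamed in the SQL so they no longer collide with products.
    #[diesel(sql_type = Nullable<Integer>)]
    #[diesel(column_name = s_id)]
    pub store_pk: Option<i32>,
    #[diesel(sql_type = Nullable<Text>)]
    #[diesel(column_name = s_name)]
    pub store_name: Option<String>,
    #[diesel(sql_type = Nullable<Timestamp>)]
    #[diesel(column_name = s_created_at)]
    pub store_created_at: Option<chrono::NaiveDateTime>,
    #[diesel(sql_type = Nullable<Bool>)]
    #[diesel(column_name = s_is_holiday)]
    pub store_is_holiday: Option<bool>,
}

// The raw query then aliases the stores columns to match, e.g.:
//   SELECT p.id, p.name, p.store_id,
//          s.id AS s_id, s.name AS s_name,
//          s.created_at AS s_created_at, s.is_holiday AS s_is_holiday
//   FROM products p LEFT JOIN stores s ON s.id = p.store_id
// and the result is loaded with .load::<ProductWithStore>(&mut conn)
// instead of .load::<(Product, Option<Store>)>(&mut conn).
The remaining product columns (price, the descriptions, and so on) would be added the same way; the key point is that every selected column ends up with a unique name that exactly one struct field maps to.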
I am trying to get the average value of a column using Rust diesel but I am stuck on a type error.
Error:
the trait bound `f64: FromSql<Numeric, _>` is not satisfied
the following other types implement trait `FromSql<A, DB>`:
<f32 as FromSql<diesel::sql_types::Float, DB>>
<f64 as FromSql<Double, DB>>
<i16 as FromSql<SmallInt, DB>>
<i32 as FromSql<Integer, DB>>
<i64 as FromSql<BigInt, DB>>
<u32 as FromSql<Oid, Pg>>
required because of the requirements on the impl of `diesel::Queryable<Numeric, _>` for `f64`
Code:
let new_avg: Option<f64> = fruits
    .select(avg(weight))
    .filter(fruit_name.eq(&fruit_name))
    .get_result::<Option<f64>>(&self.postgres.get().unwrap())
    .unwrap();
The problem seems to be that you are trying to map the Postgres type Numeric to f64 in Rust, and that conversion has no implementation.
I tried to reproduce your case, so I created a table like this:
CREATE TABLE fruits (
    id SERIAL PRIMARY KEY,
    value NUMERIC NOT NULL
);
for which diesel generated this in the schema file:
table! {
    fruits (id) {
        id -> Int4,
        value -> Numeric,
    }
}
and in models I created Fruit:
#[derive(Queryable, Debug)]
pub struct Fruit {
    pub id: i32,
    pub value: f64,
}
Right now when I try to run this:
let results = fruits.load::<Fruit>(&pg_connection).expect("");
I'm getting the same error as you, which we can solve in a few ways.
If you want to keep the f64 type in Rust, you can change the table creation so that value has type DOUBLE PRECISION; after running the diesel migration this will generate the Float8 type in the schema, which has the implementation mentioned in the error:
= help: the following implementations were found:
<f64 as FromSql<Double, DB>>
If you want to keep the Numeric type in the Postgres table, you can try using bigdecimal::BigDecimal or diesel::pg::data_types::PgNumeric as the type of value in the Fruit struct, since implementations exist for mapping Numeric to those types (see the sketch below).
If you want to keep both, you will probably have to implement the conversion on your own.
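A minimal sketch of the BigDecimal option mentioned above, assuming diesel's numeric feature is enabled and bigdecimal is added as a dependency (it modifies the Fruit struct from the reproduction):
use bigdecimal::BigDecimal;

// Sketch only: with diesel's "numeric" feature and the bigdecimal crate,
// a Postgres NUMERIC column can be deserialized into BigDecimal instead of f64.
#[derive(Queryable, Debug)]
pub struct Fruit {
    pub id: i32,
    pub value: BigDecimal,
}

// Loading then works against the Numeric schema column as-is:
//   let results = fruits.load::<Fruit>(&pg_connection).expect("failed to load fruits");
// The same applies to aggregates: avg() over a Numeric column can be read with
//   .get_result::<Option<BigDecimal>>(&pg_connection)
// and converted to an approximate f64 afterwards if needed.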
We have foreign keys within a json blob in postgres. We join with these like so:
SELECT f.id, b.id FROM foo AS f
LEFT JOIN bar AS b ON f.data -> 'baz' ->> 'barId' = text(b.id)
I'm now trying out Hasura to do some GraphQL queries and I need these as object relationships. In the UI I can only manually add relationships on normal columns, not on nested JSON data.
Is it at all possible to get a graphql relationship this way?
I got the answer in the comments, thanks @iamnat. I'll just expand on it here with my example for clarity, since I still struggled a bit.
Super simple schema and data:
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
CREATE TABLE foo
(
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    name text,
    data jsonb NOT NULL
);
CREATE TABLE bar
(
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    name text
);
WITH bars AS (
INSERT INTO bar (name) VALUES ('bar') RETURNING id
)
INSERT INTO foo (name, data) VALUES ('foo', jsonb_build_object('barId', (SELECT id FROM bars)));
I then can create a function for the relationship:
CREATE FUNCTION foo_bar(foo_row foo)
RETURNS SETOF bar AS $$
SELECT *
FROM bar
WHERE text(id) = foo_row.data ->> 'barId'
$$ LANGUAGE sql STABLE;
This I can then use in Hasura as a computed field under "Data" -> foo -> Modify -> Computed fields -> "Add a new computed field". Just give it a name and reference the function in the dropdown.
I can then query:
query MyQuery {
  foo {
    name
    foo_bar {
      name
    }
  }
}
with expected result:
{
  "data": {
    "foo": [
      {
        "name": "foo",
        "foo_bar": [
          {
            "name": "bar"
          }
        ]
      }
    ]
  }
}
My Package.swift file looks like:
.package(url: "https://github.com/vapor/vapor.git", from: "4.0.0-rc"),
.package(url: "https://github.com/vapor/fluent.git", from: "4.0.0-rc"),
.package(url: "https://github.com/vapor/fluent-postgres-driver.git", from: "2.0.0-rc")
My config.swift file looks like:
app.databases.use(.postgres(
    hostname: Environment.get("DATABASE_HOST") ?? "localhost",
    username: Environment.get("DATABASE_USERNAME") ?? "postgres",
    password: Environment.get("DATABASE_PASSWORD") ?? "secret",
    database: Environment.get("DATABASE_NAME") ?? "mydb2"
), as: .psql)
My model looks like:
import Fluent
import Vapor
final class Complaint: Model, Content {
    static let schema = "cw_complaint5"
    @ID(key: .id)
    var id: UUID?
    @Field(key: "issue_description")
    var issue_description: String
    // etc.
    init() { }
}
In Xcode, the project builds OK, and this GET route runs OK:
http://localhost:8080/complaint
But when I run the POST route, I get an error in the response body:
{
    "error": true,
    "reason": "Value of type 'UUID' required for key 'id'."
}
In Vapor 3, the POST route worked fine for both inserting a new row (omit the ID in the JSON request body) and updating an existing row. The table description on PostgreSQL 11.3 looks like:
mydb2=# \d+ cw_complaint5
Table "public.cw_complaint5"
Column | Type | Collation | Nullable | Default | Storage | Stats target | Description
----------------------+-------------------------+-----------+----------+---------+----------+--------------+-------------
id | integer | | not null | | plain | |
row | smallint | | | | plain | |
document_number | integer | | not null | | plain | |
...etc...
county_id | integer | | | | plain | |
city_id | integer | | | | plain | |
operator_id | integer | | | | plain | |
Indexes:
"cw_complaint5_pkey" PRIMARY KEY, btree (id)
Check constraints:
"cw_complaint5_row_check" CHECK ("row" >= 0)
Foreign-key constraints:
"cw_complaint5_city_id_fkey" FOREIGN KEY (city_id) REFERENCES cw_city5(id)
"cw_complaint5_county_id_fkey" FOREIGN KEY (county_id) REFERENCES cw_county5(id)
"cw_complaint5_operator_id_fkey" FOREIGN KEY (operator_id) REFERENCES cw_operator5(id)
(END)
The request body of the failing POST request (update existing row) looks like:
{"id":729,"issue_description":"test","document_number":99,"api_state_code":88,"api_county_code":11,"section":22,"city_id":51,"county_id":56,"operator_id":4415}
Your issue here is that the Complaint model's ID type is UUID, but in your POST request body, you are passing in an Int value. Since you are migrating an existing project that already has a database defined with integer IDs, the best solution for this is to change your model's ID type.
Because Fluent 4 attempts to support all databases, by default it only supports UUID ID types. You can have a custom ID type, like this:
@ID(custom: "id", generatedBy: .database) var id: Int?
This will allow your model to work with your current database table structure and take in integer IDs in your POST request body.
Your model will look like this in the end:
final class Complaint: Model, Content {
    static let schema = "cw_complaint5"
    @ID(custom: "id", generatedBy: .database)
    var id: Int?
    @Field(key: "issue_description")
    var issue_description: String
    // etc.
    init() { }
}
Solution: make the id column a SERIAL in PostgreSQL:
mydb2=# ALTER TABLE cw_complaint5 DROP COLUMN id;
mydb2=# ALTER TABLE cw_complaint5 ADD COLUMN id SERIAL PRIMARY KEY;
Then, to insert a row (in Xcode 11.4, Vapor 4, fluent-postgres-driver), remove the id from your POST request JSON, e.g.:
{"section":1,"operator_id":4415 ... etc.}
To update an existing row, include the id in the json. The following in routes.swift handles both insert and update:
app.post("complaint") { req -> EventLoopFuture<Complaint> in
    let complaint = try req.content.decode(Complaint.self)
    if let id = complaint.id {
        print("60p \(id)") // Optional(6005)
    }
    return complaint.save(on: req.db)
        .map { complaint }
}