I need to query a function in PostgreSQL (14) using Diesel. I already have queries that work on tables and views. This works (it queries a view).
schema.rs:
table! {
    latest_readings {
        measurement_time_default -> Timestamptz,
        id -> Integer,
        data -> Jsonb,
    }
}
models.rs:
#[derive(Serialize, Queryable)]
pub struct LatestReading {
    #[diesel(deserialize_as = "MyDateTimeWrapper")]
    pub measurement_time_default: DateTime<Local>,
    pub id: i32,
    pub data: serde_json::Value,
}
controller.rs:
pub async fn get_readings(db: web::Data<Pool>) -> Result<HttpResponse, Error> {
    Ok(web::block(move || db_get_readings(db))
        .await
        .map(|reading| HttpResponse::Ok().json(reading))
        .map_err(|_| HttpResponse::InternalServerError())?)
}

fn db_get_readings(pool: web::Data<Pool>) -> Result<Vec<LatestReading>, diesel::result::Error> {
    let conn = pool.get().unwrap();
    latest_readings.load::<LatestReading>(&conn)
}
The following, which calls the PostgreSQL function, won't compile.
schema.rs:
table! {
    measurements_single_location_function {
        id -> Integer,
        name -> Text,
        latitude -> Numeric,
        longitude -> Numeric,
        measurement_time_default -> Timestamptz,
        measurements -> Jsonb,
    }
}
models.rs:
#[derive(Serialize, Queryable, QueryableByName)]
#[table_name = "measurements_single_location_function"]
pub struct MeasurementsSingleLocation {
    pub id: i32,
    pub name: String,
    pub latitude: BigDecimal,
    pub longitude: BigDecimal,
    #[diesel(deserialize_as = "MyDateTimeWrapper")]
    pub measurement_time_default: DateTime<Local>,
    pub measurements: serde_json::Value,
}
DB-query in controllers.rs:
fn db_get_measurements_single_location(
    pool: web::Data<Pool>,
    location_id: i32,
    rows: i32,
) -> QueryResult<Vec<MeasurementsSingleLocation>> {
    let conn = pool.get().unwrap(); // Error on next line
    let result: QueryResult<Vec<MeasurementsSingleLocation>> =
        sql_query("select * from measurements_single_location_function(1,10)")
            .load::<MeasurementsSingleLocation>(&conn);
    return result;
}
The compile error:
Compiling weather_rest v0.1.0 (/Users/claus/devel/rust/vegvesen/weather_rest)
error[E0277]: the trait bound `SqlQuery: LoadQuery<_, MeasurementsSingleLocation>` is not satisfied
--> src/controller.rs:141:14
|
141 | .load::<MeasurementsSingleLocation>(&conn);
| ^^^^ the trait `LoadQuery<_, MeasurementsSingleLocation>` is not implemented for `SqlQuery`
|
= help: the following implementations were found:
<SqlQuery as LoadQuery<Conn, T>>
note: required by a bound in `load`
--> /Users/claus/.cargo/registry/src/github.com-1ecc6299db9ec823/diesel-1.4.6/src/query_dsl/mod.rs:1238:15
|
1238 | Self: LoadQuery<Conn, U>,
| ^^^^^^^^^^^^^^^^^^ required by this bound in `load`
I am not able to see what I am missing here.
Cargo.toml:
diesel = { version = "1.4.6", features = ["postgres", "uuidv07", "r2d2", "chrono", "numeric", "serde_json"] }
I recently wrote a backend service for another project using this excellent example as a template. I applied the same structure here and it now compiles. Compared with the failing version: the struct derives QueryableByName but not Queryable (sql_query maps columns by name), the timestamp is read as a plain NaiveDateTime, the numeric columns come back as Text/String, and the function arguments are passed with .bind instead of being hard-coded in the SQL string.
schema.rs:
table! {
    measurements_single_location_function {
        id -> Integer,
        name -> Text,
        latitude -> Text,
        longitude -> Text,
        measurement_time_default -> Timestamptz,
        measurements -> Jsonb,
    }
}
routes.rs:
#[get("/measurements_single_location/{id}/{rows}")]
async fn measurements_single_location(path: web::Path<(i32, i32)>) -> Result<HttpResponse, CustomError> {
let (id, rows) = path.into_inner();
let m = MeasurementsSingleLocation::measurements_single_location(id, rows)?;
Ok(HttpResponse::Ok().json(m))
}
pub fn init_routes(config: &mut web::ServiceConfig) {
config.service(measurements_single_location);
}
models.rs:
#[derive(Serialize, QueryableByName)]
#[table_name = "measurements_single_location_function"]
pub struct MeasurementsSingleLocation {
    pub id: i32,
    pub name: String,
    pub latitude: String,
    pub longitude: String,
    pub measurement_time_default: NaiveDateTime,
    pub measurements: serde_json::Value,
}
impl MeasurementsSingleLocation {
    pub fn measurements_single_location(id: i32, rows: i32) -> Result<Vec<MeasurementsSingleLocation>, CustomError> {
        let q = "select * from measurements_single_location_function($1,$2)";
        let mut conn = db::connection()?;
        let m = diesel::sql_query(q)
            .bind::<Integer, _>(id)
            .bind::<Integer, _>(rows)
            .get_results(&mut conn)?;
        Ok(m)
    }
}
Is it possible to create more generic queries using Diesel?
I know how to write an update or delete for each individual property, but my point is to create a generic update that works for many structures.
My models:
#[derive(Associations, Identifiable, Queryable, PartialEq, Debug)]
#[diesel(belongs_to(User))]
#[diesel(table_name = books)]
pub struct Book {
    pub id: i32,
    pub user_id: i32,
    pub title: String,
    pub body: String,
    pub book_description: String,
    pub book_image: Option<String>,
    pub publish_day: chrono::NaiveDateTime,
}
#[derive(Associations, Identifiable, Queryable, PartialEq, Debug)]
#[diesel(belongs_to(Book))]
#[diesel(table_name = book_comments)]
pub struct BookComment {
    pub id: i32,
    pub book_id: i32,
    pub body: String,
    pub publish_day: chrono::NaiveDateTime,
}
My schema:
diesel::table! {
    books (id) {
        id -> Int4,
        user_id -> Int4,
        title -> Varchar,
        body -> Text,
        book_description -> Varchar,
        book_image -> Nullable<Varchar>,
        publish_day -> Timestamp,
    }
}
diesel::table! {
    book_comments (id) {
        id -> Int4,
        book_id -> Int4,
        body -> Varchar,
        publish_day -> Timestamp,
    }
}
Working version:
pub fn update(&mut self, book_id: &i32, title: &str) -> Book {
    let result = diesel::update(books::table)
        .filter(books::id.eq(book_id))
        .set(books::title.eq(title))
        .get_result::<Book>(&mut self.connection)
        .expect("Failed to update book");
    result
}
I'm trying something like this:
pub fn update<T, U, P: Table, S>(&mut self, find_by: &T, change: U, target_table: P, change_param: &S) -> Book {
    let result = diesel::update(target_table)
        .filter(find_by.eq(&book_id))
        .set(S.eq(change))
        .get_result::<U>(&mut self.connection)
        .expect("Failed to update book");
    result
}
But of course it doesn't work. Is this possible? A little help with a solution would be appreciated.
I did this using a macro; here is a snippet from the code I used:
#[macro_export]
macro_rules! get_by_id {
    ($table:ident, $m:ident, $pool:ident, $id:ident) => {{
        let conn = $pool.get()?;
        let res = $table::table
            .filter($table::columns::id.eq($id))
            .get_result::<$m>(&conn)?;
        Ok(res) as Result<$m, Error>
    }};
}
Edit: it can be used like this:
let user = get_by_id!(schema::users, User, pool, user_id)?;
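The same macro approach can be extended to the generic update the question asks about. A rough sketch, untested and following the same conventions as the get_by_id! macro above (the r2d2-style pool, the Error alias, and the extra column/value arguments are assumptions):

#[macro_export]
macro_rules! update_by_id {
    // $table is the schema module in scope, $m the model type,
    // $col the column to change and $val the new value.
    ($table:ident, $m:ident, $pool:ident, $id:ident, $col:ident, $val:expr) => {{
        let conn = $pool.get()?;
        let res = diesel::update($table::table.filter($table::columns::id.eq($id)))
            .set($table::columns::$col.eq($val))
            .get_result::<$m>(&conn)?;
        Ok(res) as Result<$m, Error>
    }};
}

Used, for example, as let book = update_by_id!(books, Book, pool, book_id, title, "New title")?; with the books schema module imported at the call site.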
I am trying to make a MongoDB Document plain in order to apply okapi's OpenAPI support, which does not allow ObjectId in a struct.
But I found that neither building a find option like
FindOptions::builder().lean().build();
nor
collection.find(None, None).lean().await?
works. How do I transform a MongoDB Document into a JsonSchema?
Example
Before
{
_id: ObjectId,
name: String
}
After
{
_id: String,
name: String
}
You can create a custom struct using ObjectId as a base and then implement JsonSchema for that custom struct:
use mongodb::bson::oid::ObjectId;
use schemars::JsonSchema;
use schemars::schema::Schema;
use schemars::schema::SchemaObject;
use schemars::gen::SchemaGenerator;
use serde::{Serialize, Deserialize};

#[derive(Debug, Serialize, Deserialize)]
pub struct ObjectID(ObjectId);

impl JsonSchema for ObjectID {
    fn schema_name() -> String {
        stringify!(String).to_owned()
    }

    fn json_schema(gen: &mut SchemaGenerator) -> Schema {
        // Describe the wrapped ObjectId as a plain string in the generated schema.
        let schema: SchemaObject = <String>::json_schema(gen).into();
        schema.into()
    }
}

#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct Customer {
    pub _id: ObjectID,
    pub name: String
}

fn main() {
    let c = Customer { _id: ObjectID(ObjectId::new()), name: String::from("John") };
    println!("{:?}", c);

    let serialized = serde_json::to_string(&c).unwrap();
    println!("serialized = {}", serialized);

    let deserialized: Customer = serde_json::from_str(&serialized).unwrap();
    println!("deserialized = {:?}", deserialized);
}
// Output
// Customer { _id: ObjectID(ObjectId("6210af6079e3adc888bef5af")), name: "John" }
// serialized = {"_id":{"$oid":"6210af6079e3adc888bef5af"},"name":"John"}
// deserialized = Customer { _id: ObjectID(ObjectId("6210af6079e3adc888bef5af")), name: "John" }
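As the output above shows, the derived Serialize still emits the extended-JSON form ({"$oid": "..."}). If you also want the serialized value to be a plain hex string, matching the "After" example in the question, one option is to drop Serialize from the wrapper's derive list and write it by hand; a sketch (deserialization would need a matching change if you round-trip):

use serde::Serializer;

impl Serialize for ObjectID {
    fn serialize<S: Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
        // Emit the 24-character hex form instead of {"$oid": "..."}.
        serializer.serialize_str(&self.0.to_hex())
    }
}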
I'm trying to encapsulate a PostgreSQL transaction and I'm running into a lifetime issue.
I understand the error message ("returns a value referencing data owned by the current function"), but I have no idea how I could keep my Transaction in my SQLConnection structure.
Here is the code:
use postgres::{Client, NoTls, Transaction};

pub struct SQLConnection<'a> {
    client: Client,
    transaction: Transaction<'a>,
}

impl<'a> SQLConnection<'a> {
    pub fn new(connect_string: &str) -> Self {
        let mut client = Client::connect(connect_string, NoTls).unwrap();
        let transaction = client.transaction().unwrap();
        Self {
            client,
            transaction,
        }
    }

    pub fn commit(&self) {
        let _ = self.transaction.commit();
    }

    pub fn rollback(&self) {
        let _ = self.transaction.rollback();
    }
}
You're returning a reference to a value you don't keep. You can't have a reference if there's no owned value.
In your case, there's also no reason to keep both the client and the transaction. Your connection should wrap the client but not the transaction, which is a short-lived object and shouldn't be kept for longer than the operation it groups.
Your connection should thus just be:
pub struct SQLConnection {
    client: Client,
}
Then you should, for an operation, get a transaction, use it, then drop it while keeping the connection.
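For illustration, a minimal sketch of that per-operation pattern (the method name, table, and SQL here are invented for the example):

impl SQLConnection {
    // One operation: open a transaction, use it, and let it end right here.
    pub fn rename_item(&mut self, id: i32, name: &str) -> Result<(), postgres::Error> {
        let mut tx = self.client.transaction()?;
        tx.execute("UPDATE items SET name = $1 WHERE id = $2", &[&name, &id])?;
        tx.commit() // consumes the transaction; the connection stays usable
    }
}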
I found a solution that suits my needs. I now have SQLConnection and SQLTransaction, and I get the SQLTransaction from my connection.
Now I can use the transaction to group my elementary SQL operations.
Here is the code:
use postgres::{Client, NoTls, Transaction};

pub struct SQLConnection {
    client: Client,
}

impl SQLConnection {
    pub fn new(connect_string: &str) -> Self {
        let client = Client::connect(connect_string, NoTls).unwrap();
        Self { client }
    }

    pub fn transaction(&mut self) -> SQLTransaction {
        let t = self.client.transaction().unwrap();
        SQLTransaction { transaction: t }
    }
}

pub struct SQLTransaction<'a> {
    transaction: Transaction<'a>,
}

impl<'a> SQLTransaction<'a> {
    pub fn new(transaction: Transaction<'a>) -> Self {
        Self { transaction }
    }

    pub fn commit(self) {
        let _ = self.transaction.commit();
    }

    pub fn rollback(self) {
        let _ = self.transaction.rollback();
    }
}
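The wrapper above only exposes commit and rollback, so to actually group statements you would also forward the inner Transaction's methods. A hypothetical passthrough (not part of the original code) might look like:

impl<'a> SQLTransaction<'a> {
    // Forward execute to the wrapped postgres::Transaction.
    pub fn execute(
        &mut self,
        query: &str,
        params: &[&(dyn postgres::types::ToSql + Sync)],
    ) -> Result<u64, postgres::Error> {
        self.transaction.execute(query, params)
    }
}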
I want to store the Postgres connection in global scope so I can access it from any function in a module. Here is an example:
use postgres::{Client, NoTls};

static mut client: Option<Client> = None;

pub fn get_player(id: i32) {
    // Use global client connection object:
    for row in client.unwrap().query("SELECT * FROM public.\"User\" WHERE \"accountID\"=$1;", &[&id]).unwrap() {
        let id: i32 = row.get(0);
        let name: &str = row.get(1);
        println!("found player: {} {}", id, name);
    }
}

pub fn init() {
    let mut connection = Client::connect("host=localhost user=postgres", NoTls);
    match connection {
        Ok(cli) => {
            println!("Database connected.");
            client = Some(cli);
        }
        Err(_) => println!("Database ERROR while connecting."),
    }
}
It does not compile or work as intended, and I don't know how to do this in Rust.
Here is an example with lazy_static and r2d2_postgres that provides a database connection pool:
use r2d2_postgres::postgres::{NoTls, Client};
use r2d2_postgres::PostgresConnectionManager;

#[macro_use]
extern crate lazy_static;

lazy_static! {
    static ref POOL: r2d2::Pool<PostgresConnectionManager<NoTls>> = {
        let manager = PostgresConnectionManager::new(
            // TODO: PLEASE MAKE SURE NOT TO USE HARD CODED CREDENTIALS!!!
            "host=localhost user=postgres password=password".parse().unwrap(),
            NoTls,
        );
        r2d2::Pool::new(manager).unwrap()
    };
}

pub fn get_player(id: i32) {
    // Use global client connection object:
    let mut client = POOL.get().unwrap();
    for row in client.query("SELECT * FROM public.\"User\" WHERE \"accountID\"=$1;", &[&id]).unwrap() {
        let id: i32 = row.get(0);
        let name: &str = row.get(1);
        println!("found player: {} {}", id, name);
    }
}
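For reference, this assumes dependencies roughly like the following in Cargo.toml (the version numbers are indicative and may need adjusting):

lazy_static = "1"
r2d2 = "0.8"
r2d2_postgres = "0.18"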
I am trying to execute an insert into multiple columns using Diesel with PostgreSQL.
This is the insert function to add a new Project:
pub fn insert(project: NewProject, program_id: i32, conn: &PgConnection) -> bool {
    use schema::projects::dsl::*;
    use schema::projects::dsl::{title as t};
    use schema::projects::dsl::{program_id as prog_id};

    let NewProject {
        title
    } = project;

    diesel::insert_into(projects)
        .values((t.eq(title), prog_id.eq(program_id)))
        .execute(conn)
        .is_ok()
}
And the Project and NewProject structs:
#[derive(Queryable, Serialize, Debug, Clone)]
pub struct Project {
    pub id: i32,
    pub title: String,
    pub program_id: i32,
    pub is_archived: bool
}

#[derive(Serialize, Deserialize, Insertable)]
#[table_name = "projects"]
pub struct NewProject {
    pub title: String
}
And the projects table looks like this:
CREATE TABLE projects (
    id SERIAL PRIMARY KEY,
    title VARCHAR NOT NULL,
    program_id INTEGER NOT NULL REFERENCES programs (id),
    is_archived BOOLEAN NOT NULL DEFAULT FALSE
);
And the schema.rs:
table! {
    projects (id) {
        id -> Int4,
        title -> Varchar,
        program_id -> Int4,
        is_archived -> Bool,
    }
}
When compiled, I get an error saying:
title | ^^^^^ expected struct `std::string::String`, found struct `schema::projects::columns::title`
and
.execute(conn) | ^^^^^^^ expected struct `diesel::query_source::Never`, found struct `diesel::query_source::Once`
I do not get a compile error when I do
.values(&project)
in the insert function instead.
Here is an MCVE of your problem:
#[macro_use]
extern crate diesel;

use diesel::pg::PgConnection;
use diesel::prelude::*;

mod schema {
    table! {
        projects (id) {
            id -> Int4,
            title -> Varchar,
            program_id -> Int4,
            is_archived -> Bool,
        }
    }

    #[derive(Debug, Insertable)]
    #[table_name = "projects"]
    pub struct NewProject {
        pub title: String,
    }
}

use schema::NewProject;

fn insert(project: NewProject, program_id: i32, conn: &PgConnection) -> bool {
    use schema::projects::dsl::*;
    use schema::projects::dsl::{title as t};
    use schema::projects::dsl::{program_id as prog_id};

    let NewProject {
        title
    } = project;

    diesel::insert_into(projects)
        .values((t.eq(title), prog_id.eq(program_id)))
        .execute(conn)
        .is_ok()
}

fn main() {}
You have imported a type called title that conflicts with the destructuring, as the error message states:
error[E0308]: mismatched types
--> src/main.rs:34:22
|
34 | let NewProject { title } = project;
| ^^^^^ expected struct `std::string::String`, found struct `schema::projects::columns::title`
|
= note: expected type `std::string::String`
found type `schema::projects::columns::title`
This can be reduced to a very small case:
struct foo;
struct Thing { foo: String }

fn example(t: Thing) {
    let Thing { foo } = t;
}
error[E0308]: mismatched types
--> src/lib.rs:5:17
|
5 | let Thing { foo } = t;
| ^^^ expected struct `std::string::String`, found struct `foo`
|
= note: expected type `std::string::String`
found type `foo`
Note that this struct is defined without curly braces, which makes it a unit-like struct. These are convenient, but they have the subtle nuance that they create both a type and a value:
struct foo;

fn example() {
    let foo: foo = foo;
    //             ^-- the only value of the type `foo`
    //       ^-------- the type `foo`
    //  ^------------- newly-defined unrelated identifier
}
When destructuring, an identifier in the pattern that names a unit struct in scope is treated as that struct's pattern, not as a new binding.
Don't import that type and you won't have a conflict:
fn insert(project: NewProject, program_id: i32, conn: &PgConnection) -> bool {
    use schema::projects::dsl;

    let NewProject { title } = project;

    diesel::insert_into(dsl::projects)
        .values((dsl::title.eq(title), dsl::program_id.eq(program_id)))
        .execute(conn)
        .is_ok()
}
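As an aside (not part of the original answer), the clash can also be avoided while keeping the glob import by renaming the field in the pattern, since an explicit field: binding pair always introduces a fresh binding rather than matching a unit struct:

let NewProject { title: project_title } = project;

Here project_title holds the String and the imported title column is left untouched.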