I am having a very weird issue when trying to save data in MongoDB with the Rust driver. This is my struct:
#[derive(Deserialize, Serialize, Debug)]
struct Info {
    #[serde(rename = "_id", skip_serializing_if = "Option::is_none")]
    id: Option<bson::oid::ObjectId>,
    name: String,
    created_at: i64,
    updated_at: i64,
}
And this is my actix route handler function
async fn post_request(info: web::Json<Info>, data: web::Data<State>) -> impl Responder {
    let name: &str = &info.name;
    let document = Info {
        id: None,
        name: name.to_string(),
        created_at: Utc::now().timestamp_millis(),
        updated_at: Utc::now().timestamp_millis(),
    };
    // Convert to a Bson instance:
    let serialized_doc = bson::to_bson(&document).unwrap();
    let doc = serialized_doc.as_document().unwrap();
    let collection = data.client.database("test1").collection("users");
    let result = collection.insert_one(doc.to_owned(), None).await.unwrap();
    HttpResponse::Ok().json(result).await
}
I am using the Utc struct from the chrono crate.
When I try to save the data in MongoDB by hitting the route, it doesn't get saved. But, oddly, when I comment out created_at and updated_at in the struct and the handler, it gets saved. If I don't use the struct and instead save a raw document, storing created_at and updated_at in variables, it also gets saved, just not when going through the struct. I am new to Rust, so maybe I am doing something wrong. Please help.
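For reference, the raw-document variant mentioned above (the one that does get saved) looks roughly like this inside the same handler, assuming the doc! macro from the bson crate is in scope:

// Raw-document variant: store the timestamps in variables first and build
// the document with doc! instead of serializing the Info struct.
let now = Utc::now().timestamp_millis();
let doc = doc! {
    "name": name,
    "created_at": now,
    "updated_at": now,
};
let collection = data.client.database("test1").collection("users");
let result = collection.insert_one(doc, None).await.unwrap();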
Related
This way it works: I retrieve the field of that document (in this case extranote), but what if I were only interested in that field of the last document inserted chronologically?
I looked for similar examples, but maybe it is just a matter of a few options, which I obviously don't know about, to add when I create the db.
This model was created to read directly from the database without going through the app:
struct WorkoutModel: Identifiable, Codable {
    //var id = UUID().uuidString
    @DocumentID var id: String?
    var extranote: String
    var error: String = ""
}
struct SimpleEntry: TimelineEntry {
    let date: Date
    var deviceCount: Int
    var deviceMake: String
    var deviceModel: String
    var deviceType: String
    let configuration: ConfigurationIntent
    var workoutData: WorkoutModel?
}
func fetchFromDB(completion: @escaping (WorkoutModel) -> ()) {
    let db = Firestore.firestore().collection("devices").document("lEp5impGTGeBmAEisQT")
    db.getDocument { snap, err in
        guard let doc = snap?.data() else {
            completion(WorkoutModel(extranote: "", error: err?.localizedDescription ?? ""))
            return
        }
        let extranote = doc["extranote"] as? String ?? ""
        completion(WorkoutModel(extranote: extranote))
    }
}
func getTimeline(for configuration: ConfigurationIntent, in context: Context,
                 completion: @escaping (Timeline<Entry>) -> ()) {
    var entries: [SimpleEntry] = []
    // Generate a timeline consisting of five entries an hour apart, starting
    // from the current date.
    let date = Date()
    let nextUpdate = Calendar.current.date(byAdding: .minute, value: 5, to: date)!
    fetchFromDB { work in
        let entry = SimpleEntry(date: nextUpdate, deviceCount: deviceCount,
                                deviceMake: deviceMake, deviceModel: deviceModel,
                                deviceType: deviceType, configuration: configuration,
                                workoutData: work)
        entries.append(entry)
        let timeline = Timeline(entries: entries, policy: .atEnd)
        completion(timeline)
    }
}
If you don't know the ID of the document to update, then you will need to query the collection for the document. If you don't have a field in that document that indicates the timestamp of its last update, then you won't be able to make that query at all. So, you will have to add a field to the document, and make sure it's populated with the current timestamp every time you write it. Then you can use that field to make the query if you want to update it again later.
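For the MongoDB Rust driver used elsewhere in this thread, the same idea would look roughly like the sketch below; it assumes a 2.x driver, a Document collection, and that the timestamp field is called updated_at.

use mongodb::{
    bson::{doc, Document},
    options::FindOneOptions,
    Collection,
};

// Fetch the most recently written document by sorting on the timestamp
// field (assumed here to be `updated_at`) in descending order.
async fn latest_document(coll: &Collection<Document>) -> mongodb::error::Result<Option<Document>> {
    let options = FindOneOptions::builder()
        .sort(doc! { "updated_at": -1 })
        .build();
    coll.find_one(None, options).await
}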
I am making a GraphQL resolver in Rust, and am only fetching the fields from the GraphQL query in my MongoDB database. However, Rust complains that the fetched data, of course, is not of the same type as the specified return type. What is the right way to do something like this?
I guess I could do #[serde(default)], but that doesn't work exactly as expected (I will explain later).
use async_graphql::*;
use futures::stream::TryStreamExt;
use mongodb::{bson::doc, bson::oid::ObjectId, options::FindOptions, Collection};
use serde::{Deserialize, Serialize};
use std::cmp::min;

#[derive(SimpleObject, Serialize, Deserialize, Debug)]
#[graphql(complex)]
struct Post {
    #[serde(rename = "_id")]
    pub id: ObjectId,
    pub title: String,
    // I could do something like
    // #[serde(default)]
    pub body: String,
}

#[ComplexObject]
impl Post {
    async fn text_snippet(&self) -> &str {
        let length = self.body.len();
        let end = min(length, 5);
        &self.body[0..end]
    }
}

struct Query;

#[Object]
impl Query {
    // fetching posts
    async fn posts<'ctx>(&self, ctx: &Context<'ctx>) -> Vec<Post> {
        let posts = ctx.data_unchecked::<Collection<Post>>();
        // build the projection doc here based on the requested GraphQL fields, e.g.:
        let projection = doc! { "title": 1 };
        let options = FindOptions::builder().limit(10).projection(projection).build();
        let cursor = posts.find(None, options).await.unwrap();
        cursor.try_collect().await.unwrap_or_else(|_| vec![])
    }
}
But when I run the query
{
  posts {
    id
    title
    textSnippet
  }
}
I get
thread 'actix-rt:worker:0' panicked at 'called `Result::unwrap()` on an `Err` value: Error { kind: BsonDecode(DeserializationError { message: "missing field `body`" }), labels: [] }', server/src/schema/post.rs:20:46
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
and when I add the #[serde(default)] attribute to body and then query textSnippet but not body, textSnippet is an empty string.
How do I fix this?
Could you wrap every field in Post with an Option and let the try_collect fill the returned fields for you?
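For what it's worth, a minimal sketch of that suggestion could look like the following; it assumes you are fine with title and body becoming nullable in the GraphQL schema, since fields the projection omits then simply deserialize to None:

use async_graphql::*;
use mongodb::bson::oid::ObjectId;
use serde::{Deserialize, Serialize};
use std::cmp::min;

#[derive(SimpleObject, Serialize, Deserialize, Debug)]
#[graphql(complex)]
struct Post {
    #[serde(rename = "_id")]
    pub id: ObjectId,
    // Fields the projection may omit: absent fields become None.
    pub title: Option<String>,
    pub body: Option<String>,
}

#[ComplexObject]
impl Post {
    async fn text_snippet(&self) -> Option<&str> {
        // Only produce a snippet when `body` was actually fetched.
        self.body.as_deref().map(|b| &b[0..min(b.len(), 5)])
    }
}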
You can create a struct with only the fields you need and use a collection of the new struct.
use async_graphql::*;
use futures::stream::TryStreamExt;
use mongodb::{bson::doc, bson::oid::ObjectId, options::FindOptions, Collection};
use serde::{Deserialize, Serialize};

#[derive(SimpleObject, Serialize, Deserialize, Debug)]
#[graphql(complex)]
struct Post {
    #[serde(rename = "_id")]
    pub id: ObjectId,
    pub title: String,
    // I could do something like
    // #[serde(default)]
    pub body: String,
}

#[derive(SimpleObject, Serialize, Deserialize, Debug)]
struct PostTitle {
    #[serde(rename = "_id")]
    pub id: ObjectId,
    pub title: String,
}

struct Query;

#[Object]
impl Query {
    // fetching posts
    async fn posts<'ctx>(&self, ctx: &Context<'ctx>) -> Vec<PostTitle> {
        let posts = ctx.data_unchecked::<Collection<PostTitle>>();
        let projection = doc! { "title": 1 };
        let options = FindOptions::builder().limit(10).projection(projection).build();
        let cursor = posts.find(None, options).await.unwrap();
        cursor.try_collect().await.unwrap_or_else(|_| vec![])
    }
}
I have a struct to model an Item, but some of its fields depend on another struct. I want to save this nested object into MongoDB with the MongoDB Rust driver (https://github.com/mongodb/mongo-rust-driver).
use mongodb::bson::doc;
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug)]
struct CustomUnit {
    pub unit: String,
    pub multiplier: f64,
}

// Item depends on the CustomUnit struct above. Nested object, JSON-like
struct Item {
    pub name: String,
    pub qty: f64,
    pub unit: CustomUnit,
}
// declare an item and a unit
let my_unit = CustomUnit { unit: "BOX".to_string(), multiplier: 12.0 };
let a = Item { name: "FOO Item".to_string(), qty: 10.0, unit: my_unit };

// later in the code I extracted the value of each field
let name = a.name.clone();
let qty = a.qty;
let unit = a.unit;

let doc = doc! {
    "name": name.clone(),
    "qty": qty,
    "unit": unit,
};

// throws an error: "the trait `From<CustomUnit>` is not implemented for `Bson`"
db.collection(COLL).insert_one(doc, None).await
This displays an error message:
_^ the trait `From<CustomUnit>` is not implemented for `Bson`
= help: the following implementations were found:
<Bson as From<&T>>
<Bson as From<&[T]>>
<Bson as From<&str>>
<Bson as From<Regex>>
and 19 others
= note: required by `std::convert::From::from`
= note: this error originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info)
How do I implement this From<CustomUnit> for Bson trait?
impl From<CustomUnit> for Bson {
    fn from(unit: CustomUnit) -> Self {
        // what to do here? and how to return a Bson document?
        // there's no explanation in the rust-mongodb manual
    }
}
The doc! macro internally converts each value into Binary JSON (BSON). It converts known types into BSON automatically, but since unit is a user-defined struct, it doesn't know how to.
use mongodb::bson::{self, doc};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug)]
struct CustomUnit {
    pub unit: String,
    pub multiplier: f64,
}

let doc = doc! {
    "name": name.clone(),
    "qty": qty,
    "unit": bson::to_bson(&unit).unwrap(),
};
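If you specifically want the From<CustomUnit> impl asked about above, a minimal sketch can delegate to bson::to_bson (panicking via expect only to keep the example short):

use mongodb::bson::{self, Bson};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug)]
struct CustomUnit {
    pub unit: String,
    pub multiplier: f64,
}

impl From<CustomUnit> for Bson {
    fn from(unit: CustomUnit) -> Self {
        // Reuses the Serialize impl; for a struct this produces a Bson::Document.
        bson::to_bson(&unit).expect("CustomUnit serializes to BSON")
    }
}

With that in place, the original doc! block with "unit": unit compiles without the explicit to_bson call.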
I'm using the rust-postgres crate to ingest data. This is a working example adding rows successfully:
let name: &str = "hello from rust";
let val: i32 = 123;
let now: DateTime<Utc> = Utc::now();
let timestamp = now.format("%Y-%m-%dT%H:%M:%S%.6f").to_string();
client.execute(
    "INSERT INTO trades VALUES(to_timestamp($1, 'yyyy-MM-ddTHH:mm:ss.SSSUUU'),$2,$3)",
    &[&timestamp, &name, &val],
)?;
This doesn't look so nice, as I have to do this forward-and-back string conversion. I would like to be able to write something like:
let name: &str = "hello from rust";
let val: i32 = 123;
let now: DateTime<Utc> = Utc::now();
client.execute(
    "INSERT INTO trades VALUES($1,$2,$3)",
    &[&now, &name, &val],
)?;
What's the most performant way of ingesting timestamps in this way?
Edit:
Here's the returned error from the second example above
Error: Error { kind: ToSql(0), cause: Some(WrongType { postgres: Timestamp, rust: "chrono::datetime::DateTime<chrono::offset::utc::Utc>" }) }
And my Cargo.toml looks like this (with the chrono feature enabled for the rust-postgres crate):
[dependencies]
chrono = "0.4.19"
postgres={version="0.19.0", features=["with-serde_json-1", "with-bit-vec-0_6", "with-chrono-0_4"]}
I think the problem is a mismatch between your postgres schema and your Rust type: the error seems to say that your postgres type is timestamp, while your rust type is DateTime<Utc>.
If you check the conversion table, DateTime<Utc> converts to a TIMESTAMP WITH TIME ZONE. The only types which convert to TIMESTAMP are NaiveDateTime and PrimitiveDateTime.
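A minimal sketch of that fix, reusing the variables from the question and assuming you want to keep the TIMESTAMP column and store UTC wall-clock time:

// Convert DateTime<Utc> to NaiveDateTime, which rust-postgres maps to TIMESTAMP.
let now = Utc::now().naive_utc();
client.execute(
    "INSERT INTO trades VALUES($1,$2,$3)",
    &[&now, &name, &val],
)?;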
As per Masklinn's response, I needed to pass a NaiveDateTime type for this to work. The full example with naive_local looks like:
use chrono::Utc;
use postgres::{Client, Error, NoTls};
use std::time::SystemTime;

fn main() -> Result<(), Error> {
    let mut client = Client::connect("postgresql://admin:quest@localhost:8812/qdb", NoTls)?;

    // Basic query
    client.batch_execute("CREATE TABLE IF NOT EXISTS trades (ts TIMESTAMP, date DATE, name STRING, value INT) timestamp(ts);")?;

    // Parameterized query
    let name: &str = "rust example";
    let val: i32 = 123;
    let utc = Utc::now();
    let sys_time = SystemTime::now();
    client.execute(
        "INSERT INTO trades VALUES($1,$2,$3,$4)",
        &[&utc.naive_local(), &sys_time, &name, &val],
    )?;

    // Prepared statement
    let mut txn = client.transaction()?;
    let statement = txn.prepare("insert into trades values ($1,$2,$3,$4)")?;
    for value in 0..10 {
        let utc = Utc::now();
        let sys_time = SystemTime::now();
        txn.execute(&statement, &[&utc.naive_local(), &sys_time, &name, &value])?;
    }
    txn.commit()?;

    println!("import finished");
    Ok(())
}
Datetime fields in structs are being serialized to Strings instead of ISODates when using the Rust Mongo driver prototype. How do I get the fields to be saved as ISODate?
use chrono::{DateTime, Utc};
use mongodb::oid::ObjectId;
use mongodb::{
    coll::Collection, db::Database, db::ThreadedDatabase, error::Error, Client, ThreadedClient,
};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Clone, Debug)]
struct Person {
    pub _id: ObjectId,
    pub date: DateTime<Utc>,
}

fn main() {
    let client = Client::with_uri("mongodb://localhost:27017").unwrap();
    let p = Person {
        _id: ObjectId::new().unwrap(),
        date: Utc::now(),
    };
    let serialized = mongodb::to_bson(&p).unwrap();
    if let Some(document) = serialized.as_document() {
        client
            .db("my_db")
            .collection("mycollection")
            .insert_one(document.clone(), None)
            .unwrap();
    }
}
On querying the DB, the record contains a date string (in ISO format); I expected it to be an ISODate.
You can choose to serialize/deserialize the date as an ISO string with serde_helpers.
https://docs.rs/bson/1.2.2/bson/serde_helpers/index.html
use mongodb::bson::oid::ObjectId;
use mongodb::bson::serde_helpers::bson_datetime_as_iso_string;
use mongodb::bson::DateTime;
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Clone, Debug)]
struct Person {
    pub _id: ObjectId,
    #[serde(with = "bson_datetime_as_iso_string")]
    date: DateTime,
}
With the mongodb and bson crates in version 2.0.0-beta.1:
In Cargo.toml, add the feature chrono-0_4:
bson = { version = "2.0.0-beta.1", features = ["chrono-0_4"] }
Then annotate your field with
use chrono::{DateTime, Utc};
#[serde(with = "bson::serde_helpers::chrono_datetime_as_bson_datetime")]
date: DateTime<Utc>,
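Put together, a sketch of the Person struct from the question with that annotation (assuming bson 2.x with the chrono-0_4 feature) could look like:

use bson::oid::ObjectId;
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Clone, Debug)]
struct Person {
    pub _id: ObjectId,
    // Serialized as a native BSON DateTime (shown as ISODate in the shell), not a string.
    #[serde(with = "bson::serde_helpers::chrono_datetime_as_bson_datetime")]
    pub date: DateTime<Utc>,
}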