Question about Eloquent property "#with=[]"

I have an app that uses the Eloquent query builder to create a collection:
$events=Event::with('Memory')->get()->sortByDesc('created_at');
An event looks like this:
#fillable: array:3
#connection: "mysql"
#table: "events"
#primaryKey: "id"
...
#with: []
#withCount: []
...
+wasRecentlyCreated: false
#attributes: array:7 [▼
"id" => 10
"eventTitle" => "event a512345678"
"eventDescription" => "333guygflgg"
"user_id" => 1
"created_at" => "2021-08-05 13:02:25"
"updated_at" => "2021-08-05 13:02:25"
"memory_id" => null
]
...
#guarded: array:1 [▶]
}
In that, I see:
#with: []
#withCount: []
For event #10, there are 3 related records.
id eventTitle mem_id memTitle
10 event 512345678 10 mem 2512345678
10 event 512345678 29 mem 5a
10 event 512345678 40 mem 6a
Shouldn't the event object have
#with: ['memories']
#withCount: ['3']
How do I fix this?
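For reference, a minimal sketch (assuming the relationship is called memories and that Event/Memory are ordinary Eloquent models; neither name is confirmed by the dump above): #with in a dump mirrors the model's protected $with property, i.e. relations the model always eager loads, while query-time with() puts the loaded records under #relations, and withCount() adds a memories_count attribute rather than filling #withCount.

use Illuminate\Database\Eloquent\Model;

// Hypothetical Event model; 'memories' is an assumed relationship name.
class Event extends Model
{
    // This is what shows up as "#with" in the dump: relations that are ALWAYS eager loaded.
    protected $with = ['memories'];

    public function memories()
    {
        return $this->hasMany(Memory::class);
    }
}

// Query-time eager loading leaves "#with" untouched; the related records appear
// under "#relations", and withCount() adds a "memories_count" attribute instead:
$events = Event::with('memories')->withCount('memories')->get()->sortByDesc('created_at');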

Laravel 6 collections - Extract a single value from a record

I have the following collection:
$products = $this->productRepository->getByCompanyId(1)->get();
Illuminate\Database\Eloquent\Collection {#856 ▼
#items: array:1 [▼
0 => App\OrderItemProduct {#824 ▼
#table: "product"
...
#attributes: array:3 [▼
"available" => "5"
"product_name" => "Product X"
"product_id" => 1
]
....
}
]
}
I want to get the available value for a single record. I try the following which works if a record exists:
$available = $products->where('product_id','=', 1)->first()['available'];
However, it throws an exception Trying to access array offset on value of type null if I try to retrieve the available value of a record that doesn't exist in the collection, e.g. trying to get it for product_id 17:
$available = $products->where('product_id','=', 17)->first()['available'];
This was working in Laravel v6 and earlier but no longer seems to work in Laravel 6.17.1.
Note that in Laravel 5.5 I was able to use the value method as follows, but that started throwing the same errors as above in v5.7, so I switched to the method above, which now no longer works either in the latest Laravel version.
$available = $products->where('product_id','=', 17)->value('available');
Also, when I try get, why does it always return null even when the record exists?
$available = $products->where('product_id','=', 1)->get('available');
Any ideas how to fix?
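One null-safe sketch (assuming the optional() helper and Collection::firstWhere(), both available since Laravel 5.5; this is not the original code):

// firstWhere() returns the first matching model or null; optional() turns a null
// result into a null object, so the property access no longer triggers the notice.
$available = optional($products->firstWhere('product_id', 17))->available;

// Equivalent with the null coalescing operator on the array-access form:
$available = $products->firstWhere('product_id', 17)['available'] ?? null;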

Importing nested JSON documents into Elasticsearch and making them searchable

We have a MongoDB collection which we want to import into Elasticsearch (for now as a one-off effort). To this end, we have exported the collection with mongoexport. It is a huge JSON file with entries like the following:
{
"RefData" : {
"DebtInstrmAttrbts" : {
"NmnlValPerUnit" : "2000",
"IntrstRate" : {
"Fxd" : "3.1415"
},
"MtrtyDt" : "2020-01-01",
"TtlIssdNmnlAmt" : "200000000",
"DebtSnrty" : "SNDB"
},
"TradgVnRltdAttrbts" : {
"IssrReq" : "false",
"Id" : "BMTF",
"FrstTradDt" : "2019-04-01T12:34:56.789"
},
"TechAttrbts" : {
"PblctnPrd" : {
"FrDt" : "2019-04-04"
},
"RlvntCmptntAuthrty" : "GB"
},
"FinInstrmGnlAttrbts" : {
"ClssfctnTp" : "DBFNXX",
"ShrtNm" : "AVGO 3.625 10/16/24 c24 (URegS)",
"FullNm" : "AVGO 3 5/8 10/15/24 BOND",
"NtnlCcy" : "USD",
"Id" : "USU1109MAXXX",
"CmmdtyDerivInd" : "false"
},
"Issr" : "549300WV6GIDOZJTVXXX"
}
}
We are using the following Logstash configuration file to import this data set into Elasticsearch:
input {
file {
path => "/home/elastic/FIRDS.json"
start_position => "beginning"
sincedb_path => "/dev/null"
codec => json
}
}
filter {
mutate {
remove_field => [ "_id", "path", "host" ]
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "firds"
}
}
All this works fine, the data ends up in the index firds of Elasticsearch, and a GET /firds/_search returns all the entries within the _source field.
We understand that this field is not indexed and thus not searchable, but searchability is exactly what we are after: we want to make all of the entries within the original nested JSON searchable in Elasticsearch.
We assume that we have to adjust the filter {} part of our Logstash configuration, but how? For consistency reasons, it would not be bad to keep the original nested JSON structure, but that is not a must. Flattening would also be an option, so that e.g.
"RefData" : {
"DebtInstrmAttrbts" : {
"NmnlValPerUnit" : "2000" ...
becomes a single key-value pair "RefData.DebtInstrmAttrbts.NmnlValPerUnit" : "2000".
It would be great if we could do that immediately with Logstash, without using an additional Python script operating on the JSON file we exported from MongoDB.
EDIT: Workaround
Our current work-around is to (1) dump the MongoDB database to dump.json, then (2) flatten it with jq using the following expression, and finally (3) manually import it into Elasticsearch.
Regarding step (2), this is the flattening expression:
jq '. as $in | reduce leaf_paths as $path ({}; . + { ($path | join(".")): $in | getpath($path) }) | del(."_id.$oid") '
-c dump.json > flattened.json
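For the sample entry above, the flattened output of step (2) looks roughly like this (abridged):
{
"RefData.DebtInstrmAttrbts.NmnlValPerUnit": "2000",
"RefData.DebtInstrmAttrbts.IntrstRate.Fxd": "3.1415",
"RefData.DebtInstrmAttrbts.MtrtyDt": "2020-01-01",
"RefData.TradgVnRltdAttrbts.Id": "BMTF",
"RefData.TechAttrbts.PblctnPrd.FrDt": "2019-04-04",
"RefData.FinInstrmGnlAttrbts.NtnlCcy": "USD",
"RefData.Issr": "549300WV6GIDOZJTVXXX"
}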
References
Walker Rowe: ElasticSearch Nested Queries: How to Search for Embedded Documents
ElasticSearch search in document and in dynamic nested document
Mapping for Nested JSON document in Elasticsearch
Logstash - import nested JSON into Elasticsearch
Remark for the curious: The JSON shown is a (modified) entry from the Financial Instruments Reference Data System (FIRDS), available from the European Securities and Markets Authority (ESMA), a European financial regulatory agency overseeing the capital markets.

Elixir ecto: check if postgresql map column has key

To use the json/jsonb data types, Ecto suggests using fragments.
In my case, I have to use the PostgreSQL ? operator to see if the map has a given key; however, this will become something like:
where(events, [e], e.type == 1 and not fragment("???", e.qualifiers, "?", "2"))
but of course fragment reads the PostgreSQL ? as a placeholder. How can I check whether the map has a given key?
You need to escape the middle ? and pass a total of three arguments to fragment:
fragment("? \\? ?", e.qualifiers, "2")
Demo:
iex(1)> MyApp.Repo.insert! %MyApp.Food{name: "Foo", meta: %{price: 1}}
iex(2)> MyApp.Repo.insert! %MyApp.Food{name: "Foo", meta: %{}}
iex(3)> MyApp.Repo.all from(f in MyApp.Food, where: fragment("? \\? ?", f.meta, "price"))
[debug] SELECT f0."id", f0."name", f0."meta", f0."inserted_at", f0."updated_at" FROM "foods" AS f0 WHERE (f0."meta" ? 'price') [] OK query=8.0ms
[%MyApp.Food{__meta__: #Ecto.Schema.Metadata<:loaded>, id: 1,
inserted_at: #Ecto.DateTime<2016-06-19T03:51:40Z>, meta: %{"price" => 1},
name: "Foo", updated_at: #Ecto.DateTime<2016-06-19T03:51:40Z>}]
iex(4)> MyApp.Repo.all from(f in MyApp.Food, where: fragment("? \\? ?", f.meta, "a"))
[debug] SELECT f0."id", f0."name", f0."meta", f0."inserted_at", f0."updated_at" FROM "foods" AS f0 WHERE (f0."meta" ? 'a') [] OK query=0.8ms
[]
I'm not sure if this is documented anywhere, but I found the method from this test.
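Applied to the query from the question, this would presumably look like:
where(events, [e], e.type == 1 and not fragment("? \\? ?", e.qualifiers, "2"))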

Is it possible to have a requirement on a data structure in Leon?

While working on rational numbers with Leon, I have to add isRational as a requirement pretty much everywhere.
For example:
import leon.lang._
case class Rational (n: BigInt, d: BigInt) {
def +(that: Rational): Rational = {
require(isRational && that.isRational)
Rational(n * that.d + that.n * d, d * that.d)
} ensuring { _.isRational }
def *(that: Rational): Rational = {
require(isRational && that.isRational)
Rational(n * that.n, d * that.d)
} ensuring { _.isRational }
// ...
def isRational = !(d == 0)
def nonZero = n != 0
}
Is it possible to add a require statement in a class constructor to DRY this code so that it applies to all instances of the data structure? I tried adding it on the first line of the class body but it seems to have no effect...
case class Rational (n: BigInt, d: BigInt) {
require(isRational) // NEW
// ... as before ...
def lemma(other: Rational): Rational = {
Rational(n * other.d + other.n * d, d * other.d)
}.ensuring{_.isRational}
def lemmb(other: Rational): Boolean = {
require(other.d * other.n >= 0)
this <= (other + this)
}.holds
}
This does not prevent Leon from creating a Rational(0, 0), for example, as the report suggests:
[ Info ] - Now considering 'postcondition' VC for Rational$$plus #9:16...
[ Info ] => VALID
[ Info ] - Now considering 'postcondition' VC for Rational$$times #14:16...
[ Info ] => VALID
[ Info ] - Now considering 'postcondition' VC for Rational$lemma #58:14...
[ Error ] => INVALID
[ Error ] Found counter-example:
[ Error ] $this -> Rational(1, 0)
[ Error ] other -> Rational(1888, -1)
[ Info ] - Now considering 'postcondition' VC for Rational$lemmb #60:41...
[ Error ] => INVALID
[ Error ] Found counter-example:
[ Error ] $this -> Rational(-974, 0)
[ Error ] other -> Rational(-5904, -1)
[ Info ] - Now considering 'precond. (call $this.<=((other + $this)))' VC for Rational$lemmb #62:5...
[ Error ] => INVALID
[ Error ] Found counter-example:
[ Error ] $this -> Rational(-1, 0)
[ Error ] other -> Rational(0, -1)
[ Info ] - Now considering 'precond. (call other + $this)' VC for Rational$lemmb #62:14...
[ Error ] => INVALID
[ Error ] Found counter-example:
[ Error ] $this -> Rational(1, 2)
[ Error ] other -> Rational(7719, 0)
[ Info ] ┌──────────────────────┐
[ Info ] ╔═╡ Verification Summary ╞═══════════════════════════════════════════════════════════════════╗
[ Info ] ║ └──────────────────────┘ ║
[ Info ] ║ Rational$$plus postcondition 9:16 valid U:smt-z3 0.010 ║
[ Info ] ║ Rational$$times postcondition 14:16 valid U:smt-z3 0.012 ║
[ Info ] ║ Rational$lemma postcondition 58:14 invalid U:smt-z3 0.011 ║
[ Info ] ║ Rational$lemmb postcondition 60:41 invalid U:smt-z3 0.018 ║
[ Info ] ║ Rational$lemmb precond. (call $this.<=((ot... 62:5 invalid U:smt-z3 0.015 ║
[ Info ] ║ Rational$lemmb precond. (call other + $this) 62:14 invalid U:smt-z3 0.011 ║
[ Info ] ╟┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄┄╢
[ Info ] ║ total: 6 valid: 2 invalid: 4 unknown 0 0.077 ║
[ Info ] ╚════════════════════════════════════════════════════════════════════════════════════════════╝
(this and other don't always meet the constructor requirement.)
Am I missing something?
The main difficulty with invariants can be decomposed into two problems:
Problem 1
Given
case class A(v: BigInt) {
require(v > 0)
}
Leon would have to inject this requirement into the preconditions of all functions taking A as an argument, so
def foo(a: A) = {
a.v
} ensuring { _ > 0 }
will need to become:
def foo(a: A) = {
require(a.v > 0)
a.v
} ensuring { _ > 0 }
While trivial for this case, consider the following functions:
def foo2(as: List[A]) = {
require(as.nonEmpty)
as.head.v
} ensuring { _ > 0 }
or
def foo3(as: Set[A], a: A): Boolean = {
as contains a
} ensuring { res => !res || a.v > 0 }
Here it is not so easy to constrain foo2 so that the list contains only valid As. Leon would have to synthesize traversal functions on ADTs so that these preconditions can be injected.
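A hand-written sketch of what such an injected, traversal-based precondition could look like (assuming leon.collection.List; Leon does not generate this today):

// Explicit traversal over the ADT; Leon would have to synthesize something like this.
def allValid(as: List[A]): Boolean = as match {
  case Nil()       => true
  case Cons(x, xs) => x.v > 0 && allValid(xs)
}

def foo2(as: List[A]) = {
  require(as.nonEmpty && allValid(as))
  as.head.v
} ensuring { _ > 0 }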
Moreover, it is impossible to specify that the Set[A] contains only valid As, as Leon lacks the capability to traverse and constrain the set.
Problem 2
While it would be practical to write the following:
case class A(a: BigInt) {
require(invariant)
def invariant: Boolean = // ...
}
you run into a chicken-and-egg issue, where invariant itself would be injected with a precondition checking invariant on this.
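Illustratively (this is the problematic expansion, not working Leon code): injecting the class-level require into every method would also inject it into invariant itself.

case class A(a: BigInt) {
  def invariant: Boolean = {
    require(invariant) // the injected precondition already has to call invariant
    a > 0
  }
}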
I believe both problems can be solved (or we can restrict the usage of these invariants), but they constitute the reasons why class invariants haven't been trivially implemented yet.

BSON::InvalidDocument: Cannot serialize an object into BSON

I'm trying to follow along with http://mongotips.com/b/array-keys-allow-for-modeling-simplicity/
I have a Story document and a Rating document. The user will rate a story, so I wanted to create a many relationship to ratings by users as such:
class StoryRating
include MongoMapper::Document
# key <name>, <type>
key :user_id, ObjectId
key :rating, Integer
timestamps!
end
class Story
include MongoMapper::Document
# key <name>, <type>
timestamps!
key :title, String
key :ratings, Array, :index => true
many :story_ratings, :in => :ratings
end
Then
irb(main):006:0> s = Story.create
irb(main):008:0> s.ratings.push(StoryRating.new(user_id: '0923ksjdfkjas'))
irb(main):009:0> s.ratings.last.save
=> true
irb(main):010:0> s.save
BSON::InvalidDocument: Cannot serialize an object of class StoryRating into BSON.
from /usr/local/lib/ruby/gems/1.9.1/gems/bson-1.6.2/lib/bson/bson_c.rb:24:in `serialize' (...)
Why?
You should be using the association method story_ratings for your push/append rather than the internal ratings Array#push to get what you want, following John Nunemaker's "Array Keys Allow For Modeling Simplicity" discussion. The difference is that with the association method, MongoMapper will insert the BSON::ObjectId reference into the array; with the latter you are pushing a Ruby StoryRating object into the Array, and the underlying driver can't serialize it.
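In short, with the classes from the question:

story.ratings.push(rating)        # pushes the StoryRating object itself -> fails to serialize on save
story.story_ratings.push(rating)  # association push stores only the BSON::ObjectId reference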
Here's a test that works for me and shows the difference. Hope this helps.
Test
require 'test_helper'
class Object
def to_pretty_json
JSON.pretty_generate(JSON.parse(self.to_json))
end
end
class StoryTest < ActiveSupport::TestCase
def setup
User.delete_all
Story.delete_all
StoryRating.delete_all
#stories_coll = Mongo::Connection.new['free11513_mongomapper_bson_test']['stories']
end
test "Array Keys" do
user = User.create(:name => 'Gary')
story = Story.create(:title => 'A Tale of Two Cities')
rating = StoryRating.create(:user_id => user.id, :rating => 5)
assert_equal(1, StoryRating.count)
story.ratings.push(rating)
p story.ratings
assert_raise(BSON::InvalidDocument) { story.save }
story.ratings.pop
story.story_ratings.push(rating) # note story.story_ratings, NOT story.ratings
p story.ratings
assert_nothing_raised(BSON::InvalidDocument) { story.save }
assert_equal(1, Story.count)
puts Story.all(:ratings => rating.id).to_pretty_json
end
end
Result
Run options: --name=test_Array_Keys
# Running tests:
[#<StoryRating _id: BSON::ObjectId('4fa98c25e4d30b9765000003'), created_at: Tue, 08 May 2012 21:12:05 UTC +00:00, rating: 5, updated_at: Tue, 08 May 2012 21:12:05 UTC +00:00, user_id: BSON::ObjectId('4fa98c25e4d30b9765000001')>]
[BSON::ObjectId('4fa98c25e4d30b9765000003')]
[
{
"created_at": "2012-05-08T21:12:05Z",
"id": "4fa98c25e4d30b9765000002",
"ratings": [
"4fa98c25e4d30b9765000003"
],
"title": "A Tale of Two Cities",
"updated_at": "2012-05-08T21:12:05Z"
}
]
.
Finished tests in 0.023377s, 42.7771 tests/s, 171.1084 assertions/s.
1 tests, 4 assertions, 0 failures, 0 errors, 0 skips