'validates_uniqueness_of' - how to get error message - mongodb

I have a model:
# encoding: utf-8
class Tag
include Mongoid::Document
field :name, type: String
field :count, type: Integer
index :name, unique: true
validates_uniqueness_of :name
def self.create_tag(name)
tag = Tag.new
tag.name = name
tag.count = 0
tag.save
end
def self.find_by_name(name)
Tag.where(name: name).entries
end
end
And I have a test for the model:
describe Tag, "# simple database operations" do
it " - insert test records" do
Tag.create_tag("joe")
Tag.create_tag("joe")
p Tag.find_by_name("joe")
end
end
If I look at the collection after the test runs, I find only one record, but I want to catch an exception when a duplicate record is inserted.
Is that possible?

By default Mongoid writes in "fire and forget" mode: it sends a write and returns immediately. To check for errors, you should write in "safe mode". Try this.
def self.create_tag(name)
tag = Tag.new
tag.name = name
tag.count = 0
tag.safely.save! # <= note the 'safely' here. Also bang version of save is used.
end
Or, better yet,
def self.create_tag(name)
Tag.safely.create!(name: name, count: 0)
end
See the doc here.
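With safe mode and the bang methods, a duplicate insert raises an exception you can rescue. Here is a minimal plain-Ruby sketch of that pattern; DuplicateKeyError and the in-memory TagStore are stand-ins for the driver's real exception class (which depends on your driver version) and for MongoDB itself, so the sketch runs without a database:

```ruby
# Plain-Ruby sketch of rescuing a duplicate-key error.
# DuplicateKeyError and TagStore are hypothetical stand-ins for the
# real driver exception and the database, so this runs on its own.
class DuplicateKeyError < StandardError; end

class TagStore
  def initialize
    @names = []
  end

  # Mimics save! under a unique index: raises on a duplicate name.
  def create_tag!(name)
    raise DuplicateKeyError, "duplicate key: #{name}" if @names.include?(name)
    @names << name
  end
end

store = TagStore.new
store.create_tag!("joe")

begin
  store.create_tag!("joe")
rescue DuplicateKeyError => e
  result = "caught: #{e.message}"
end
# result => "caught: duplicate key: joe"
```

The second insert raises instead of silently failing, and the store still holds a single "joe".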


ActiveRecord select columns where one column is an array of a single property from a relation

class Building < ActiveRecord::Base
# id
# name
# url
has_many :floors
end
class Floor < ActiveRecord::Base
# id
# building_id
# name
# number
belongs_to :building
end
I want to add a method to the Building model that returns the following:
def self.some_stuff
[{
id: "81b0bd96-20e9-4f01-9801-59f3d9b99735",
url: "http://www.google.com",
numbers: [4,5,6,7,8,9]
},
{
id: "5de096dd-f282-41cc-8300-b972fb61ea41",
url: "http://www.yahoo.com",
numbers: [2,7,11]
}]
end
Ideally I would like to make this chainable so that I could do the following:
Building.all.limit(100).some_stuff
EDIT 1:
Here is what I'm trying to get at, expressed in PostgreSQL:
SELECT
B.id,
B.url,
array_agg(F.floor_number) AS numbers
FROM
buildings AS B
LEFT OUTER JOIN floors AS F ON
F.building_id = B.id
GROUP BY
B.id;
EDIT 2:
The following gives me the data I want, but in a really crappy format (because of pluck). If I change pluck to select, it breaks.
def self.some_stuff
fields = "buildings.id, buildings.url"
joins(:floors)
.includes(:floors)
.group(fields)
.pluck("#{fields}, array_agg(floors.floor_number)")
end
You likely want your some_stuff method to look something like this:
class Building
def self.some_stuff
all.map do |building|
{
id: building.id,
url: building.url,
numbers: building.floors.pluck(:number)
}
end
end
end
def self.some_stuff
includes(:floors).map do |b|
{
id: b.id,
url: b.url,
numbers: b.floors.map(&:number)
}
end
end
Slight modification of Josh's answer. Added the includes and changed pluck to map to avoid n+1 queries.
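Stripped of ActiveRecord, the transformation both answers perform is just a map over buildings. A self-contained sketch with Struct stand-ins for the models (ids shortened for readability), runnable without a database:

```ruby
# Plain-Ruby sketch of the shape some_stuff returns, using Structs
# in place of the ActiveRecord models so it runs standalone.
Building = Struct.new(:id, :url, :floors)
Floor    = Struct.new(:number)

buildings = [
  Building.new("81b0bd96", "http://www.google.com", [4, 5, 6].map { |n| Floor.new(n) }),
  Building.new("5de096dd", "http://www.yahoo.com",  [2, 7, 11].map { |n| Floor.new(n) })
]

# One hash per building, with the floor numbers collected into an array.
some_stuff = buildings.map do |b|
  { id: b.id, url: b.url, numbers: b.floors.map(&:number) }
end
# some_stuff => [{ id: "81b0bd96", url: "http://www.google.com", numbers: [4, 5, 6] }, ...]
```

The ActiveRecord versions differ only in where the data comes from; `includes(:floors)` matters so the `b.floors` access does not issue one query per building.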

Ecto, Phoenix: How to update a model with an embeds_many declaration?

I have two models, Owner and Property, where the schema for Owner has an embeds_many declaration, like this:
defmodule MyApp.Owner do
use MyApp.Web, :model
alias MyApp.Property
schema "owners" do
field :name, :string
embeds_many :properties, Property
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [])
|> validate_required([])
end
end
and this:
defmodule MyApp.Property do
use MyApp.Web, :model
embedded_schema do
field :name, :string
field :value, :float, default: 0
end
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:name, :value])
|> validate_required([:name])
end
end
The migration I'm using is:
defmodule MyApp.Repo.Migrations.CreateOwner do
use Ecto.Migration
def down do
drop table(:owners)
end
def change do
drop_if_exists table(:owners)
create table(:owners) do
add :name, :string
add :properties, :map
timestamps()
end
end
end
And a possible seed is:
alias MyApp.{Repo, Owner, Property}
Repo.insert!(%Owner{
name: "John Doe",
properties: [
%Property{
name: "Property A"
},
%Property{
name: "Property B",
value: 100000
},
%Property{
name: "Property C",
value: 200000
}
]
})
Finally, my questions are: how can I update John Doe's Property C's value from 200000 to 300000? And if John Doe buys a Property D:
%Property{
name: "Property D",
value: 400000
}
How do I add that to his properties in the database? (I'm using Postgres).
The simplest way would be to fetch the record, update the properties
list and save the changes:
owner = Repo.get!(Owner, 1)
properties = owner.properties
# update the value of every property named "Property C" to 300000
properties = for %Property{name: name} = property <- properties do
if name == "Property C" do
%{property | value: 300000}
else
property
end
end
# add a property to the start
properties = [%Property{name: "Property D", value: 400000} | properties]
# or to the end
# properties = properties ++ [%Property{name: "Property D", value: 400000}]
# update
Owner.changeset(owner, %{properties: properties})
|> Repo.update!
You can do some operations (at least inserting a property) using the JSON functions
provided by PostgreSQL using
fragment but I don't think you can search and conditionally update an item
of an array using them.
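The list-manipulation steps above don't depend on Ecto at all. Here is a standalone sketch of just that part, using plain maps instead of the app's %Property{} struct so it runs on its own:

```elixir
# Standalone sketch of the pure list-update step, with plain maps
# standing in for the %Property{} struct.
properties = [
  %{name: "Property A", value: 0},
  %{name: "Property C", value: 200_000}
]

# update every property named "Property C" to 300_000, leave the rest alone
updated =
  Enum.map(properties, fn
    %{name: "Property C"} = p -> %{p | value: 300_000}
    p -> p
  end)

# append the new Property D to the end of the list
with_d = updated ++ [%{name: "Property D", value: 400_000}]
```

The result of this computation is what you hand to the changeset; Repo.update! then rewrites the whole embedded array.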

How to perform an atomic update in an embedded schema model with `jsonb_set`?

How can I update only one key in a map? I would like to do it with jsonb_set, as in this stackoverflow example, or inside a transaction to avoid potential conflicts in the database. Is this possible with Ecto?
defmodule MySuperApp.Profile do
use MySuperApp.Model
schema "profiles" do
field :name, :string
embeds_one :settigns, MySuperApp.Settigns
end
def changeset(struct, params) do
struct
|> change
|> put_embed(:settigns, MySuperApp.Settigns.changeset(struct, params))
end
end
defmodule MySuperApp.Settigns do
use MySuperApp.Model
@settigns %{socket: true, page: true, android: false, ios: false}
embedded_schema do
field :follow, :boolean
field :action, :map, default: @settigns
end
def changeset(struct, _params) do
# I would like to update only the :web key and keep the old keys
struct |> change(action: %{web: false}) # this will override the old map -> changes: %{action: %{web: false}}
end
end
No. Ecto currently does not support partial updates of the embeds with the high-level API (like changesets).
You could achieve this by using raw SQL queries through Ecto.Adapters.SQL.query/4 or in more recent versions Repo.query/3.
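For reference, a hedged sketch of what such a raw query could look like, assuming the embed is stored in a settigns jsonb column and the nested map lives under its action key (names taken from the question); you would run it through Ecto.Adapters.SQL.query/4 with the profile id as a parameter:

```sql
-- Sketch: set only the 'web' key inside settigns->'action',
-- leaving the rest of the map untouched.
UPDATE profiles
SET settigns = jsonb_set(settigns, '{action,web}', 'false'::jsonb)
WHERE id = $1;
```

jsonb_set replaces just the value at the given path, so the other keys in the action map survive, unlike the changeset approach above.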

How to update one sub-document in an embedded list with Ecto?

I have a document with an embedded list of sub-docs. How do I update/change one particular document in the embedded list with Ecto?
defmodule MyApp.Thing do
use MyApp.Model
schema "things" do
embeds_many :users, User
end
end
defmodule MyApp.User do
use MyApp.Model
embedded_schema do
field :name, :string
field :email, :string
field :admin, :boolean, default: false
end
end
defmodule MyApp.Model do
defmacro __using__(_) do
quote do
use MyApp.Web, :model
@primary_key {:id, :binary_id, autogenerate: true}
@foreign_key_type :binary_id # For associations
end
end
end
My solution so far is to generate a list of all users except the one I want to update and make a new list of the one user's changeset and the other users and then put_embed this list on the thing. It works but it feels like there must be a more elegant solution to this.
user = Enum.find(thing.users, fn user -> user.id == user_id end)
other_users = Enum.filter(thing.users, fn user -> user.id != user_id end)
user_cs = User.changeset(user, %{email: email})
users = [user_cs | other_users]
thing
|> Ecto.Changeset.change
|> Ecto.Changeset.put_embed(:users, users)
|> Repo.update
EDIT: I just discovered a serious pitfall with this "solution": the untouched users get updated as well, which can be a problem with concurrent calls (race condition). So there has to be another solution.

Mongoid/MongoDB: Order a query by the value of an embedded document?

I am attempting to order the results of a query by the value of a specific embedded document, but even with what seems to be a valid set of options and using the $elemMatch operator, my results are coming back in natural order.
My model is composed of Cards, which embeds_many :card_attributes, which in turn reference a specific CardAttributeField and contain an Integer value. I would like to be able to order a collection of Cards by that value.
I am able to isolate a collection of Cards which have a CardAttribute referencing a specific CardAttributeField like this:
cards = Card.where(:card_attributes.elem_match => {
:card_attribute_field_id => card_attribute_field.id
})
If I knew the order in which the card_attributes were set, I could use MongoDB array notation, like this:
cards.order_by(['card_attributes.0.value', :asc])
This does deliver my expected results in test scenarios, but it won't work in the real world.
After much messing around, I found a syntax which I thought would allow me to match a field without using array notation:
cards.asc(:'card_attributes.value'.elem_match => {
:card_attribute_field_id => card_attribute_field.id
})
This produced a set of options on the resulting Mongoid::Criteria which looked like:
{:sort=>{"{#<Origin::Key:0x2b897548 #expanded=nil, #operator=\"$elemMatch\", #name=:\"card_attributes.value\", #strategy=:__override__, #block=nil>=>{:card_attribute_field_id=>\"54c6c6fe2617f55611000068\"}}"=>1}}
However, the results here came back in the same order regardless of whether I called asc() or desc().
Is there any way to do what I'm after? Am I taking the wrong approach, or do I have a mistake in my implementation? Thanks.
Simplified, my model is:
class Card
include Mongoid::Document
# various other fields
has_many :card_attribute_fields
embeds_many :card_attributes do
def for_attribute_field card_attribute_field
where(:card_attribute_field_id => card_attribute_field.id)
end
end
end
class CardAttributeField
include Mongoid::Document
belongs_to :card
field :name, type: String
field :default_value, type: String
field :description, type: String
end
class CardAttribute
include Mongoid::Document
embedded_in :card
field :card_attribute_field_id, type: Moped::BSON::ObjectId
field :value, type: Integer
end