I've started using the new mocking support in grails-datastore-gorm-mongodb. My app defaults domain mappings to use references when persisting relationships to mongodb. I need to find a way to get the mocked mongo to do the same thing. How do I apply the same default mapping in a unit test?
In Config.groovy, it looks like this:
// configure mongo to use dbrefs:
grails.mongo.default.mapping = {
'*'(reference: true)
}
Here's a sample of code that I currently use:
import spock.lang.*
import grails.test.mixin.mongodb.MongoDbTestMixin
import com.github.fakemongo.Fongo
@Mixin([MongoDbTestMixin])
class MySpec extends Specification {
def setup() {
mongoDomain(new Fongo("test").mongo, [ MyDomain ])
new MyDomain(name: 'domain').save(validate: false, flush: true)
}
}
How do I apply that config to this test code?
I'm using Grails 2.3.9 and the mongodb plugin 3.0.1.
Looks like MongoDbTestMixin offers a few flavors of the mongoDomain method:
mongoDomain(Mongo mongo, Collection<Class> persistentClasses) - Sets up a GORM for MongoDB domain for the given Mongo instance and domain classes
mongoDomain(Map config, Collection<Class> persistentClasses) - Sets up a GORM for MongoDB domain for the given configuration and domain classes
The second option lets you pass a configuration map, through which you can configure Mongo to use dbrefs (otherwise an empty configuration is used, see MongoDbDataStoreSpringInitializer). However, this method does not let you pass the Fongo instance (a rough sketch of the config-map flavor follows after the list below).
You can try to:
Ask the Grails team to add a method which combines both options (pull request?)
Extend MongoDbTestMixin or create your own mixin
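For comparison, here is roughly what the config-map flavor looks like. This is a sketch only: MyMappedSpec is a made-up name, I have not verified that MongoDbDataStoreSpringInitializer honours the closure-style grails.mongo.default.mapping key (treat the map contents as an assumption), and note that there is no Fongo instance anywhere in it.
import spock.lang.*
import grails.test.mixin.mongodb.MongoDbTestMixin

@Mixin([MongoDbTestMixin])
class MyMappedSpec extends Specification {

    def setup() {
        // Config-map flavor: no Mongo/Fongo instance can be supplied here.
        mongoDomain(
            ['grails.mongo.default.mapping': { '*'(reference: true) }],
            [MyDomain]
        )
        new MyDomain(name: 'domain').save(validate: false, flush: true)
    }
}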
I am not able to convert a domain class into a BasicDBObject.
Below is my code:
def update_val
class_object.class.withNewSession { MongoCodecSession m ->
    update_val = m.pendingUpdates.find {
        it.key.name == d.class.getName()
    }.value[0].nativeEntry.regions[0]."${instance.getDbKey()}"[0]
}
On the findOneAndUpdate call below, I am getting the error "Can't find a codec for class class.domain". update_val comes back as a domain class object.
ClassName.class.findOneAndUpdate(new BasicDBObject(findVal), new BasicDBObject(update_val))
I am converting it from Grails 3.0 to Grails 3.1; here nativeEntry comes back as a domain class, while in the previous version nativeEntry came back as a BasicDBObject.
Any solution?
I am using Grails 3.1 with GORM 5.0 and MongoDB 3.4.
I have resolved it. Add the following to application.yml:
grails:
    mongodb:
        engine: mapping
This switches from the MongoCodecSession back to the previous MongoSession.
With the codec engine, objects are no longer converted first to MongoDB Document objects and then to Groovy objects; instead the driver reads Groovy objects directly from the JSON stream at the driver level, which is far more efficient than the previous MongoSession.
I'm trying to create a new Symfony4 project with MongoDB.
First I created a Symfony4 project using this documentation:
https://symfony.com/doc/current/setup.html
Then I included MongoDB using this documentation:
http://symfony.com/doc/current/bundles/DoctrineMongoDBBundle/index.html
I tried to follow the instructions as exactly as possible (for example I didn't need to add anything to app/AppKernel.php, but MongoDB was automatically added to config/bundles.php).
Now I think everything should work, but my Symfony app doesn't find the MongoDB Service:
You have requested a non-existent service "doctrine_mongodb".
Did you mean one of these: "http_kernel", "request_stack", "router"?
in ServiceLocator.php (line 48)
Controller:
namespace App\Controller;
use App\Document\Chapter;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\Response;
class DefaultController extends AbstractController {
public function createAction() {
$test = new Chapter();
$test->setHeadline('Test');
$dm = $this->get('doctrine_mongodb')->getManager();
$dm->persist($test);
$dm->flush();
return new Response('Created product id '.$test->getId());
}
}
However, if I execute this in the console:
php bin/console debug:container
I get a list of services including these:
doctrine_mongodb Doctrine\Bundle\MongoDBBundle\ManagerRegistry
doctrine_mongodb.odm.default_connection Doctrine\MongoDB\Connection
doctrine_mongodb.odm.default_document_manager Doctrine\ODM\MongoDB\DocumentManager
doctrine_mongodb.odm.document_manager alias for "doctrine_mongodb.odm.default_document_manager"
So the service seems to be there, but Symfony can't load it from my app.
Any idea how I could solve this?
Is it possible that the MongoDB server connection doesn't work, and for some reason this isn't logged and the service just won't load?
You could use autowiring
use Doctrine\ODM\MongoDB\DocumentManager as DocumentManager;
and
public function createProduct(DocumentManager $dm)
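Putting the two pieces together, a minimal sketch of the question's action rewritten for autowiring (the action name and the Chapter document come from the question; the rest assumes the default Symfony 4 autowiring from services.yaml is enabled):
namespace App\Controller;

use App\Document\Chapter;
use Doctrine\ODM\MongoDB\DocumentManager;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\Response;

class DefaultController extends AbstractController {
    // The DocumentManager is injected by the framework, so there is no need
    // to fetch 'doctrine_mongodb' from the limited service locator.
    public function createAction(DocumentManager $dm) {
        $test = new Chapter();
        $test->setHeadline('Test');
        $dm->persist($test);
        $dm->flush();
        return new Response('Created product id '.$test->getId());
    }
}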
Try extending "Controller" instead of "AbstractController"; the base Controller exposes the full service container through $this->get(), whereas AbstractController only sees a limited service locator.
use Symfony\Bundle\FrameworkBundle\Controller\Controller;

class DefaultController extends Controller
Hey all,
I'm having trouble setting up a test case.
I have a plain Symfony 3 project connected to MongoDB. I have multiple documents, each of which needs an extra method to query the database. The method fetches the last document inserted into the collection and is called findLatestInserted().
This function was duplicated in every document repository, so I extracted it into a class BaseDocumentRepository which extends the default DocumentRepository. All of my document repositories still have their own repository class, say CpuInfoRepository and RamInfoRepository. These classes offer a few extra methods to query the MongoDB database, plus the one they have in common: findLatestInserted().
It all works fine, but just to be safe I wanted to write a unit test for this method findLatestInserted().
I have a test database called prototyping-test which is used to create a document, query it, and check the result. Afterwards it clears itself so no documents stay behind. For each repository there's a specific URL to post data to in order to create a document in the database. To create a CpuInfo document you post data to http://localhost:8000/ServerInfo/CreateCpuInfo; to create a RamInfo document you post data to http://localhost:8000/ServerInfo/CreateRamInfo.
So here is my question: how would I write a test for the method findLatestInserted()?
This is what I've tried so far:
public function testFindLatestInserted()
{
$client = self::createClient();
$crawler = $client->request('POST',
'/ServerInfo/CreateCpuInfo',
[
"hostname" => $this->hostname,
"timestamp" => $this->timestamp,
"cpuCores" => $this->cpuCores,
"cpu1" => $this->cpu1,
"cpu2" => $this->cpu2
]);
$this->assertTrue($client->getResponse()->isSuccessful());
$serializer = $this->container->get('jms_serializer');
$cpuInfo = $serializer->deserialize($client->getResponse()->getContent(), 'AppBundle\Document\CpuInfo', 'json');
$expected = $this->dm->getRepository("AppBundle:CpuInfo")->find($cpuInfo->getId());
$stub = $this->getMockForAbstractClass('BaseDocumentRepository');
$actual = $this->dm
->getRepository('AppBundle:CpuInfo')
->findLatestInserted();
$this->assertNotNull($actual);
$this->assertEquals($expected, $actual);
}
At the line $actual = $this->dm->getRepository('AppBundle:CpuInfo')->findLatestInserted(); I got stuck, as this would only test CpuInfo while there is RamInfo too (and some other classes not mentioned here). How would one approach this setup?
I specifically want to test the method findLatestInserted() at the level of the abstract class rather than the concrete classes.
Please help me out!
Instead of testing the whole stack, just concentrate on testing findLatestInserted() in the concrete classes.
Inject a MongoDB stub into AppBundle:CpuInfo and check whether findLatestInserted() returns the expected value.
Do the same for AppBundle:RamInfo.
Avoid testing abstract classes; always test concrete classes.
In the future you may decide not to inherit from BaseDocumentRepository, and you might not notice that the new implementation of findLatestInserted() fails.
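If stubbing the ODM internals turns out to be more work than it's worth, a pragmatic alternative is an integration-style test per concrete repository against the prototyping-test database (so not a stub-based unit test). A rough sketch; the namespace is made up, and it assumes a KernelTestCase whose container exposes doctrine_mongodb, that a bare CpuInfo can be persisted as a fixture, and that findLatestInserted() sorts by insertion order:
namespace Tests\AppBundle\Repository;

use AppBundle\Document\CpuInfo;
use Symfony\Bundle\FrameworkBundle\Test\KernelTestCase;

class CpuInfoRepositoryTest extends KernelTestCase
{
    public function testFindLatestInserted()
    {
        // Boot the kernel and grab the ODM document manager from the container.
        self::bootKernel();
        $dm = static::$kernel->getContainer()
            ->get('doctrine_mongodb')
            ->getManager();

        // Persist two documents; the second one is the latest insert.
        $older = new CpuInfo();
        $newer = new CpuInfo();
        $dm->persist($older);
        $dm->persist($newer);
        $dm->flush();

        $actual = $dm->getRepository('AppBundle:CpuInfo')->findLatestInserted();

        $this->assertNotNull($actual);
        $this->assertEquals($newer->getId(), $actual->getId());
    }
}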
I decided to try Scala out with Play 2. I am trying to get a config section out of the application config. It looks like this (by section I mean the whole mail part):
services: {
rest: {
mail: {
uri: "xyz",
authorization: {
username: "xyz",
password: "xyz"
}
}
}
}
Code
import com.typesafe.config.ConfigObject
import play.api.Play.current
val config: Option[ConfigObject] = current.configuration.getObject("services.rest.mail")
This gives Some(SimpleConfigObject()) and from there the only way I am able to actually get the mail section and use it as a ConfigObject is through
config.get.toConfig.getString("uri")
Or I can get the actual value with
config.get.get("uri").unwrapped().toString
Or for fun:
config.get.toConfig.getObject("authorization").toConfig.getString("username")
Either way it seems to me I am making this overly complicated. Is there an easier way to get a section from the config?
Since the config library has a Java API, it can feel a bit verbose using it from Scala. There are some Scala wrappers though that enable more compact syntax. See https://github.com/typesafehub/config#scala-wrappers-for-the-java-library.
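If you'd rather not add a dependency, even a tiny wrapper of your own removes most of the noise. A sketch only: the ConfigSyntax helper below is hypothetical (not part of any library), and it assumes your Play version exposes Configuration.underlying.
import com.typesafe.config.Config
import play.api.Play.current

// Hypothetical sugar over the Java API -- not a library class.
object ConfigSyntax {
  implicit class RichConfig(val c: Config) extends AnyVal {
    def section(path: String): Config = c.getConfig(path)
  }
}

import ConfigSyntax._

val mail = current.configuration.underlying.section("services.rest.mail")
mail.getString("uri")                    // "xyz"
mail.getString("authorization.username") // "xyz"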
I will post this as an answer for future reference.
After another while of playing with the code I found the parseResourcesAnySyntax method, which does exactly what I want. Since I have my config split into multiple parts for separate environments (application.dev.conf, etc.), I can simply do
import com.typesafe.config.{Config, ConfigFactory}
import play.api.Play._
val config: Config = ConfigFactory.parseResourcesAnySyntax("application.%s.conf" format current.mode)
and then use
config.getString("uri")
// or
config.getString("authorization.username")
// or if I want to use part of it as another section
val section: Config = config.getConfig("authorization")
section.getString("username")
Of course, another viable alternative is using a wrapper as mister Stebel recommended.
I am running a Play Framework website that uses Squeryl and a MySQL database.
I need Squeryl to run all read queries against the slave and all write queries against the master.
How can I achieve this, either via Squeryl or via the JDBC connector itself?
Many thanks,
I don't tend to use MySQL myself, but here's an idea:
Based on the documentation here, the MySQL JDBC driver will round-robin amongst the slaves if the readOnly attribute is properly set on the Connection. In order to retrieve and change the current Connection you'll want to use code like:
transaction {
val conn = Session.currentSession.connection
conn.setReadOnly(true)
//Your code here
}
Even better, you can create your own readOnlyTransaction method:
def readOnlyTransaction(f: => Unit) = {
transaction {
val conn = Session.currentSession.connection
val orig = conn.isReadOnly()
conn.setReadOnly(true)
f
conn.setReadOnly(orig)
}
}
Then use it like:
readOnlyTransaction {
//Your code here
}
You'll probably want to clean that up a bit so the default readOnly state is reset if an exception occurs, but you get the general idea.
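For completeness, a sketch of that cleanup: the same method, but restoring the original readOnly flag in a finally block even if the body throws, and passing the block's result through.
import org.squeryl.Session
import org.squeryl.PrimitiveTypeMode._

def readOnlyTransaction[A](f: => A): A = {
  transaction {
    val conn = Session.currentSession.connection
    val orig = conn.isReadOnly()
    conn.setReadOnly(true)
    try f
    finally conn.setReadOnly(orig) // restore even when f throws
  }
}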