PHPUnit Mock of MongoCollection class creating a __PHP_Incomplete_Class object

Today I came back to a project I have not touched for a while. It is a Zend Framework 2 project using MongoDB as its database.
Since it had been a while, I decided to update MongoDB from 2.0 to the latest version (2.4), and the driver to the latest (1.4.2?).
Now when running my PHPUnit tests I get errors because a mocked MongoCollection class fails an is_a() check.
Instead of being an instance of MongoCollection, the new mock turns out to be a __PHP_Incomplete_Class instance. I have searched high and low and cannot find anyone with the same issue; I can only assume something has changed in the MongoDB classes that PHPUnit doesn't like.
$collection = $this->getMockBuilder('MongoCollection')
    ->disableOriginalConstructor()
    ->getMock();
When inspecting $collection I see:
__PHP_Incomplete_Class Object {
    __PHP_Incomplete_Class_Name => (string) "Mock_MongoCollection_2798b1f7"
}
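For context, a minimal sketch of the kind of check that trips over this (the assertion is illustrative, not the actual test code):
// With the broken mock, this fails: $collection is reported as a
// __PHP_Incomplete_Class rather than a MongoCollection.
$this->assertInstanceOf('MongoCollection', $collection);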
Does anyone know a way around this or do I need to bash out my own MongoCollection mock/test class to test with?

OK, after a few days of doing other things I came back to this issue and have solved it.
I updated PHPUnit to the latest version (as of today, 1 August 2013) via PEAR and the issue has gone away.
The lesson: try updating everything and not just one component!

Related

Use of Extbase Repositories in Symfony Command

I'm upgrading an extension to work with TYPO3 v10. Since command controllers can no longer be used, I'm migrating them to Symfony commands as described in the documentation. Everything works smooth as heck except for the use of Extbase repository classes: no matter what I query, I never get a result. Since I can't find any useful information on the web or in the documentation, I hope this is just something minor.
After debugging for a while I found out that the storage PID is not determined correctly while building the query settings. I find that kind of strange, since my root template has these lines:
plugin.tx_myext.persistence.storagePid = 15403
module.tx_myext.persistence.storagePid = 15403
The repository instances are correctly injected via injectMyRepository() methods. I've tried using the Extbase ObjectManager to fetch the class instances instead, but the "error" stays the same.
Am I doing something wrong, or is it not possible to use Extbase repository classes in Symfony commands?
After more research I found out that some bootstrapping is missing, which results in extension settings (the storage PID in my case) not being loaded. From what I've been reading, that behaviour seems to be intended, to prevent Extbase from booting.
There is a reference to something similar in the official documentation: https://docs.typo3.org/m/typo3/reference-coreapi/master/en-us/ApiOverview/CommandControllers/Index.html#initialize-backend-user
Knowing that, I tried to find a method to initialize the missing settings, but could not find one. So this does indeed seem to be a missing feature.
I developed a workaround which I'm not too proud of, but it's better than nothing (or rebuilding everything on Doctrine, for that matter). If you stumble upon the same issue, here you go. Just insert this method and call it before you fire your query:
use TYPO3\CMS\Core\Utility\GeneralUtility;
use TYPO3\CMS\Extbase\Configuration\ConfigurationManager;
use TYPO3\CMS\Extbase\Configuration\ConfigurationManagerInterface;

public static function initializeConfigurationManager(): void
{
    /** @var ConfigurationManager $configurationManager */
    $configurationManager = GeneralUtility::makeInstance(ConfigurationManager::class);
    // Load the extension's TypoScript framework configuration...
    $tmpConfiguration = $configurationManager->getConfiguration(
        ConfigurationManagerInterface::CONFIGURATION_TYPE_FRAMEWORK,
        'myExtensionName'
    );
    // ...and push it back so the singleton serves it from now on.
    $configurationManager->setConfiguration($tmpConfiguration);
}
That approach exploits the singleton state of the ConfigurationManager: you manually inject your extension's static template configuration, and every Extbase component (such as repositories) will use these settings from then on. Lovely.
Be aware, however, that this is prone to break with future internal changes.
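To show where it fits, here is a sketch of a command using it (the repository property and output message are placeholders for your own code):
// Inside your Symfony command class; $this->myRepository stands in for
// an injected Extbase repository.
protected function execute(InputInterface $input, OutputInterface $output): int
{
    // Load the framework configuration before the first Extbase query.
    self::initializeConfigurationManager();

    $items = $this->myRepository->findAll();
    $output->writeln(sprintf('Found %d records', count($items)));

    return 0;
}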

How to unit test Kafka Streams

While exploring how to unit test a Kafka Streams application I came across ProcessorTopologyTestDriver; unfortunately, this class seems to have been broken since version 0.10.1.0 (KAFKA-4408).
Is there a workaround available for the KTable issue?
I saw the "Mocked Streams" project, but first, it uses version 0.10.2.0 while I'm on 0.10.1.1, and second, it is Scala while my tests are Java/Groovy.
Any help on how to unit test a stream without having to bootstrap ZooKeeper/Kafka would be great.
Note: I do have integration tests that use embedded servers; this is for unit tests, i.e. fast, simple tests.
EDIT
Thank you to Ramon Garcia
For people arriving here in Google searches, please note that the test driver class is now org.apache.kafka.streams.TopologyTestDriver
This class is in the Maven artifact with groupId org.apache.kafka and artifactId kafka-streams-test-utils
I found a way around this; I'm not sure it is THE answer, especially after Matthias J. Sax's comment (https://stackoverflow.com/users/4953079/matthias-j-sax). In any case, sharing what I have so far...
I completely copied ProcessorTopologyTestDriver from the 0.10.1 branch (that's the version I'm using).
To address KAFKA-4408 I made the private final MockConsumer<byte[], byte[]> restoreStateConsumer field accessible and moved the task = new StreamTask(...) chunk to a separate method, e.g. bootstrap.
In the setup phase of my test I do the following:
driver = new ProcessorTopologyTestDriver(config, builder)
ArrayList partitionInfos = new ArrayList();
partitionInfos.add(new PartitionInfo('my_ktable', 1, (Node) null, (Node[]) null, (Node[]) null));
driver.restoreStateConsumer.updatePartitions('my_ktable', partitionInfos);
driver.restoreStateConsumer.updateEndOffsets(Collections.singletonMap(new TopicPartition('my_ktable', 1), Long.valueOf(0L)));
driver.bootstrap()
And that's it...
Bonus
I also ran into KAFKA-4461; fortunately, since I copied the whole class, I was able to "cherry-pick" the accepted fix with minor tweaks.
As always, feedback is appreciated. Although apparently not an official test class, this driver has proven super useful!
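For completeness, a minimal sketch of the newer TopologyTestDriver mentioned in the edit above (TestInputTopic and TestOutputTopic require kafka-streams-test-utils 2.4 or later; the topology, topic names, and uppercase logic are illustrative):
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;

public class UppercaseTopologyTest {
    public static void main(String[] args) {
        // A trivial topology: copy "input" to "output", uppercased.
        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("input")
               .mapValues(v -> v.toUpperCase())
               .to("output");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "test-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // No broker or ZooKeeper needed: the driver runs the topology in-process.
        try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
            TestInputTopic<String, String> in =
                    driver.createInputTopic("input", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out =
                    driver.createOutputTopic("output", new StringDeserializer(), new StringDeserializer());

            in.pipeInput("key", "hello");
            System.out.println(out.readValue()); // prints HELLO
        }
    }
}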

Problems compiling routes after migrating to Play 2.1

After migrating to Play 2.1 I ran into a problem: the routes compiler stopped working for my routes file. It was completely fine with Play 2.0.4, but now I'm getting a build error and can't find any workaround for it.
In my project I'm using the cake pattern, so controller actions are visible not as <package>.<controller class>.<action>, but as <package>.<component registry>.<controller instance>.<action>. The new Play routes compiler uses all action path components except the last two to form the package name used in managed sources (as far as I can tell from the code at https://github.com/playframework/Play20/blob/2.1.0/framework/src/routes-compiler/src/main/scala/play/router/RoutesCompiler.scala). In my case this leads to a situation where <package>.<component registry> is chosen as the package name, which results in an error during the build:
[error] server/target/scala-2.10/src_managed/main/com/grumpycats/mmmtg/componentsRegistry/routes.java:5: componentsRegistry is already defined as object componentsRegistry
[error] package com.grumpycats.mmmtg.componentsRegistry;
I made the sample project to demonstrate this problem: https://github.com/rmihael/play-2.1-routes-problem
Is it possible to work around this problem somehow without dropping the cake pattern for controllers? It's a pity that I can't proceed with Play 2.1 due to this problem.
Because of reputation I cannot create a comment, so this has to be an answer.
The convention is that classes and objects start with upper case, and this convention applies to pattern matching as well. Looking at a string, there seems to be no difference between a package object and a normal object (apart from the case). I am not sure how Play 2.1 handles things; that's why this is more a comment than an answer.
You could try the new @ syntax in the router. That allows you to have controller instances created via the Global class. You would still specify <package>.<controller class>.<action>, but in Global you obtain the instance from somewhere else (for example a component registry).
You can find a bit of extra information under 'Managed Controller classes instantiation' here: http://www.playframework.com/documentation/2.1.0/Highlights
This demo project shows its usage: https://github.com/guillaumebort/play20-spring-demo
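A minimal sketch of that approach, with a route declared as GET / @controllers.Application.index() (ComponentRegistry here is a hypothetical stand-in for your own cake-pattern registry):
// app/Global.scala
import play.api.GlobalSettings

object Global extends GlobalSettings {
  // Play 2.1 calls this for every @-prefixed route; instead of letting Play
  // instantiate the controller, resolve it from your component registry.
  // ComponentRegistry.instanceOf is a hypothetical lookup in your own code.
  override def getControllerInstance[A](controllerClass: Class[A]): A =
    ComponentRegistry.instanceOf(controllerClass)
}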

Casbah Scala MongoDB driver - a strange error

I am trying to use Casbah, and I get a strange error right at the beginning, on this line:
val mongoDB = MongoConnection("MyDatabase")
the error on MongoConnection says:
class file needed by MongoConnection is missing. reference type
MongoOptions of package com.mongodb refers to nonexisting symbol.
I do not know what to do with this. The jars that I have attached to my projects are:
casbah-commons_2.9.1-3.0.0-SNAPSHOT.jar
casbah-core_2.9.1-3.0.0-SNAPSHOT.jar
casbah-gridfs_2.9.1-3.0.0-SNAPSHOT.jar
casbah-query_2.9.1-3.0.0-SNAPSHOT.jar
casbah-util_2.9.1-3.0.0-SNAPSHOT.jar
which looks like a full setup of Casbah, and I do not understand what it might be yearning for. So there is question number one: what do I have to do to resolve this problem?
Question number two is: the Casbah tutorial says that I could import just one thing and get the mongoConn() method, which is also not true. The mongoConn() method simply does not get found if I follow the instructions. So how can I achieve that everything works as in the tutorial?
I don't know the details of your setup, but it seems like you are not referencing the dependencies of the casbah-commons module.
According to the docs, those are:
mongo-java-driver, scalaj-collection, scalaj-time, JodaTime, slf4j-api
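If the project is built with sbt, an easier route is usually to let the dependency manager pull those in instead of collecting jars by hand. A sketch of a build.sbt (the group id and version number here are assumptions based on the 2.9.1-era artifacts; adjust them to the release you actually use):
// build.sbt -- group id and version are assumptions, adjust as needed
scalaVersion := "2.9.1"

// casbah-commons declares mongo-java-driver, scalaj-collection, scalaj-time,
// JodaTime and slf4j-api as dependencies, so they are fetched automatically.
libraryDependencies += "com.mongodb.casbah" %% "casbah" % "2.1.5-1"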

MongoDB Java / Scala drivers - Missing methods

I'm trying to convert a persistence layer from a plain old database (using ScalaQuery) to MongoDB, and I'm running into an odd issue. I use the Casbah driver, which is a Scala wrapper around the official MongoDB Java driver. Both the Java and Scala drivers define - according to the docs and to an inspection of the .jar in Eclipse - a findOneById method that takes a single DBObject as parameter (with an ID in it).
However, when I try to access it, I get a missing-method error from the Scala compiler, both in Eclipse and in SBT (Scala version 2.9.0-1, SBT 0.10.1).
What might cause this? Is this perhaps a known SBT / Scala compiler bug?
I just removed my entire local repository so all dependencies would be downloaded freshly, but this didn't fix the problem.
Are you sure that you call findOneByID on a MongoCollection instance?
Maybe it's the parameter type that is wrong: as far as I can see in the documentation (http://api.mongodb.org/scala/casbah/2.1.2/scaladoc/com/mongodb/casbah/MongoCollection.html), findOneByID should take an id of type AnyRef and, optionally, the fields to return.
You should try something like mongoCollection.findOneByID(1.asInstanceOf[Object]).
Regarding DBObject, it seems that it doesn't appear in the parameter list (except as an implicit parameter used to convert the fields that you request to a DBObject). Maybe the signature of the method changed since a previous release.
Hope this helps.
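A minimal sketch of that call, assuming Casbah's Imports are available and the collection's _id values are plain integers (database and collection names are placeholders):
import com.mongodb.casbah.Imports._

// Placeholders: adjust database and collection names to your setup.
val collection = MongoConnection()("mydb")("mycollection")

// findOneByID takes an AnyRef, so a primitive id must be boxed explicitly;
// Casbah wraps the result in an Option.
val doc: Option[DBObject] = collection.findOneByID(1.asInstanceOf[Object])
doc.foreach(println)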