Dependency Injection in Symfony Commands in TYPO3 - typo3

Let's say I have the following Symfony Command in TYPO3:
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class MyCommand extends Command
{
    public function __construct(string $name = null)
    {
        parent::__construct($name);
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        ...
    }
}
How is it possible to pass a class via dependency injection to the constructor without using the ObjectManager?

That depends on your TYPO3 version. Since v10, TYPO3 uses a PSR-11 container and therefore provides dependency injection for commands, so dependencies can be type-hinted in the constructor.
Prior to that version it is not possible. I would suggest moving your own logic out of the command into dedicated classes. Those can be loaded via the ObjectManager and therefore already use dependency injection.
That approach also separates your logic from the framework architecture (commands).
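For v10 and later, a minimal sketch could look like the following (MyService is a hypothetical class of your own, and the extension's Configuration/Services.yaml is assumed to enable autowire and autoconfigure):

use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class MyCommand extends Command
{
    /** @var MyService */
    private $myService;

    // MyService is resolved by the container and passed in automatically
    public function __construct(MyService $myService, string $name = null)
    {
        parent::__construct($name);
        $this->myService = $myService;
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // use $this->myService here instead of fetching it via the ObjectManager
        return 0;
    }
}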

Related

Adding custom jars to camel-k integration

Let's say I have a utility jar called validation.jar
which can validate some string to be forwarded.
How can I add this validation.jar to a Camel K integration in Minikube?
Example:
import com.validation.Util;
import org.apache.camel.builder.RouteBuilder;
public class MyRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        //from("timer:tick").log(Util.validate("dummy message - new"));
        from("timer:tick").log("dummy message - new");
    }
}
To get the com.validation.Util class we need validation.jar available to Camel K. How can I provide that?
One way of achieving your goal: store your Java classes together with a pom.xml for building the jar in a GitHub repository. After that you can use Jitpack.io; Jitpack will build the jar and store it in its registry. Finally, you can reference it as a dependency on the kamel run command.
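For example (the Maven coordinates below are hypothetical; substitute whatever Jitpack assigns to your repository, and note that the Jitpack repository may also need to be known to the Camel K platform's Maven settings):
kamel run -d mvn:com.github.yourname:validation:1.0.0 MyRoute.java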
Starting from Camel K 1.9.x, it is possible to provide a dependency located on the local filesystem by using the file:// prefix when specifying the dependency, as in the next example:
kamel run -d file://path/to/validation.jar MyRoute.java

Arango spring-data documentation no longer current. Not clear what you should do for new coding model

I am struggling with changes in the most recent arangodb-spring-data project code that don't align with the older documentation.
Here are my refreshed Gradle build dependencies:
plugins {
    id 'org.springframework.boot' version '2.2.0.RELEASE'
    id 'groovy'
}
apply plugin: 'io.spring.dependency-management'
...
dependencies {
    implementation 'org.springframework.boot:spring-boot-starter'
    //implementation 'org.codehaus.groovy:groovy'
    implementation group: 'org.codehaus.groovy', name: 'groovy-all', version: '3.0.0-rc-1'
    // testImplementation group: 'org.spockframework', name: 'spock-core', version: '1.3-groovy-2.5'
    developmentOnly 'org.springframework.boot:spring-boot-devtools'
    annotationProcessor 'org.springframework.boot:spring-boot-configuration-processor'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
    implementation 'com.arangodb:arangodb-spring-data:3.2.3'
    implementation 'com.arangodb:arangodb-java-driver:6.4.1'
}
I have an ArangoConfiguration class like this:
@Configuration
@EnableArangoRepositories(basePackages = [ "com.softwood.arango" ])
class ArangoConfiguration extends AbstractArangoConfiguration { //changed since 3.2.0
//class ArangoConfiguration {

    @Override
    public Builder arango() {
        return new ArangoDB.Builder().host("localhost", 8529).user("root").password(null)
    }

    @Override
    public String database() {
        return "testDB"
    }
}
However, the IDE now shows that extending AbstractArangoConfiguration is deprecated. There is now an interface, also called ArangoConfiguration, to use instead in my ArangoConfiguration class.
I tried changing the code to implement this interface and it doesn't work: I get a bean dependency failure because ArangoOperations is not defined, which doesn't happen when extending the deprecated abstract class. Spring fails to inject the autowired operations bean if you implement the ArangoConfiguration interface. A copy of the demo code is here in the GitHub sample code.
My CrudRunner looks like this:
@ComponentScan("com.softwood.arango")
public class CrudRunner implements CommandLineRunner {

    @Autowired
    private ArangoOperations operations

    @Autowired
    private OrganisationRepository orgRepo

    @Autowired
    private SiteRepository siteRepo

    @Autowired
    private OperatesFromManyRepository ownsRepo //edge relationship
    ....
The code works if I revert to the deprecated approach. However, I still get a separate, unrelated warning about a reflection problem when running on Java 11.0.5. It doesn't stop the code from working, but the code base is doing something that Java 11 doesn't like.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.springframework.cglib.core.ReflectUtils (file:/C:/Users/will/.gradle/caches/modules-2/files-2.1/org.springframework/spring-core/5.2.0.RELEASE/e0e1b3c304f70ed19d7905975f6f990916ada219/spring-core-5.2.0.RELEASE.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of org.springframework.cglib.core.ReflectUtils
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Nothing in the arangodb-spring-data usage documentation has been updated to reflect the go-forward approach with the new interface.
What should the latest (v3.0+) versions look like?
The Java 11 warning is a problem; the code still seems to run, but it will need a fix, and it doesn't happen on Java 8.
If anyone knows the new approach, can they share it? Otherwise I'll have to post on the Git project pages.
The documentation has been updated: https://github.com/arangodb/docs/pull/541
Furthermore, ArangoConfiguration does not add anything to AbstractArangoConfiguration, as you can see here: https://github.com/arangodb/spring-data/blob/4ae3130af345a0314a215605aa38fb7c88d41d5b/src/main/java/com/arangodb/springframework/config/AbstractArangoConfiguration.java, and it should work in exactly the same way (apart from the deprecation warning). So maybe your injection problems are due to some other reason.
Also: the illegal reflective access warnings on Java 11 are not caused by Spring Data ArangoDB but come from the Spring Framework: https://github.com/spring-projects/spring-framework/issues/22674
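For reference, a minimal sketch of the interface-based form, mirroring the Groovy config from the question (the class is renamed here only to avoid shadowing the interface name; apart from implements it is identical to the deprecated variant):

import com.arangodb.ArangoDB
import com.arangodb.springframework.annotation.EnableArangoRepositories
import com.arangodb.springframework.config.ArangoConfiguration
import org.springframework.context.annotation.Configuration

@Configuration
@EnableArangoRepositories(basePackages = ["com.softwood.arango"])
class MyArangoConfiguration implements ArangoConfiguration {

    @Override
    ArangoDB.Builder arango() {
        // same connection settings as in the question
        return new ArangoDB.Builder().host("localhost", 8529).user("root").password(null)
    }

    @Override
    String database() {
        return "testDB"
    }
}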

Guice Scala module - No valid constructors

I have a Play Framework application written in Scala. The problem arises when I want to add a new module for Silhouette. My module class is very similar to the one from the Silhouette example. I can run the application through sbt with a simple run command, but when I build a jar using sbt-assembly and try to run it I get:
No valid constructors
at play.api.inject.Modules$.$anonfun$constructModule$6(Module.scala:155)
at scala.Option.getOrElse(Option.scala:138)
at play.api.inject.Modules$.constructModule(Module.scala:155)
at play.api.inject.Modules$.$anonfun$locate$4(Module.scala:127)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:321)
at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:977)
at scala.collection.TraversableLike.map(TraversableLike.scala:237)
at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:51)
at scala.collection.SetLike.map(SetLike.scala:104)
at scala.collection.SetLike.map$(SetLike.scala:104)
at scala.collection.AbstractSet.map(Set.scala:51)
at play.api.inject.Modules$.locate(Module.scala:125)
at play.api.inject.guice.GuiceableModule$.loadModules(GuiceInjectorBuilder.scala:276)
at play.api.inject.guice.GuiceApplicationBuilder$.$anonfun$$lessinit$greater$default$9$1(GuiceApplicationBuilder.scala:30)
at play.api.inject.guice.GuiceApplicationBuilder.applicationModule(GuiceApplicationBuilder.scala:102)
at play.api.inject.guice.GuiceBuilder.injector(GuiceInjectorBuilder.scala:185)
at play.api.inject.guice.GuiceApplicationBuilder.build(GuiceApplicationBuilder.scala:137)
at play.api.inject.guice.GuiceApplicationLoader.load(GuiceApplicationLoader.scala:21)
at play.core.server.ProdServerStart$.start(ProdServerStart.scala:51)
at play.core.server.ProdServerStart$.main(ProdServerStart.scala:25)
at play.core.server.ProdServerStart.main(ProdServerStart.scala)
I had a similar issue and solved it by adding the configuration as a parameter to the module's constructor; for some reason Play searches for a constructor taking the configuration. I'm not sure if this is the same issue as yours.
import com.typesafe.config.Config;

@Inject
public TradeClearingWorkboardGuiceModule(Environment environment, Config configuration) {
}
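Since your application is in Scala, a hypothetical equivalent (module name and bindings are placeholders) would be to give the module a constructor taking Environment and Configuration, which Play's module loader can also resolve:

import com.google.inject.AbstractModule
import play.api.{Configuration, Environment}

class SilhouetteModule(environment: Environment, configuration: Configuration)
    extends AbstractModule {

  override def configure(): Unit = {
    // Silhouette bindings go here
  }
}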

Azure Functions: Can compile but cannot run with custom datalayer library

I've tried to come up with a better title but can't.
The issue is that I am new to Azure Functions, but I have made a simple one work that writes to a SQL Azure table. Now I've attempted to build the simplest kind of Entity Framework-based data layer and uploaded it. Right now it is compiled against .NET 4.6 and uses EF 6.1.3.
I'm using a connection string as per the second answer here and have checked it is being retrieved correctly. Update - I also used this guide.
Removing the #r "D:\home\site\wwwroot\sharedbin\TestDataLayer.dll" reference causes the editor to complain about missing assemblies, so it IS finding the dll in question.
However it will not run - it cannot find TestDataLayer.dll.
I'm only running this in the portal editor (I've not yet mastered deployment direct from a Visual Studio Project - don't laugh :P).
#r "System.Configuration"
#r "System.Data.Entity"
#r "D:\home\site\wwwroot\sharedbin\TestDataLayer.dll"
using System;
using System.Collections;
using System.Configuration;
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration.Conventions;
using System.Data.Entity.SqlServer;
using System.Threading.Tasks;
using System.ComponentModel.DataAnnotations;
using System.Net;
using TestDataLayer;
public static void Run(TimerInfo myTimer, TraceWriter log)
{
var connection = ConfigurationManager.ConnectionStrings["sql_connection"].ConnectionString;
using(var db = new SyncDbContext(connection))
{
var RK = new RKAzureTest() {TestField1 = "It finally worked?" };
db.RKAzureTests.Add(RK);
db.SaveChanges();
}
}
[DbConfigurationType(typeof(myDBContextConfig))]
public partial class SyncDbContext : System.Data.Entity.DbContext
{
public SyncDbContext(string cs) : base(cs) {}
public DbSet<RKAzureTest> RKAzureTests {get;set;}
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
// modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
}
}
public class myDBContextConfig : DbConfiguration
{
public myDBContextConfig()
{
SetProviderServices("System.Data.EntityClient",
System.Data.Entity.SqlServer.SqlProviderServices.Instance);
SetDefaultConnectionFactory(new System.Data.Entity.Infrastructure.SqlConnectionFactory());
}
}
This is the project.json:
{
    "frameworks": {
        "net46": {
            "dependencies": {
                "EntityFramework": "6.1.3"
            }
        }
    }
}
I've compiled the dll itself against .NET 4.6, after a suspicion that Azure Functions doesn't support .NET 4.7.1, and uploaded the compiled dll via Kudu to a sharedbin folder (I've checked the path a dozen times!).
This is the error thrown up:
2018-05-01T11:00:00.012 [Warning] Unable to find assembly 'TestDataLayer, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'. Are you missing a private assembly file?
2018-05-01T11:00:00.012 [Error] Exception while executing function: Functions.TimerTriggerCSharp1. mscorlib: Exception has been thrown by the target of an invocation. f-TimerTriggerCSharp1__514732255: Could not load file or assembly 'TestDataLayer, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified.
Not quite sure what else could be wrong - I'm using runtime version 1.0.11702 in the application settings, as I found life got a LOT more complicated if I went onto the beta version.
If anyone can point me to a working guide for this use case (Database first, EF 6.1.3 etc) I'd be grateful.
Any help offered gratefully received!
Thank you :)
Go to the Azure Portal, create a folder called 'bin' inside your Azure Function using the CMD shell, and upload the TestDataLayer.dll file to the bin folder that has just been created.
#r "System.Configuration"
#r "System.Data.Entity"
#r "TestDataLayer.dll"
The project structure should look like:
AzureFunctionProjectName001
    bin
        TestDataLayer.dll
    run.csx
    project.json
    project.lock.json
    ...
Azure Functions should be able to discover your library this time. I believe EntityFramework works just fine.

JaxB #XmlRootElement result in "Cannot resolve xml element declaration"

I was using JAXB DTOs to establish a stable interface between server and clients. Anyway, that doesn't matter. What matters is that I created a set of classes that result in the following compile error:
Cannot resolve XML element declaration with namespace 'namespace' and
name 'name' in this context
Eclipse underlined "name" inside quotes as an error. This class is manually created instead of xjc generated.
@XmlRootElement(name="name", namespace="namespace")
@XmlType(name="")
public class UserDTO {

    private UserType userType;

    @XmlElement
    public UserType getDTO() {
        return userType;
    }

    public void setDTO(UserType userType) {
        this.userType = userType;
    }
}
where UserType is an xjc-generated class:
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "userType", propOrder = {
    "userId",
    "userName"
})
public class UserType { /* getters and setters */ }
So basically UserDTO is just a wrapper that wraps up sub-JAXB types.
I'm not sure whether this is platform dependent (which it shouldn't be), but it's worth mentioning that this code worked perfectly in NetBeans; in Eclipse, however, the error prevents compilation.
The environment running the project was:
1. Mac OS X Lion
2. JDK: 1.6.0_37
3. Eclipse Version: Juno with Package 1
4. JAXB Platform: Generic JAXB 2.1
Can anyone please share some ideas?
P.S.: I added the JDK info and library settings as Manuel suggested.
I faced this issue: you have more than one schema (at least two), and neither of them has a namespace. Just assign a namespace to one of them.
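For instance, a minimal sketch of giving one of the schemas an explicit target namespace (the URI below is just a placeholder):

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/user"
           xmlns="http://example.com/user"
           elementFormDefault="qualified">
    <!-- element and type declarations of this schema -->
</xs:schema>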