I'm trying to configure the VS Code phpunit extension to run tests using a docker-compose file that lives outside the project (Laradock).
Directory structure:
W:\MyLaravelProject
W:\Laradock
docker-compose.yml
I have successfully set this up in PhpStorm. However, I would like to switch to VS Code, but I cannot make it work.
Here is the launch command built by PhpStorm when I run the test:
[docker-compose://[W:\Laradock\docker-compose.yml]:php-fpm/]:php /var/www/MyLaravelProject/vendor/phpunit/phpunit/phpunit --configuration /var/www/MyLaravelProject/phpunit.xml --filter "/(::testOne)( .*)?$/" Tests\Feature\FirstTest /var/www/MyLaravelProject/tests/Feature/FirstTest.php --teamcity
Here is my current VS Code configuration:
"phpunit.driverPriority": [
"Docker",
"Composer",
"Path",
"Phar",
"Ssh",
"GlobalPhpUnit"
],
"phpunit.clearOutputOnRun": true,
"phpunit.php": "docker-compose run php-fpm",
"phpunit.phpunit": "/vendor/phpunit/phpunit/phpunit",
"phpunit.paths": {
"W:\\MyLaravelProject": "/var/www/MyLaravelProject"
}
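One likely problem with the configuration above: a plain "docker-compose run" issued from the project directory cannot find a compose file that lives in W:\Laradock. A minimal sketch of the command that would need to be issued, assuming the paths from the question (the -f flag is standard docker-compose; the /w/... path mapping is an assumption, and whether the extension can be configured to pass -f depends on the extension version). The command is echoed here rather than executed:

```shell
# Sketch (paths are assumptions from the question; on Windows the W:\ drive
# would map differently depending on the Docker setup): point docker-compose
# at the external compose file with -f, then run phpunit in the php-fpm service.
compose_file="/w/Laradock/docker-compose.yml"
phpunit="/var/www/MyLaravelProject/vendor/phpunit/phpunit/phpunit"
echo docker-compose -f "$compose_file" run --rm php-fpm php "$phpunit" \
  --configuration /var/www/MyLaravelProject/phpunit.xml
```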
I expect the test to be run successfully using docker-compose.
How can I configure phpunit to use docker-compose?
How can I specify another directory for docker-compose?
Thank you in advance for your help
I'm surely late with this answer, but maybe I can help someone else.
I had a similar problem and I solved it using the Remote - Containers extension.
It lets you use the phpunit command inside the container.
You just have to install it and attach the extension to the workspace container, then run tests normally.
You can also change settings only in the attached remote, so you can adjust the phpunit command path.
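For example, in the attached remote's settings the command path can point at the in-container binary. A hypothetical fragment, reusing the setting keys from the question (the exact keys depend on the phpunit extension version):

```json
{
  "phpunit.php": "php",
  "phpunit.phpunit": "/var/www/MyLaravelProject/vendor/phpunit/phpunit/phpunit"
}
```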
I'm trying to add an Azure Function project to a docker-compose file created in Visual Studio 2019 (16.7.6), but this causes the solution to fail to build.
(Docker for Windows 2.4.0.0 (48506), with WSL2 support enabled, running in Linux container mode on Windows 10 Pro 2004)
Steps to reproduce:
Create a new solution with a Web Api project 'Web' that includes docker support.
Add a new Azure Function project 'Func' with an Http trigger function, then add docker support via the Visual Studio Add > Docker Support option.
Add 'Container Orchestration Support' to the Web project to generate a docker-compose.yml file that includes the web app.
At this point the solution builds, and debugging works for the web or func app in docker or docker-compose - all good.
When the 'Func' project is added manually to docker-compose.yml, the solution no longer builds:
CTC1031 Linux containers are not supported for Func project
Project: docker-compose
File: Microsoft.VisualStudio.Docker.Compose.targets
Line 303
However, I can run docker-compose fine from the command line:
docker-compose -f docker-compose.yml up
and both Web and Func app start up fine.
My docker-compose.yml file is
version: '3.4'
services:
web:
image: ${DOCKER_REGISTRY-}web
build:
context: .
dockerfile: Web/Dockerfile
func:
image: ${DOCKER_REGISTRY-}func
build:
context: .
dockerfile: Func/Dockerfile
Any ideas why I get the above error when building the solution in Visual Studio?
I have the same issue. It seems the docker-compose project type only allows dependencies on specific project types, like Sdk="Microsoft.NET.Sdk.Web", and does not allow library projects like Sdk="Microsoft.NET.Sdk". Referencing a Dockerfile from docker-compose.yml automatically sets up that reference.
With the wrong project type, compose itself works, but compilation of the docker-compose project fails.
I got the error as well:
CTC1031 Linux containers are not supported for
It probably means that your Output type for the project is set to Class Library instead of Console Application.
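That setting lives in the project file. A hypothetical Func.csproj fragment illustrating the change (OutputType is a standard MSBuild property; everything else about the project file is assumed):

```xml
<!-- Sketch: make the Func project build as an executable rather than a
     class library, which is what "Console Application" maps to. -->
<PropertyGroup>
  <OutputType>Exe</OutputType>
</PropertyGroup>
```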
I have installed Jenkins on Windows 10, and each time I try to execute a Maven project or just run mvn clean test (command line), Jenkins decides that my new workspace should be C:\Windows\system32\config\systemprofile\eclipse-workspace\ while my project is in C:\Users\username\eclipse-workspace.
Jenkins starts in the directory C:\Windows\system32\config\systemprofile\AppData\Local\Jenkins.jenkins\workspace\projectName, and even if I run a cd command I still have this problem:
The driver executable must exist:
C:\Windows\system32\config\systemprofile\eclipse-workspace\projectname\drivers\chromedriver\chromedriver.exe
Obviously my chromedriver is not there; it's in C:\Users\userName\eclipse-workspace\projectName\drivers\chromedriver.
It looks like Jenkins changes my user.home.
I went to the config file and set:
<workspaceDir>C:\Users\userName\eclipse-workspace</workspaceDir>
but it's still looking for the driver in C:\Windows\system32\config\systemprofile\eclipse-workspace\projectname\drivers\chromedriver\chromedriver.exe
This part, C:\Windows\system32\config\systemprofile\, is what Java's System.getProperty("user.home") returns. Running under Jenkins seems to modify it.
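The mismatch can be confirmed from a Jenkins build step by printing the home directory of the account the build actually runs under. A minimal sketch (on Windows the variable would be %USERPROFILE%; alternatively, java -XshowSettings:properties -version prints user.home directly):

```shell
# Sketch: print the home directory the build process sees. When Jenkins runs
# as the LocalSystem service account, this is not C:\Users\<name>.
echo "home directory: $HOME"
```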
Why is it looking for my driver there?
Why can't it just stick to my workspace folder?
How can I solve this?
Thank you
I think I solved it. I was running under the Local System account:
https://jenkins-le-guide-complet.github.io/html/sect-windows-service.html#fig-hudson-windows-service-config
I edited the service as shown in the link.
I have a packer configuration that provisions using chef-solo in AWS EC2. This works well. I have introduced berkshelf to manage 3rd party cookbooks and this isn't working as well.
I am working in a chef repo that has locally developed cookbooks, roles, data bags, etc. By introducing berks, I want to keep the cookbooks directory clean and put 3rd-party cookbooks into vendor/cookbooks (which is git-excluded, keeping the repo clean and minimizing the chance of other devs adding/pushing berks-managed cookbooks into VCS). So I added a shell-local provisioner before the chef-solo provisioner, which runs berks vendor vendor/cookbooks, and updated the chef-solo provisioner with cookbook_paths ["cookbooks","vendor/cookbooks"]. My idea is that the shell-local would run before the chef-solo and both cookbook paths would be available.
However, when I run packer build, it fails fast trying to resolve the cookbook paths before the AWS builder even starts building, failing with reference to the non-existent vendor/cookbooks directory. Here is the packer provisioners segment:
"provisioners" : [
{
"type": "shell-local",
"command": "bundle install && bundle exec berks vendor vendor/cookbooks"
},
{
"type" : "chef-solo",
"cookbook_paths" : ["cookbooks","vendor/cookbooks"],
"environments_path" : "environments",
"roles_path" : "roles",
"run_list" : ["role[somerole]"]
}
],
When I run this, it fails:
amazon-ebs output will be in this color.
1 error(s) occurred:
* Bad cookbook path 'vendor/cookbooks': stat vendor/cookbooks: no such file or directory
Is there a mechanism in packer that will run the shell-local provisioner before resolving the chef-solo provisioner? I'd like to avoid running berks in the builder (i.e. I want the cookbooks resolved by the host running packer), and would ideally like to have this run solely in packer as opposed to wrapper scripts that run berks first. I have worked around it for now by vendoring into cookbooks, but would like to avoid that route if possible.
Just create an empty directory vendor/cookbooks.
Is there a mechanism in packer that will run the shell-local first before resolving the chef-solo provisioner?
No
and would ideally like to have this run solely in packer as opposed to wrapper scripts that run berks first.
I would recommend reconsidering whenever you have another problem like this. Packer tries to do one thing well and leaves a lot out that is better solved by a wrapper script.
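The suggested workaround can be sketched as a one-time step on the machine running packer:

```shell
# Sketch: pre-create the directory that Packer validates at template-load
# time, so the chef-solo cookbook_paths check passes before "berks vendor"
# populates it in the shell-local provisioner.
mkdir -p vendor/cookbooks
```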
I use IntelliJ IDEA to run the NiFi source code from bootstrap/.../RunNiFi.main("start") (github link), but it fails like this: picture here
Can you help me?
Thanks.
Also, should I run the code from bootstrap/.../RunNiFi.main("start")?
You won't be able to launch a single Java class from your IDE to start NiFi; the classpath won't be set up correctly.
You'll need to run a full build and get the assembly from nifi-assembly/target and then run NiFi from the built assembly by using "bin/nifi.sh start".
You can run it directly from the unpacked assembly target, for example:
cd nifi-assembly/target/nifi-1.6.0-SNAPSHOT-bin/nifi-1.6.0-SNAPSHOT
./bin/nifi.sh start
EDIT: You can still use a debugger, but it will be debugging a remote Java application. In NiFi's bootstrap.conf, uncomment the following line and restart NiFi:
#java.arg.debug=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=8000
This tells the NiFi JVM to listen for incoming debug requests on port 8000.
In your IDE, create a Remote debugging configuration and connect to localhost port 8000. Your break points should work like normal.
If you can read Chinese, you can read my personal blog (link here).
I found the methods.
If you want to remote-debug NiFi, please follow the remote debug link.
If you want to debug locally, you can follow the steps below.
(I used Intellij idea in windows to debug)
1.
$ git config --global core.longpaths true
$ git config --global core.autocrlf false
Open IntelliJ IDEA and git clone https://github.com/apache/nifi,
picture here,
keep the default import configuration, next...
After the project is opened, NiFi may show error messages; just ignore them.
mvn -T 2 clean install -DskipTests
Configure the debugger:
picture here,
picture here
Acknowledgements:
nifi quick start link: https://nifi.apache.org/quickstart.html
Running NiFi in Debug mode link: https://cwiki.apache.org/confluence/display/NIFI/Contributor+Guide#ContributorGuide-RunningNiFiinDebugmode
I want to run a grunt.js file inside a folder that is inside a netbeans project.
Test 1:
When I check whether grunt is working correctly in c:\Users\someuser with grunt -v,
it prints the grunt version info.
Test 2:
When I'm inside the project folder that contains grunt,
c:\wamp\www\project\grunt\ (with grunt.js inside),
I run grunt or grunt -v (just for testing) and I get this:
The launcher has determined that the parent process has a console and will reuse
it for its own console output.
Closing the console will result in termination of the running program.
Use '--console suppress' to suppress console output.
Use '--console new' to create a separate console window.
What can I do to fix this?
Thank you
Fixed. My mistake: my grunt file was named Grunt.js, not Gruntfile.js.
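The fix above can be sketched as follows (the touch is only there to make the example self-contained):

```shell
# Sketch: grunt looks for a file named exactly "Gruntfile.js"; a file named
# "Grunt.js" is silently ignored. Rename the misnamed file:
touch Grunt.js            # stand-in for the existing misnamed file
mv Grunt.js Gruntfile.js
ls Gruntfile.js
```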