I have a Dancer webapp that I'm trying to test, and I have a few issues related to paths. It appears that the appdir setting isn't being set correctly.
The top of my test code is:
use MyApp;
use Dancer::Test;
If I dump out the settings (using dd from Data::Dump) with:
my $settings = Dancer::Config::settings();
dd $settings;
I get:
{ appdir => "/Library/WebServer/Documents/myapp/lib",
apphandler => "Standalone",
auto_reload => 0,
charset => "",
confdir => "/Library/WebServer/Documents/myapp/lib",
content_type => "text/html",
daemon => 0,
engines => {},
envdir => "/Library/WebServer/Documents/myapp/lib/environments",
environment => "development",
handlers => {},
logger => "file",
plugins => {},
port => 3000,
public => "/Library/WebServer/Documents/myapp/lib/public",
server => "0.0.0.0",
server_tokens => 1,
startup_info => 1,
template => "simple",
traces => 0,
views => "/Library/WebServer/Documents/myapp/lib/views",
warnings => 0}
Clearly it's not setting appdir correctly. It doesn't seem to matter how I invoke the code (i.e. what the working directory is).
I've been running it as perl -I lib -I ../lib t/001_base.t, with the extra include paths because MyApp.pm needs modules in both lib and ../lib.
When I run the same code as a standalone webapp, i.e. bin/app.pl containing:
use Dancer;
use MyApp;
dance;
dumping the settings gives me the right appdir, the configuration file gets loaded, and all the other settings are set correctly.
In my test code, I thought that adding the lines:
Dancer::set(appdir => "/Library/WebServer/Documents/myapp");
Dancer::Config->load;
would do it. That sets appdir correctly, but it doesn't change any of the other parameters. Looking at the Dancer::Test code, there is an import routine that appears to do what I want, but it didn't help at all. Any other thoughts?
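For reference, here is roughly what the whole test file looks like with that attempt in place (a minimal sketch; the hard-coded appdir is the path from the dump above, and putting the calls after the use statements is my assumption):

use strict;
use warnings;

use MyApp;
use Dancer::Test;
use Data::Dump qw(dd);

# try to point Dancer at the real application directory and reload the config;
# in practice only appdir changes and the other settings stay as dumped above
Dancer::set(appdir => "/Library/WebServer/Documents/myapp");
Dancer::Config->load;

my $settings = Dancer::Config::settings();
dd $settings;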
I'm trying to use helhum/dotenv-connector in my TYPO3 project.
I have done the following:
My composer.json:
{
"require": {
"typo3/cms": "^8.5",
"helhum/dotenv-connector": "1.0.0",
"helhum/typo3-console": "^4.1"
},
"extra": {
"helhum/typo3-console": {
"install-extension-dummy": false
},
"typo3/cms": {
"cms-package-dir": "{$vendor-dir}/typo3/cms",
"web-dir": "web"
},
"helhum/dotenv-connector": {
"env-dir": "",
"allow-overrides": true,
"cache-dir": "var/cache"
}
}
}
Then I ran
composer install
After that I set up TYPO3 using the command
php vendor/bin/typo3cms install:setup
This should be equivalent to doing the install the "normal" way.
After that, I placed a .env next to my composer.json.
This .env contains the following:
TYPO3_CONTEXT="Development"
TYPO3__DB__database="dotenvconnector"
TYPO3__DB__host="127.0.0.1"
TYPO3__DB__password="root"
TYPO3__DB__port="3306"
TYPO3__DB__username="root"
Then I removed all DB information from web/typo3conf/LocalConfiguration.php using the typo3_console command
php vendor/bin/typo3cms configuration:remove DB
I then ran composer install and composer update again.
When calling TYPO3 in the browser now, it keeps telling me
The requested database connection named "Default" has not been configured.
So what am I missing? Obviously my .env is not being parsed or used at all.
FYI: the cache file is written to var/cache with the following content:
<?php
putenv('TYPO3__DB__database=dotenvconnector');
$_ENV['TYPO3__DB__database'] = 'dotenvconnector';
$_SERVER['TYPO3__DB__database'] = 'dotenvconnector';
putenv('TYPO3__DB__host=localhost');
$_ENV['TYPO3__DB__host'] = 'localhost';
$_SERVER['TYPO3__DB__host'] = 'localhost';
putenv('TYPO3__DB__password=root');
$_ENV['TYPO3__DB__password'] = 'root';
$_SERVER['TYPO3__DB__password'] = 'root';
putenv('TYPO3__DB__port=3306');
$_ENV['TYPO3__DB__port'] = '3306';
$_SERVER['TYPO3__DB__port'] = '3306';
putenv('TYPO3__DB__username=root');
$_ENV['TYPO3__DB__username'] = 'root';
$_SERVER['TYPO3__DB__username'] = 'root';
Our setups work like this:
AdditionalConfiguration.php
$loader = new Dotenv\Dotenv(__DIR__ . '/../../', '.env.defaults');
$loader->load();
$loader = new Dotenv\Dotenv(__DIR__ . '/../../');
$loader->overload();
Note that we run with a .env.defaults file that holds the standard config (no users or passwords, of course), which we then overload with a custom .env file per user/environment; a small sketch of such a file follows the LocalConfiguration.php excerpt below.
This helps a lot when adding new functionality that requires a new .env setting, so other people on the team don't run into fatals or exceptions.
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['dbname'] = getenv('TYPO3_DB_NAME');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['host'] = getenv('TYPO3_DB_HOST');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['password'] = getenv('TYPO3_DB_PASSWORD');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['user'] = getenv('TYPO3_DB_USER');
LocalConfiguration.php
return [
'BE' => [
'debug' => '<set by dotenv>',
'explicitADmode' => 'explicitAllow',
'installToolPassword' => '<set by dotenv>',
'loginSecurityLevel' => 'rsa',
'sessionTimeout' => '<set by dotenv>',
],
'DB' => [
'Connections' => [
'Default' => [
'charset' => 'utf8',
'dbname' => '<set by dotenv>',
'driver' => 'mysqli',
'host' => '<set by dotenv>',
'password' => '<set by dotenv>',
'port' => 3306,
'user' => '<set by dotenv>',
],
],
]...
I didn't paste the entire config but I think you get the point.
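For illustration, a .env.defaults along those lines could look like this (a hypothetical sketch: the variable names come from the getenv() calls above, and the values are placeholders with no real users or passwords):

TYPO3_DB_NAME=typo3
TYPO3_DB_HOST=localhost
TYPO3_DB_USER=
TYPO3_DB_PASSWORD=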
The dotenv-connector reads the .env file into the environment, but it does not assign any values to TYPO3 configuration variables. You should be able to read them with getenv() in your PHP code.
The connector is not specifically geared towards TYPO3; it is a general tool for any Composer-based PHP application. It would therefore be out of scope for that project to know about TYPO3-specific variable assignments.
There is another project, the configuration loader, that can help assign environment variables to TYPO3 configuration variables.
.env -dotenv-connector-> environment -configuration-loader-> $GLOBALS['TYPO3_CONF_VARS']
The configuration loader can be found at https://github.com/helhum/config-loader , and an example of it all wired together is at https://github.com/helhum/TYPO3-Distribution .
You don't have to use the configuration loader. You could also assign the values manually with getenv().
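As a minimal sketch (assuming the dotenv-connector has already put the variables from the question's .env into the environment), the manual assignment in typo3conf/AdditionalConfiguration.php could look like this:

<?php
// map the environment variables from the .env shown in the question
// onto the TYPO3 database configuration
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['dbname'] = getenv('TYPO3__DB__database');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['host'] = getenv('TYPO3__DB__host');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['port'] = getenv('TYPO3__DB__port');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['user'] = getenv('TYPO3__DB__username');
$GLOBALS['TYPO3_CONF_VARS']['DB']['Connections']['Default']['password'] = getenv('TYPO3__DB__password');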
One important note for PHP 7.2 (on TYPO3 v9) and the use of the Argon2 hash:
You must use single quotes/ticks around the values in the .env file.
Example:
Instead of my_value="foobar"
write my_value='foobar'
Data defined inside the Catalyst app or in templates has correct encoding and is displayed well, but everything non-Latin1 coming from the database is converted to ?. I suppose the problem is in the model class, which looks like this:
use strict;
use base 'Catalyst::Model::DBIC::Schema';
__PACKAGE__->config(
schema_class => 'vhinnad::Schema::DB',
connect_info => {
dsn => 'dbi:mysql:test',
user => 'user',
password => 'password',
{
AutoCommit => 1,
RaiseError => 1,
mysql_enable_utf8 => 1,
},
'on_connect_do' => [
'SET NAMES utf8',
],
}
);
1;
I see no flaws here, but something must be wrong. I have also used my schema with test scripts, and there the data was encoded correctly and the output was fine, but inside the Catalyst app I did not get the encoding right. Where might the problem be?
EDIT
For future reference, I'm putting the solution here: I mixed the old and new styles in connect_info.
The old style is like (dsn, username, password, hashref_of_options, hashref_of_other_options).
The new style is (dsn => $dsn, username => $username, etc.), so the right thing is to use:
connect_info => {
dsn => 'dbi:mysql:test',
user => 'user',
password => 'password',
AutoCommit => 1,
RaiseError => 1,
mysql_enable_utf8 => 1,
on_connect_do => [
'SET NAMES utf8',
],
}
In a typical Catalyst setup with Catalyst::View::TT and Catalyst::Model::DBIC::Schema you'll need several things for UTF-8 to work:
add Catalyst::Plugin::Unicode::Encoding to your Catalyst app
add encoding => 'UTF-8' to your app config
add ENCODING => 'utf-8' to your TT view config (a sketch of this and the app config follows this list)
add <meta http-equiv="Content-type" content="text/html; charset=UTF-8"/> to the <head> section of your HTML to satisfy old IEs that ignore the Content-Type: text/html; charset=utf-8 HTTP header set by Catalyst::Plugin::Unicode::Encoding
make sure your text editor saves your templates as UTF-8 if they include non-ASCII characters
configure your DBIC model according to DBIx::Class::Manual::Cookbook#Using Unicode
if you use Catalyst::Authentication::Store::LDAP configure your LDAP stores to return UTF-8 by adding ldap_server_options => { raw => 'dn' }
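A minimal sketch of the first three points (package and view names are placeholders, and the two packages would normally live in their own files under lib/):

# lib/MyApp.pm
package MyApp;
use strict;
use warnings;
use Catalyst qw/Unicode::Encoding/;   # loads Catalyst::Plugin::Unicode::Encoding

__PACKAGE__->config(
    name     => 'MyApp',
    encoding => 'UTF-8',              # app-level encoding
);
__PACKAGE__->setup();

# lib/MyApp/View/TT.pm
package MyApp::View::TT;
use strict;
use warnings;
use base 'Catalyst::View::TT';

__PACKAGE__->config(
    ENCODING => 'utf-8',              # decode templates as UTF-8
);

1;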
According to Catalyst::Model::DBIC::Schema#connect_info:
The old arrayref style with hashrefs for DBI then DBIx::Class options is also supported.
But you are already using the 'new' style, so you shouldn't nest the DBI attributes:
connect_info => {
dsn => 'dbi:mysql:test',
user => 'user',
password => 'password',
AutoCommit => 1,
RaiseError => 1,
mysql_enable_utf8 => 1,
on_connect_do => [
'SET NAMES utf8',
],
}
This advice assumes you have fairly up-to-date versions of DBIC and Catalyst.
This is not necessary: on_connect_do => [ 'SET NAMES utf8' ]
Ensure the table and column charsets in your DB are UTF-8. You can get results that sometimes look right even when parts of the chain are broken, but the DB must actually be storing the character data as UTF-8 if you expect the entire chain to work.
Ensure you're using and configuring Catalyst::Plugin::Unicode::Encoding in your Catalyst app. It had some fairly serious bugs in the not-too-distant past, so get the newest version.
I want to convert a project from ExtUtils::MakeMaker to Module::Build.
As the Makefile.PL is mostly default and Module::Build::Convert did not work for me (see below), I want to convert it manually, but I could not find the equivalent of INST_SCRIPT for placing the executables in Perl's bin/ directory.
My WriteMakefile call looks like this:
WriteMakefile(
NAME => 'Project',
AUTHOR => q{Mugen Kenichi <mugen.kenichi#uninets.eu>},
VERSION_FROM => 'lib/Project.pm',
INST_SCRIPT => 'script/',
($ExtUtils::MakeMaker::VERSION >= 6.3002
? ('LICENSE'=> 'perl')
: ()),
PL_FILES => {},
PREREQ_PM => {
'JSON' => 0,
'Log::Log4perl' => 0,
'Proc::Daemon' => 0,
'Term::ANSIColor' => 0,
'MooseX::Declare' => 0.34,
'MooseX::Log::Log4perl' => 0,
'Moose::Util::TypeConstraints' => 0,
'MooseX::Templated::Role' => 0,
'Template' => 0,
# for testing
'Test::More' => 0,
'MooseX::Params::Validate' => 0,
'File::Temp' => 0,
'Sub::Exporter::ForMethods' => 0,
'Data::Section' => 0,
},
dist => { COMPRESS => 'gzip -9f', SUFFIX => 'gz', },
clean => { FILES => 'Project-*' },
);
I tried to use Module::Build::Convert, but make2build throws errors I could not resolve:
Variable "$regex" will not stay shared at (re_eval 32) line 1.
Use of uninitialized value $lines[0] in pattern match (m//) at /home/mak/perl5/lib/perl5/Module/Build/Convert.pm line 1305, <DATA> line 1.
perl version:
perl -v
This is perl 5, version 12, subversion 3 (v5.12.3) built for x86_64-linux
If that's what your Makefile.PL looks like, leave it like that. Don't switch to Module::Build, which appears to be an abandoned build system. No one maintains Module::Build anymore, and until Leon Timmermans comes out with the next thing, there's no reason to convert to it unless there's some feature in Module::Build you absolutely must have.
Having said that though, I create the list of script files and use it as the value for script_files. It's not as nice. See my Build.PL for Unicode::Tussle.
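For what it's worth, a Build.PL along those lines might look like this (a minimal sketch, not a tested conversion; the script/* glob mirrors the INST_SCRIPT => 'script/' setting above, and the prerequisite lists are abbreviated):

use strict;
use warnings;
use Module::Build;

# script_files is the Module::Build counterpart for installing executables:
# everything listed here is installed into Perl's bin/ directory on ./Build install
my $build = Module::Build->new(
    module_name       => 'Project',
    dist_author       => 'Mugen Kenichi <mugen.kenichi#uninets.eu>',
    dist_version_from => 'lib/Project.pm',
    license           => 'perl',
    script_files      => [ glob('script/*') ],
    requires          => {
        'JSON'          => 0,
        'Log::Log4perl' => 0,
        'Proc::Daemon'  => 0,
        # remaining runtime prerequisites from the Makefile.PL go here
    },
    build_requires    => {
        'Test::More' => 0,
        # test-only prerequisites from the Makefile.PL go here
    },
);
$build->create_build_script;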
I am developing a library and scripts in Perl. For distribution I am using ExtUtils::MakeMaker. I have some configuration and data files in a directory called data in the distribution path; for example, the config file is data/config.ini and the data files are like data/inv01.stb. Part of the Makefile.PL code follows:
use ExtUtils::MakeMaker;
my $inifile = 'data/config.ini';
my @data = <data/*.stb>;
WriteMakefile(
NAME => 'Mymodule',
VERSION_FROM => 'lib/Mymodule.pm',
PREREQ_PM => {
'Time::HiRes' => 0,
'Storable' => 0,
'File::Path' => 0,
'File::Copy' => 0,
'Digest::CRC' => 0,
'Digest::MD5' => 0,
'Archive::Tar' => 0,
},
EXE_FILES => [ qw(scripts/check_requests.pl scripts/proc_requests.pl scripts/send_requests.pl) ],
'clean' => {FILES => clean_files()},
);
# Delete *~ files
sub clean_files {
return join(" ", "*.out", "*~", "data/test/*");
}
How can I configure the Makefile.PL to install those files into a non-standard directory?
Thanks for your help.
Why not use EXE_FILES? After all, the files are not going to be checked for runnability.
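To make that concrete (a sketch only, reusing the $inifile and @data variables from the question; whether installing data files next to the scripts suits your layout is for you to decide), the WriteMakefile call could become:

use ExtUtils::MakeMaker;

my $inifile = 'data/config.ini';
my @data    = <data/*.stb>;

WriteMakefile(
    NAME         => 'Mymodule',
    VERSION_FROM => 'lib/Mymodule.pm',
    # list the data files alongside the scripts; MakeMaker copies everything
    # in EXE_FILES into the script installation directory
    EXE_FILES => [
        qw(scripts/check_requests.pl scripts/proc_requests.pl scripts/send_requests.pl),
        $inifile,
        @data,
    ],
);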
I'm trying to use Apache::Session::Memcached in an HTML::Mason project where I'm using MasonX::Request::WithApacheSession to handle my sessions. Unfortunately Apache will not launch when I plug in the Memcached module instead of the MySQL one. My custom handler looks something like this (a few snips here and there):
my $ah = HTML::Mason::ApacheHandler->new (
comp_root => $ENV{HTDOCS},
data_dir => $data_dir,
request_class => 'MasonX::Request::WithApacheSession',
session_use_cookie => 0,
args_method => "mod_perl",
session_args_param => 'session_id',
session_class => 'Apache::Session::Memcached',
session_Servers => '127.0.0.1:20000',
session_Readonly => 0,
session_Debug => 1,
session_cookie_domain => $CONF->{global}->{site_name},
session_cookie_expires => "session",
session_allow_invalid_id => 0,
);
The problem I'm running into is that the session_* parameters specific to Memcached are not being passed through to Apache::Session::Memcached as the docs say they should be. This results in this error:
The following parameter was passed in the call to HTML::Mason::ApacheHandler->new()
but was not listed in the validation options: session_Servers
Now, I have gone through and swapped all three of the upper-case arguments to lower case, to no avail. And the docs for Apache::Session::Memcached do list them as upper case.
Thanks a ton for any help.
It looks like you need to register Apache::Session::Memcached with Apache::Session::Wrapper, following the instructions at http://search.cpan.org/perldoc/Apache::Session::Wrapper#REGISTERING_CLASSES like so (code courtesy Jack M.):
Apache::Session::Wrapper::->RegisterClass(
'name' => 'Apache::Session::Memcached',
'required' => [ [ 'Servers' ], ],
'optional' => [ 'NoRehash', 'Readonly', 'Debug', 'CompressThreshold', ],
);