Where do I put common server-side JavaScript files used by most of my jobs? I do not want to get fancy and create a new Node module, I just need a place to put a couple of utility functions.
A Node module is the only approach I could find that works. I created a .js file (for instance utils.js) in the package folder where jobs/ and widgets/ are, and put all my common code in it for export:
module.exports = {
    commonFunction: function () { ... },
    // ... more utility functions
};
In my jobs, I import the common code I need with:
var utils = require('../../utils.js');
and use the exported properties offered by utils.
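As a concrete example (the job path and the formatNumber helper are hypothetical names of mine, not from the question), the pair of files could look like this:
// utils.js, next to jobs/ and widgets/
module.exports = {
    formatNumber: function (n) {
        // insert thousands separators: 1234567 -> "1,234,567"
        return n.toString().replace(/\B(?=(\d{3})+(?!\d))/g, ",");
    }
};
// jobs/sample/sample.js
var utils = require('../../utils.js');
console.log(utils.formatNumber(1234567)); // prints "1,234,567"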
Related
I would like to structure my project into several packages. Each package should be its own namespace (so as to avoid conflicting filenames), but within a package I want everything to be in the same namespace (without having to put all the files in the same folder; I'd like different folders).
In practice I would like this structure
Project
main.m
commonLibrary
+part1Project
mainPart1.m
otherFolder
supportFile.m
+part2Project
mainPart2.m
otherFolder2
supportFile2.m
This is the behaviour I would like:
When in main.m, I want to be able to call everything in commonLibrary and everything in any sub-project, including the functions inside the subfolders. So I would like to call part1Project.supportFile.
When in mainPart1.m, I want to call the support files without using the prefix of the current package (i.e. I want to call supportFile directly).
When in mainPart2.m, I want to call supportFile2 directly. If I want access to files in part 1 of the project, I can call part1Project.supportFile.
The current setup is that I added the Project folder and all the subfolders to the MATLAB path. But this means that:
I CANNOT call supportFile from anywhere: not from main (part1Project.supportFile will not work) and not even from mainPart1 (supportFile can't be found).
In much the same way, it is hard to access elements of part1Project from part2Project.
How can I achieve the behaviour I want?
You cannot access functions within a subfolder of a package unless that subfolder is a private folder, in which case they will only be accessible to the functions in the immediate parent folder.
If you do use the private folder approach, then you can call functions within this private folder from the functions in the containing folder without using the fully-qualified package name.
Your layout would look like:
Project
main.m
commonLibrary
+part1Project
mainPart1.m
private
supportFile.m
+part2Project
mainPart2.m
private
supportFile2.m
Your first point will not work but the other two will. There is no built-in way to accomplish the first point.
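For example, with that layout the calls would look like this (a sketch; the function bodies are placeholders):
% +part1Project/mainPart1.m
function mainPart1()
    supportFile()               % found in the private folder, no package prefix needed
end
% main.m
part1Project.mainPart1()        % works: mainPart1 is a package function
% part1Project.supportFile()    % does NOT work: private functions are invisible outside +part1Project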
Another option would be to use import statements in all functions within each package such that it imports all package members at the beginning of the function.
Your layout would look like
Project
main.m
commonLibrary
+part1Project
mainPart1.m
supportFile.m
+part2Project
mainPart2.m
supportFile2.m
And the contents of mainPart1.m (and any other function in the package) would look something like:
function mainPart1()
% Import the entire namespace
import part1Project.*
% No package name required
supportFile()
end
And then from main you could access supportFile:
function main()
part1Project.supportFile()
end
I just started using Puppet and I don't know how to execute classes.
I have my files config.pp, init.pp, install.pp and service.pp.
For example install.pp :
class sshd::install{ ... }
Next, I declare my class in init.pp with include sshd::install.
I also tried to run the classes with:
class{'sshd::install':} -> class{'sshd::config':} ~> class{'sshd::service':}
After that, I launch puppet apply init.pp, but nothing happens.
My scripts work individually, but with classes I don't know how to execute them all.
Thanks
I'm not sure how much research you've done into Puppet and how its code is structured, but these may help:
Module Fundamentals
Digital Ocean's guide.
It appears that you are starting out with a basic module structure (based on your use of init/install/service), which is good. However, you are applying it as a direct manifest rather than as a module, which won't work because of autoloading unless your files are inside a valid module path.
Basically: put your class/module-structured code within Puppet's module path (puppet config print modulepath), then use another manifest file (.pp) to include your class.
An example file structure:
/etc/puppetlabs/code/modules/sshd/manifests/init.pp
/etc/puppetlabs/code/modules/sshd/manifests/install.pp
/etc/puppetlabs/code/modules/sshd/manifests/service.pp
/tmp/my_manifest.pp
Your class sshd(){ ... } code goes in the init.pp, and class sshd::install(){ ... } goes in install.pp etc...
Then the 'my_manifest.pp' would look something like this:
include ::sshd
And you would apply with: puppet apply /tmp/my_manifest.pp.
Once this works, you can learn about the various approaches to applying manifests to your nodes (directly like this, using an ENC, using a site.pp, etc.); feel free to do further reading.
Alternatively, as long as the module is within your modulepath (as mentioned above), you could simply do puppet apply -e 'include ::sshd'.
In order to get the code that you have to operate the way you are expecting it to, it would need to look like this:
# Note: This is BAD code, do not reproduce/use
class sshd() {
class{'sshd::install':} ->
class{'sshd::config':} ~>
class{'sshd::service':}
}
include sshd
or something similar, which entirely breaks how the module structure works. (In fact, that code will not work without the module in the correct path and will display some VERY odd behavior if executed directly. Do not write code like that.)
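For comparison, the conventional way to express that ordering inside the module is to keep init.pp thin and use contain with class references, roughly like this (a sketch, assuming Puppet 3.4+ where contain is available):
# init.pp
class sshd {
  contain sshd::install
  contain sshd::config
  contain sshd::service

  Class['sshd::install']
  -> Class['sshd::config']
  ~> Class['sshd::service']
}
Then include ::sshd from a manifest, as shown above, pulls in and orders all three subclasses.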
Is the code in ES6 modules executed every time we import a module? I'm using webpack and it seems that it does exactly that.
// FormStore.js
import sessionActions from "../../session/actions/session";
// session.spec.js
import sessionActions from "../../../src/session/actions/session";
This causes the code in the session module to be executed twice
I don't know the exact answer, but I suspect it has to do with Karma. I think it's due to having two different bundles.
In my Karma config:
preprocessors: {
"client/specs/index.ts": ["webpack"],
"client/specs/**/*spec.ts": ["webpack"]
},
webpack: {
entry: {
index: "./client/src/index.tsx",
vendor: []
}
},
Basically, I don't really need to add the index entry point, as this will probably create an additional bundle.
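In other words, the webpack section of the Karma config can probably drop the entry block entirely (a sketch; karma-webpack builds its own bundles from the preprocessed files):
webpack: {
    // no explicit "entry": the preprocessed spec files become the bundle entry points,
    // so the session module is no longer pulled into a second bundle via index
},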
Do you want your code to be executed when it is imported? If your /session/actions/session file runs a function call at the top level (rather than only exporting function declarations), that call is executed when the module is imported into any bundle you create; with two bundles it runs once per bundle.
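To illustrate the difference (a made-up session.js; the names are not from the question):
// session.js, with a top-level side effect: this line runs once per bundle that evaluates the module
console.log("session module evaluated");

const sessionActions = {
    login: function () { console.log("login called"); }
};
export default sessionActions;

// If the side effect were inside an exported function instead,
// nothing would run at import time, only when the function is called.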
If you are using karma-webpack, the usage documentation tells you that:
webpack: {
// karma watches the test entry points
// (you don't need to specify the entry option)
// webpack watches dependencies
// webpack configuration
},
A coworker was experiencing the same issue today. The problem seemed to be caused by two imports of one package, where each import referenced the package using a different path. You appear to be doing the same thing:
// FormStore.js
import sessionActions from "../../session/actions/session"; // note first path
// session.spec.js
import sessionActions from "../../../src/session/actions/session"; // note second path, which is different from first
Is it possible for the import in session.spec.js to use the same path? If this isn't an option due to the relative locations of the files, can you configure your module loader so that sessionActions is aliased (using map or paths, for instance)? I'm not sure whether this is an option in your environment, however.
See:
https://github.com/systemjs/systemjs/blob/master/docs/config-api.md#map
for how to do this using system.js
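Since the question uses webpack, the equivalent there would be resolve.alias in the webpack configuration (a sketch; the alias name and the src path are assumptions):
// webpack.config.js (or the webpack section of karma.conf.js)
var path = require("path");

module.exports = {
    resolve: {
        alias: {
            // both files can then do: import sessionActions from "sessionActions";
            sessionActions: path.resolve(__dirname, "src/session/actions/session")
        }
    }
};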
Grunt files must be named either "Gruntfile.js" or "Gruntfile.coffee". So how can I write my Gruntfile using Literate CoffeeScript instead of vanilla CoffeeScript (since, I believe, Literate CoffeeScript files need to be named with a .litcoffee extension instead of just .coffee)?
Make this your Gruntfile.coffee. You could just as easily do it as a .js file, of course, as long as Node knows how litcoffee is supposed to be parsed (that's why you require coffee-script):
coffee = require 'coffee-script'
module.exports = require './Gruntfile.litcoffee'
This is under the assumption that the litcoffee file's export is the (grunt) -> function
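For reference, such a Gruntfile.litcoffee might look something like this (a sketch; the task configuration is made up, and only the indented blocks are treated as code):
My Grunt configuration, written as Literate CoffeeScript.

    module.exports = (grunt) ->
      grunt.initConfig
        pkg: grunt.file.readJSON 'package.json'
      grunt.registerTask 'default', []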
You could have a Gruntfile.coffee act as a bootstrapper for the Gruntfile.litcoffee file, something like this (reading the file and compiling it as Literate CoffeeScript):
fs = require "fs"
coffee = require "coffee-script"
module.exports = eval coffee.compile fs.readFileSync("./Gruntfile.litcoffee", "utf8"), literate: true, bare: true
I am totally new to Perl/FastCGI.
I have some .pm modules to which I will have to add a lot of scripts, and over time they will grow and grow. Hence, I need a structure that makes the administration easier.
So, I want to create files in some kind of directory structure which I can include. I want the files that I include to behave exactly as if their text were written in the file where I do the include.
I have tried 'do', 'use' and 'require'. The actual file I want to include is in one of the directories Perl is looking in (verified using perl -V).
I have tried within and outside BEGIN {}.
How do I do this? Is it possible at all to include .pm files in .pm files? Do the included files have to be .pm files, or can they have any extension?
I have tried several ways, included below is my last try.
Config.pm
package Kernel::Config;
sub Load {
#other activities
require 'severalnines.pm';
#other activities
}
1;
severalnines.pm
# Filter Queues
$Self->{TicketAcl}->{'ACL-hide-queues'} = {
    Properties => {},
    PossibleNot => {
        Ticket => { Queue => ['[RegExp]^*'] },
    },
};
1;
I'm not getting any errors related to this in Apache's error_log. Still, the code is not recognized as it would be if I put it directly in Config.pm.
I am not about to start programming a lot, just to do some admin in a 3rd-party application. Still, I have searched around trying to learn how including files works. Is severalnines.pm considered a Perl module, and do I need to use a program like h2xs, or similar, in order to "create" the module (told you, totally newbie...)?
Thanks in advance!
I usually create my own module prefix, named after the project or the place I work. For example, you might put everything under Mu, with modules named like Mu::Foo and Mu::Bar. Use multiple modules (don't try to keep everything in one single file) and name your modules with the .pm suffix.
Then, if the Mu directory is in the same directory as your programs, you only need to do this:
use Mu::Foo;
use Mu::Bar;
If they're in another directory, you can do this:
use lib qw(/path/to/other/directory);
use Mu::Foo;
use Mu::Bar;
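If you would rather not hard-code that path, a common idiom is to locate the library directory relative to the script with the core FindBin module (a sketch; the lib subdirectory is an assumption):
use FindBin;
use lib "$FindBin::Bin/lib";   # look for Mu/Foo.pm and Mu/Bar.pm in ./lib next to the script

use Mu::Foo;
use Mu::Bar;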
Is it possible at all to include .pm files in .pm files?
Why certainly yes.
So, I want to create files in some kind of directory structure which I can include. I want the files that I include to behave exactly as if their text were written in the file where I do the include.
That's a bad, bad idea. You are better off using the package mechanism: declare each of your modules as a separate package. Otherwise, your module will contain a variable or function that your script overrides, and you'll never, ever know it.
In Perl, you can reference variables in your modules by prefixing them with the module name, as File::Find does. For example, $File::Find::name is the found file's name. This doesn't pollute your namespace.
If you really want your module's functions and variables in your own namespace, look at the @EXPORT_OK list variable in Exporter. This is a list of all the variables and functions that you'd like to be able to import into your script's namespace. However, it's not automatic; you have to list them next to your use statement. That way, you're more likely to know about them. Using Exporter isn't too difficult. In your module, you'd put:
package Mu::Foo;
use Exporter qw(import);
our @EXPORT_OK = qw(convert $fundge @ribitz);
Then, in your program, you'd put:
use Mu::Foo qw(convert $fundge @ribitz);
Now you can access convert, $fundge and @ribitz as if they were part of your main program. However, you have now documented that you're pulling in these subroutines and variables from Mu::Foo.
(If you think this is complex, be glad I didn't tell you that you really should use Object Oriented methods in your Modules. That's really the best way to do it.)
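For completeness, a minimal object-oriented version of such a module might look like this (a sketch; new and convert are illustrative names):
package Mu::Foo;
use strict;
use warnings;

sub new {
    my ($class, %args) = @_;
    return bless { %args }, $class;
}

sub convert {
    my ($self, $value) = @_;
    # ... work with $value and the object's settings in %$self ...
    return $value;
}

1;

# in the program:
#   use Mu::Foo;
#   my $foo = Mu::Foo->new(fudge => 42);
#   $foo->convert('something');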
if ( 'I want the files that I include to behave exactly as if their text were written in the file where I do the include.'
    && 'have to add a lot of scripts and over time it will grow and grow' ) {
    warn 'This is probably a bad idea because you are not creating any kind of abstraction!';
}
Take a look at Exporter; it will probably give you a good solution!