The scheduler task is normally executed via the command-line script /typo3/sysext/core/bin/typo3 scheduler:run.
Is it possible to call the task from within a frontend controller of an Extbase extension
(the same way it is run from the backend module "Scheduler" by clicking "Execute selected tasks")?
The goal is to run the task on every page view of a specific page (cache disabled).
(No exec/shell_exec available.)
Thank you!
The scheduler run script is a Symfony console command. You can run it in your controller using the following code:
// Instantiate the scheduler:run command directly and execute it
// with empty input and a discarded output.
$schedulerCommand = new \TYPO3\CMS\Scheduler\Command\SchedulerCommand();
$schedulerInput = new \Symfony\Component\Console\Input\ArrayInput([]);
$schedulerOutput = new \Symfony\Component\Console\Output\NullOutput();
$schedulerCommand->run($schedulerInput, $schedulerOutput);
Depending on which tasks you execute, a run can take a while and you might run into timeouts.
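If the page only needs to trigger one specific task, it may be worth limiting the run to that task rather than executing everything that is due. Recent TYPO3 versions support a --task option (and a --force flag) on scheduler:run; check "typo3 scheduler:run --help" on your installation, since availability and whether --task accepts one or several uids differs between versions. The uid 42 below is a placeholder:

$schedulerInput = new \Symfony\Component\Console\Input\ArrayInput([
    '--task' => '42', // placeholder: uid of the scheduler task record to run
]);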
I've created one Batch account (demo-batch-account) and one Batch job (TEST_JOB), and created the relevant application pool and package; everything is mapped.
Consider:
Application pool name: DemoTestPool
Application package name: TestTool.zip
Command to run: cmd /c %AZ_BATCH_APP_PACKAGE_TestTool#1%\TestTool.exe
Now I need to run this command by adding a new task under TEST_JOB via an ADF pipeline.
Kindly suggest how to achieve this.
Can you please specify which pool you are talking about? Is it a SQL pool?
I'm not sure I understand your question. Do you want to create a dynamic task and run the package in this task, or create a task and add a command to run the package? (The second one makes more sense to me.)
Unfortunately, there is no built-in activity to run zip files yet, but you can do it using a Custom activity.
If you want to run your package using an ADF pipeline, you need to do three things:
1. Create a pipeline: in ADF, on the left side, open the Author hub -> Pipelines -> click on "New pipeline", then in the search tab type "custom activity" and drag it onto the board.
2. Click on the Custom activity; under the Settings tab you can write the command you need, such as: cmd /c %AZ_BATCH_APP_PACKAGE_TestTool#1%\TestTool.exe
3. Click on Validate + Debug and make sure you have configured the Custom activity correctly. A sketch of the resulting activity JSON follows below.
Please check this: https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-custom-activity
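For reference, a minimal sketch of what the Custom activity could look like in the pipeline JSON. The linked service name "AzureBatchLinkedService" is a placeholder and assumes you have already created an Azure Batch linked service that points at the DemoTestPool pool:

{
    "name": "RunTestTool",
    "type": "Custom",
    "linkedServiceName": {
        "referenceName": "AzureBatchLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "command": "cmd /c %AZ_BATCH_APP_PACKAGE_TestTool#1%\\TestTool.exe"
    }
}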
I have created multiple scheduled tasks in Laravel 5 and set up a cron job in cPanel, and it is working fine. But now I want to stop one specific scheduled task. I have commented out the command and removed its class from app/Console/Kernel.php, but it still runs on the live server at its scheduled time.
Before, in Kernel.php:
protected $commands = [
    Commands\CreatePostingSchedules::class,
    Commands\ChangeCreatedPostDuration::class,
    Commands\abc::class,
    Commands\xyz::class,
    Commands\NewMonthUser::class,
    Commands\DeliverOrdersWithCourier::class,
];
Now I remove the class Commands\ChangeCreatedPostDuration::class.
After removing the class:
protected $commands = [
    Commands\CreatePostingSchedules::class,
    Commands\abc::class,
    Commands\xyz::class,
    Commands\NewMonthUser::class,
    Commands\DeliverOrdersWithCourier::class,
];
But it still runs on the live server.
Can anyone help? How do I stop this specific scheduled task?
Thanks
Make sure that you have completely removed the scheduled function/command from Kernel.php of your Laravel 5 app and that you do not have a related manual cron job in cPanel -> Cron Jobs. Note that the $commands array only registers the command; the actual scheduling happens in the schedule() method of Kernel.php, so the corresponding $schedule->command(...) line has to be removed as well (see the sketch below).
You might also want to check this article: https://laracasts.com/discuss/channels/laravel/how-to-stop-scheduled-tasks-from-running-in-kernelphp
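The cPanel cron entry typically just calls "php artisan schedule:run" every minute; which tasks actually execute is decided by the schedule() method in app/Console/Kernel.php. A sketch of what to look for there (the command signatures are hypothetical stand-ins for the real ones):

protected function schedule(Schedule $schedule)
{
    // Each line like this schedules one task; delete the line for the
    // task you want to stop, e.g. the one for ChangeCreatedPostDuration:
    // $schedule->command('posts:change-created-duration')->daily();

    $schedule->command('posts:create-schedules')->hourly(); // hypothetical
}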
In NetSuite, I have a Restlet script that calls a deployed map/reduce script, but the map stage shows as Failed when looking at the details on the status page (the getInputData stage does run and shows as Complete).
However, if I do a "Save and Execute" from the deployment of the map/reduce script, it works fine (the map stage does run).
Note that:
There is no error in the execution log of either the Restlet or the map/reduce script.
I have 'N/task' in the define section of the Restlet, as well as task in the function parameters.
The map/reduce script has the Map checkbox checked. The map/reduce script deployment is not scheduled and has default values for the other fields.
I am using the "See the quick brown fox jump" sample map/reduce script from NetSuite Help.
I am using a Sandbox account.
I am using a custom role to post to the Restlet.
Below is the task.create code snippet from my Restlet. I don't know what is wrong; any help is appreciated.
var mrTask = task.create({
    taskType: task.TaskType.MAP_REDUCE,
    scriptId: 'customscript_test',
    deploymentId: 'customdeploy_test'
});
var mrTaskId = mrTask.submit();
// task.checkStatus expects an options object containing the task id
var taskStatus = task.checkStatus({ taskId: mrTaskId });
log.debug('taskStatus', taskStatus);
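For context, a minimal sketch of how that snippet sits inside a SuiteScript 2.x Restlet; the POST entry point is an assumption about how the Restlet is structured, and the script/deployment ids are the ones from the snippet above:

/**
 * @NApiVersion 2.x
 * @NScriptType Restlet
 */
define(['N/task', 'N/log'], function (task, log) {
    function post(requestBody) {
        // Queue the deployed map/reduce script for execution
        var mrTask = task.create({
            taskType: task.TaskType.MAP_REDUCE,
            scriptId: 'customscript_test',
            deploymentId: 'customdeploy_test'
        });
        var mrTaskId = mrTask.submit();
        log.debug('submitted', mrTaskId);
        return { taskId: mrTaskId };
    }
    return { post: post };
});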
You also need the Documents and Files - View permission, along with SuiteScript - View and SuiteScript Scheduling - Full, to access the map/reduce script.
The help for MapReduceScriptTask.submit() does not mention this, but the help for ScheduledScriptTask.submit() does:
Only administrators can run scheduled scripts. If a user event script calls ScheduledScriptTask.submit(), the user event script has to be deployed with admin permissions.
I did a test of posting to my Restlet using Administrator role credentials, and it works fine, as opposed to using my other custom role. Maybe, just like ScheduledScriptTask, MapReduceScriptTask can only be called by the Administrator role? My custom role does have the SuiteScript - View and SuiteScript Scheduling - Full permissions; I thought that would do the trick, but apparently not.
Does anyone have any further insight on this?
I am new to Webistrano, so apologies if this is a trivial matter...
I am using Webistrano to deploy PHP code to several production servers, and this is all working great. My problem is that I need to clear the HTML cache on my cache servers (Varnish) after the code update. I can't figure out how to build a recipe that is executed on the Webistrano machine (to run the relevant shell script that clears the cache) and not on each of the deployment target machines.
Thanks for the help,
Yariv
The simplest method is to execute the varnishadm tool with the proper parameters inside deploy:restart:
# Varnish admin connection settings and a ban pattern matching every URL
set :varnish_ban_pattern, "req.url ~ ^/"
set :varnish_terminal_address_port, "127.0.0.1:6082"
set :varnish_varnishadm, "/usr/bin/varnishadm"

task :restart, :roles => :web do
  # Invalidate all cached objects via the Varnish admin interface
  run "#{varnish_varnishadm} -T #{varnish_terminal_address_port} ban \"#{varnish_ban_pattern}\""
end
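On the :web servers this resolves to roughly the following shell command. Note that depending on your Varnish version the admin command is ban (3.x and later) or purge (2.x), and you may also need to pass the admin secret with -S:

/usr/bin/varnishadm -T 127.0.0.1:6082 ban "req.url ~ ^/"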
Thanks for the answer. I actually need to do some more stuff than only clearing the cache, so I will execute a bash script locally, as described in:
How do I execute a Capistrano task locally?
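For completeness, a sketch of what that can look like in a Webistrano recipe, assuming Capistrano 2's run_locally helper (Webistrano is built on Capistrano 2); the script path is a hypothetical placeholder:

task :clear_html_cache do
  # Runs on the Webistrano host itself, not on the deployment targets
  run_locally "/usr/local/bin/clear_varnish_cache.sh"
end

after "deploy:restart", "clear_html_cache"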
I have a template for a Rails site's Sphinx configuration. There can be multiple Sphinx services on the same machine running on different ports, one per app. Therefore, I only want to restart Sphinx for a site if its corresponding configuration template changes. I've created an /etc/init.d/sphinx script that restarts just one Sphinx instance based on a parameter, similar to:
/etc/init.d/sphinx restart /etc/sphinx/site1.conf
where site1.conf is defined by a Chef template. I'd really love to use the notifies functionality of Chef templates to pass in the correct site1.conf parameter if the template changes. Is this possible?
Alternatively, I suppose I could just register a different service for each site, similar to:
/etc/init.d/sphinx_site1
However, I'd prefer to pass the parameter to the script instead.
When defining a service resource, you can customize the start, stop, and restart commands that will be run. You can define a service resource for each of your sites using these customized commands and set up the corresponding notifications.
For example:
service "sphinx_site1" do
supports :restart => true
restart_command "/etc/init.d/sphinx restart /etc/sphinx/site1.conf"
action :nothing
end
template "/etc/sphinx/site1.conf" do
notifies :restart, "service[sphix_site1]"
end
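Since there can be several apps per machine, the same pattern generalizes with a loop; the site list, the shared template source, and the variables are hypothetical:

%w{site1 site2}.each do |site|
  service "sphinx_#{site}" do
    supports :restart => true
    restart_command "/etc/init.d/sphinx restart /etc/sphinx/#{site}.conf"
    action :nothing
  end

  template "/etc/sphinx/#{site}.conf" do
    source "sphinx.conf.erb"      # hypothetical shared template
    variables(:site => site)      # hypothetical per-site settings
    notifies :restart, "service[sphinx_#{site}]"
  end
end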