How can I write a script which calls my PHP code?
The script needs to work with records that are in a FileMaker database.
You can use the "Insert from URL" script step to call PHP and make it do something. I was able to use this in a server-side script to call a PHP script with some params in the URL, which allowed me to have FileMaker talk to the PHP API of another service. Not sure if that answers your question (or if it is even still a question), but I thought I'd share.
I had a similar problem, where I wanted a FileMaker action to trigger a script that would alert a website that something had changed. I was working in FileMaker Pro 11 so Insert from URL wasn't an option (though how I wish it had been)!
I created a FileMaker script called Sync and gave it one action: Perform AppleScript (obviously this works on Macs only; you can achieve something similar using Send DDE Execute in Windows).
In the Script Step Options, I set my Perform AppleScript step to use a Calculated AppleScript, and set it as follows:
"do shell script \"curl http://example.com/sync.php?id=" & Get( RecordID ) &
"\\\&layout=" & GetAsURLEncoded( Get( LayoutName ) ) & "\\\&table=" &
GetAsURLEncoded( Get( LayoutTableName ) ) & "\""
Note all the quotes and escaping of quotes.
To cause this to execute when something changes, go to File » Manage » Layouts and select the layout where a user might be editing a record, then click Edit. In the resulting Layout Setup dialog, go to Script Triggers and for the event OnRecordCommit, select the Sync script you just created.
Now whenever a record is modified in that layout, the Sync script will run, which executes the AppleScript, which executes the shell command curl, sending a GET request to the effect of:
curl http://example.com/sync.php?id=123&layout=Edit%20Records&table=Records
From there, create a sync.php to do something intelligent whenever it’s loaded, using $_GET['id'] and $_GET['layout'] and $_GET['table'] or similar passed-through variables.
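For example, a minimal sync.php might look something like this; the logging here is just a placeholder for whatever "something intelligent" means in your setup:
<?php
// sync.php -- minimal sketch: read the passed-through values and log the change.
$id     = isset($_GET['id'])     ? $_GET['id']     : '';
$layout = isset($_GET['layout']) ? $_GET['layout'] : '';
$table  = isset($_GET['table'])  ? $_GET['table']  : '';

if ($id === '') {
    exit('Missing id');
}

// Replace this with whatever your sync actually needs to do.
file_put_contents('sync.log', date('c') . " record $id changed on layout $layout (table $table)\n", FILE_APPEND);
echo 'OK';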
One caveat with this approach is that FileMaker is frozen while the shell script executes, which in this case means until curl either receives a response or times out. While you can set the timeout to be very low (add arguments --output /dev/null --silent --head --fail --connect-timeout 1), it still causes a delay to the user of a second or so, which can be noticeable if a user is editing lots of records. If anyone has a solution for this, or a way to cause the script to run asynchronously in the background, please let me know.
Yes, there is a PHP API available for FileMaker:
http://www.filemaker.com/downloads/pdf/article2_php.pdf
Before you work on adding code for your own file, try getting the sample API code to work with the FMServer_Sample database.
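Once the sample works, the calls you will use look roughly like this. This is only a hedged sketch: the host, credentials, layout name ('Tasks') and field name ('Task') are assumptions, so substitute whatever your file actually exposes to the PHP API.
<?php
require_once 'FileMaker.php';   // ships with the FileMaker API for PHP

$fm = new FileMaker('FMServer_Sample', 'localhost', 'web_user', 'web_password');
$command = $fm->newFindAllCommand('Tasks');     // any layout exposed to the PHP API
$result  = $command->execute();

if (FileMaker::isError($result)) {
    die('FileMaker error: ' . $result->getMessage());
}

foreach ($result->getRecords() as $record) {
    echo $record->getField('Task') . "\n";
}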
Related
I'm working with an existing framework of WinDbg scripts that go through a series of test scripts Test1.txt, Test2.txt, etc., which are generated by C++ code and which output results.
For example a chunk of one of the test scripts would be,
.if (($spat(#"${var}","18300.000000")==1))
{
.logappend C:\Tests\TestResults.txt
.printf "TestNumber=\t1\tExpected=\t18300.000000\tActual=\t%.6f\t******PASSED******\n",poi(poi(#$t2+#$t6)+0x10)
.logclose
}
I'm trying to add functionality that will create a file whose name displays the current # of the test being run, so that users can see their progress without needing to open a file.
My thought process was that I would set up the script generator so that at the start of Test #N it would add a line to the script to create a file 'currentlyRunningTestN.txt', and at the end of Test #N it would add a line to delete that file. However, I don't see any delete function in the WinDbg meta-command glossary (https://learn.microsoft.com/en-us/windows-hardware/drivers/debugger/meta-commands) or in the list of supported C functions like printf. Am I just missing something, or is deleting files not supported by WinDbg (or, equivalently, renaming files, which would also serve my purpose)? If deleting/renaming don't work, is there another way to achieve the functionality I'm looking for?
With the .shell command, you can execute any DOS-like command. Although I never tried deleting a file, it should be possible.
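For example, something along these lines should work, since del and ren are ordinary cmd.exe commands that .shell hands off to a spawned shell (the file names below are the hypothetical marker files from the question):
.shell del C:\Tests\currentlyRunningTest1.txt
.shell ren C:\Tests\currentlyRunningTest1.txt currentlyRunningTest2.txt
Note that .shell spawns a cmd.exe, runs the command, and returns when it finishes.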
As you may have noticed, WinDbg scripting does not always work on the first attempt, so please make sure your script cannot cause major data loss on your customer's PC while it is deleting files.
I have an exe that runs through the Windows console and prompts for responses to three questions. I created a batch file to contain the responses and would like to automate all three answers, so that running the batch file feeds the data to the exe.
I need to pass the following criteria:
1) machine name (Enter)
2) password (Enter)
3) backup (Enter)
I tried "machinename| exe" and it runs fine, and then brings up the prompt for 2)'s answer. I would like answer all three prompts and then run the exe.
Assuming all inputs are read via stdin, either a pipe or redirection should work for all three inputs.
The simplest method is to create a temporary response file and use redirection.
@echo off
>response.tmp (
echo machinename
echo password
echo backup
)
<response.tmp prog.exe
del response.tmp
It would seem easy to use a pipe and get rid of the temp file:
(echo machinename&echo password&echo backup)|prog.exe
But there is one problem - the parser inserts a space before each & and the ). This will probably break things.
Note that each side of the pipe is executed via cmd /c, so each side is parsed twice. It is the initial pipe parser that inserts the unwanted space.
The simplest way I have found to prevent the extra space is to delay the appearance of the & so that the parser initially thinks the entire left side is a single ECHO command.
@echo off
setlocal
set "+=&"
echo machinename%%+%%echo password%%+%%echo backup|prog.exe
EDIT
The fact that your program hangs at the password prompt implies that the password is read directly from the console, and not via stdin. In this case, you will need something like the freeware AutoIT utility.
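For example, a rough AutoIt sketch (the window title and the prompt order are assumptions about your exe):
; send the three responses as keystrokes to the console window
Run("prog.exe")
WinWaitActive("prog")          ; adjust to the actual console window title
Send("machinename{ENTER}")
Send("password{ENTER}")
Send("backup{ENTER}")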
Perforce can list the files submitted by a user, i.e.
p4 changes -u
What I would like to do is write a small function in Perl which will find:
what changes have been submitted by a user
what operations have been performed for those changes, i.e. add, edit or delete, and print those.
So the logic is something like this:
1. find the changes submitted by a user
2. find what operations have been performed on those changes
Result:
user has 5 changelists submitted after date ...
5 add, 2 delete and 1 edit operations found in all changes.
It's not clear what you are stumbling on.
If it's connecting to perforce from Perl, you can use one of several CPAN modules for the purpose: see the first hits in googling for "CPAN perforce".
If you wish to parse the STDOUT output of the p4 changes -u command line instead, please post the exact output format from the command, what you have tried so far to parse it, and the desired matching output from your program.
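That said, here is a rough Perl sketch under the assumption that you are parsing the default output of p4 changes and p4 describe -s; adjust the regexes to whatever your server actually prints, and add a file/revision-range argument to p4 changes if you need the date filter.
#!/usr/bin/perl
use strict;
use warnings;

my $user = shift or die "usage: $0 <user>\n";

# 1. Find the submitted changelists for the user.
my @changes;
for my $line (`p4 changes -s submitted -u $user`) {
    push @changes, $1 if $line =~ /^Change (\d+) /;
}

# 2. Count the file actions (add/edit/delete) in each changelist.
my %ops;
for my $change (@changes) {
    for my $line (`p4 describe -s $change`) {
        # "Affected files" lines look like: ... //depot/path/file.c#3 edit
        $ops{$1}++ if $line =~ m{^\.\.\. //\S+#\d+ (\w+)};
    }
}

printf "user %s has %d changelists submitted\n", $user, scalar @changes;
printf "%d add, %d delete and %d edit operations found in all changes\n",
    $ops{add} || 0, $ops{delete} || 0, $ops{edit} || 0;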
We currently have an issue where we are receiving a lot of bounced e-mails (from an auto-generated e-mail) back from people where a specified e-mail address is not valid (failure notices). I need to identify certain messages in the mailbox and respond to them automatically. As a newbie to PowerShell I'm struggling a bit! I think I understand how to check for the occurrence of a string, but I don't know how to iterate through an inbox to look at/get a handle on each message in turn, and I don't know how to extract the subject or body text in order to analyse the contents and perform a string comparison. I fear this should be easy, but I can't find anything on the web that might do the job. Can anyone help?
So, just to clarify what you're looking for:
Mailbox A receives a large number of failure notice/bounce messages.
You'd like your PowerShell script to search Mailbox A for every instance where the subject line (or message body) contains "String X", and if there is a match, take some action?
Also, what version of Exchange are you using? You need to be on at least Exchange 2007 to use the Exchange Management Shell. You'll then want to look over the Management Shell commands that can be run.
Look at the Exchange message tracking log, and pipe the results from one command you run to the next. Think of it like this...
(Run a command) | (Run another command on the results of the first command) | (Run a last command on the results of the second).
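For example, a rough sketch along those lines (the FAIL event id and the subject filter are assumptions about what your bounces look like):
Get-MessageTrackingLog -EventId FAIL -Start (Get-Date).AddDays(-1) -ResultSize Unlimited |
    Where-Object { $_.MessageSubject -like "*failure notice*" } |
    Select-Object Timestamp, Sender, Recipients, MessageSubject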
You can view an example on my website at:
http://www.technoctopus.com/?p=223
While not exactly the same, it might get you moving in the right direction.
I have both Sybase and Microsoft SQL Server installed. There are times when Sybase interferes with MS SQL because they have some overlapping commands.
So, I need two scripts:
A) When run, script A backs up the current path, grabs all path entries that contain sybase or SYBASE or SyBASE (you get the point), and moves them all to the very end of the path, while preserving their order.
B) When run, script B restores the path from the backup.
Both script A and script B should affect the path immediately. So, an a.bat that calls patha.ps1 and pathb.ps1 would look like so:
REM Old path here
call patha.ps1
REM At this point the effective path should be different.
call pathb.ps1
REM Effective old path again
Please let me know if this does not make sense. I am not sure if call command is the best one to use.
I have never used PowerShell before. I can try to formulate the same thing in Python (I know S.O. users tend to ask "What have you tried so far?"). Well, at this point I am VERY slow at writing anything in the PowerShell language.
Please help.
First of all: call will be of no use here as you are apparently writing a batch file and PowerShell scripts have no association to run them by default. call is for batch files or subroutines.
Secondly, any PowerShell script you call from a batch file cannot change environment variables of the caller's environment. That's a fundamental property of how processes behave and since you are calling another process, this is never going to work.
I'm not so sure why you are even using a batch file here in the first place if you have PowerShell. You might just as well solve this in PowerShell completely.
However, what I get from your problem is that the best way to resolve this is probably the following: Create two batch files that each set the PATH appropriately. You can probably leave out both the MSSQL and Sybase paths from your usual PATH and add them solely in the batch files. Then create shortcuts to
cmd /k set_mssql_path.cmd
and
cmd /k set_sybase_path.cmd
each of which now is a shortcut to a shell to work with the appropriate database's tools. This is how the Visual Studio Command Prompt works and it's probably the cleanest solution you have. You can use the color and prompt commands in those batches to make the two different shells distinct so you always know what environment you have. For example the following two lines will color the console white on blue and set a prompt indicating MSSQL:
color 1f
prompt MSSQL$S$P$G
This can be quite handy, actually.
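The batch files themselves can stay tiny; for example, a sketch of set_sybase_path.cmd (the install directory is an assumption):
@echo off
rem prepend the Sybase tools to an otherwise Sybase-free PATH (directory is an assumption)
set "PATH=C:\Sybase\bin;%PATH%"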
Generally, trying to rearrange the PATH environment variable isn't exactly easy. While you could trivially split at a ;, this will fail for paths that themselves contain a semicolon (and which therefore need to be quoted). Even in PowerShell this will take a while to get right, so I think creating shortcuts specific to the tools is probably the nicest way to deal with this.
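If you do want to try it anyway, here is a naive PowerShell sketch of patha.ps1 that ignores the quoted-semicolon caveat and, as explained above, only affects the current process and its children:
# patha.ps1 -- naive sketch: remember the old PATH and push Sybase entries to the end
$backup = $env:Path                 # persist this to a file if you need it in another process
$parts  = $env:Path -split ';' | Where-Object { $_ -ne '' }
$sybase = @($parts | Where-Object { $_ -match 'sybase' })     # -match is case-insensitive
$other  = @($parts | Where-Object { $_ -notmatch 'sybase' })  # relative order is preserved
$env:Path = ($other + $sybase) -join ';'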