FileMaker Pro - Set Variable from text file

Using FileMaker Pro 12/13, I want to open an external file, extract values, and use these to set the values of some script variables.
The approach is:
• Use the import function to import data from a tab-delimited file into a table, where the first row has field names
• Open that table and go to the first record
• Copy the field values from the first record into the variables and use the variables as required
The problems are:
• When we run the import, it seems to auto-create a new “layout” each time. We don’t want this to happen, or we need to auto-delete these layouts after they are created. Another possible approach is to drop/delete the import table, then allow the import to re-create it; perhaps this would stop the additional layout issue. Either way, we cannot find a script function to delete a named table or layout
• We are bringing the data into the table successfully; however, we are unable to get a function or set of functions working that reads the data from that table and assigns it to a variable.
Any assistance gratefully received!

The first problem is the result of importing into a new target table every time you import. Instead, you should create - once - a table named (for example) Variables, with the following fields:
ID
UserID
UserScore
and set your importing script (you have this scripted, right?) to import into this table. This will create a new record in the Variables table each time you import. You can delete this record when done. You cannot delete a table or a layout programmatically.
Regarding the second problem, use the Set Variable[] script step to "load" the imported values into variables, for example:
Set Variable [ $userID ; Value: Variables::UserID ]
Note that immediately after import, the found set in the Variables table will contain only the imported record/s. So even if you do not delete the previously imported records, a script combining the two steps (import and set variables) will work just fine.
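For example, a minimal sketch of the combined script (the file name "values.tab" is a placeholder, and the fields are just the ones from the example above):
Go to Layout [ "Variables" (Variables) ]
Import Records [ With dialog: Off ; "values.tab" ; Add ; Windows ANSI ]
# the found set now contains only the imported record/s
Go to Record/Request/Page [ First ]
Set Variable [ $userID ; Value: Variables::UserID ]
Set Variable [ $userScore ; Value: Variables::UserScore ]
Delete All Records [ With dialog: Off ]
Go to Layout [ original layout ]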

Related

FileMaker Pro 14 History tables

With a few solutions I've worked with, I've created temp tables or history tables. Normally I script it to take a handful of fields needed from a main table and copy them over to the other table by setting a variable, then setting a field to the variable for each field in the new table / new record.
I have a situation now where I'm building a history table that needs to copy the current record as is: a snapshot where all fields from that instance of the record are copied to the history table.
Rather than setting a variable and then setting a field to the variable, I'd like some input on a quicker way to get this done at the record level, without typing it out field by field. Also, if fields are added to both tables, I have to make sure my script gets updated.
I'll keep hunting around... appreciate any help.
-Rich
Do you have a sample of copying a record from one table to another, including all fields and setting some fields?
As I suggested in comments, use the Import Records[] script step, and select the same file as the source. If you choose Arrange by: [ matching names ] in the Import Field Mapping dialog, it will automatically map all source fields to their similarly named counterparts.
Note that you must establish a found set in the source table before importing.
For "setting some fields", you can define auto-enter options and activate them during the import, or run Replace Field Contents[] immediately after the import.

Kentico Import Toolkit 8.1

I am currently using the Kentico Import Toolkit to create documents in the tree.
At this point, I have imported around 100 documents using the toolkit, and they are all located in the correct place in the tree. The issue is that since I imported these documents, my spreadsheet has been updated and extra fields and data have been added. How do I go about importing this extra data into the currently existing documents? Also bear in mind that I don't want other fields or data to be affected by this, as some of the documents were updated with other content by the content editors using CMS Desk, which isn't available in the spreadsheet.
The Import Toolkit is not the right tool for this task. Even if you select "Import new and overwrite existing pages", it will overwrite most of your columns. It only preserves the system and ID columns from the existing documents; all other columns get overwritten.
Either you can write a piece of custom code or you can try the following:
Open SSMS and navigate to the coupled table of your page type (something like CONTENT_MyDocType). This is where your custom columns are stored.
Right click -> Edit top 200 rows
Click "Show SQL Pane"
Adjust the columns, ORDER BY and WHERE clause to match your Excel file, then re-run the query
Select the desired rows in your Excel file and copy them to the clipboard
Paste the data into SSMS
rocky is right; the Import Toolkit is meant for importing complete objects, not partial/continuous updates.
You could map the fields that you know are not changed in the spreadsheet to a SQL query selecting the value from the target database.
To achieve this, just insert #<target> at the beginning of the SQL select statement you will be mapping the field to.
It will be rather laborious though and it also requires certain knowledge about the nature of the spreadsheet changes.

Is it possible to create table templates in FileMaker?

I'm using FileMaker Pro 12 and I was wondering if there is a way of creating a template for tables. There are a number of fields I'm placing in my tables that are identical utility fields, like a modification timestamp, active/inactive flags, etc. I was hoping there was a way that I could define the skeleton of each table somehow, instead of having to manually add these identical fields every time.
If you are using the Advanced version, you can copy&paste fields among tables/files.
Using the regular version, you can import records from your "default" table and specify [New Table...] as the target table. This will recreate the source table's structure in the target file. The source table does not have to contain any records for this to work.
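If you want to trigger this from a script instead of the File menu, a very small sketch might be enough (the file name "Templates.fmp12" is a placeholder; the [New Table...] target is picked in the Import Field Mapping dialog that appears at run time):
Import Records [ With dialog: On ; "Templates.fmp12" ; Add ]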
To expand a little bit on michael-hor257k's answer, if you're using FileMaker Pro Advanced, a good practice is to create a "Default" table that has your core utility fields. When you want to make a new table in Manage Database, instead:
Highlight the Default table,
Copy & Paste the table, then
Rename the new table.

FileMaker - How to create a new record in another table for each record in a found set

I want a user to be able to do a search in the layout/table "Cages" and then click a button to run a script that will create a new record in a table called "CagesProtocolLineHistory" for each record in the found set.
Below is what I have so far, which almost works, but on the Go to Original Layout line it doesn't go to the next record; it goes to a record near the end, i.e. it's skipping some records.
Yes, Go to Record [ First ] before the loop will ensure that all records are copied. Otherwise, if the script starts at some record other than the first, it will skip all records before it.
A couple of notes: FileMaker string comparison is case-insensitive by default, so you don't need to use Upper() here. Also, in most cases it's simpler not to copy all data through variables, but pass a single key and copy other data via lookups.
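A hedged sketch of that loop, passing a single key (the field name CageID is an assumption):
Go to Layout [ "Cages" (Cages) ]
Go to Record/Request/Page [ First ]
Loop
  Set Variable [ $cageID ; Value: Cages::CageID ]
  Go to Layout [ "CagesProtocolLineHistory" (CagesProtocolLineHistory) ]
  New Record/Request
  Set Field [ CagesProtocolLineHistory::CageID ; $cageID ]
  Go to Layout [ "Cages" (Cages) ]
  Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
Go to Layout [ original layout ]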
You could do this in three steps: search for the "yes" records in a new window, export the record IDs (passed as params) to a local temp file, and re-import the IDs into the child table.

How can I (partially) automate the transfer of a FileMaker database structure and field contents to a second database?

I'm trying to copy some field values to a duplicate database. One record at a time. This is used for history and so I can delete some records in the original database to keep it fast.
I don't want to manually save the values in a variable because there are hundreds of fields. So I want to go to the first field, save the field name and value and then go over to the other database and save the data. Then run a 'Go to Next Field' and loop through all the fields.
This works perfectly, but here is the problem: When a field is a calculation you cannot tab into it and therefore 'Go to Next Field' doesn't work. It skips it.
I thought of doing a 'Go to Object', but then I would need to name all the objects, and I can't find a script step to name objects.
Can anyone out there think of a solution?
Thanks!
This is one of those problems where I always found it easier to do an export/import.
Export all the data you want from the one database, and then import it into the other database. All you need to do is:
Manually specify which fields you want to copy
Map the data from the export to the right fields in the new database/table
You can even write a script to do these things for you.
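For instance, a rough sketch of the scripted round trip, assuming a tab-delimited temp file and matching field names in the history file (all names are placeholders):
# in the source file: establish the found set to transfer, then export it
Set Variable [ $path ; Value: Get ( TemporaryPath ) & "transfer.tab" ]
Export Records [ With dialog: Off ; "$path" ]
# in the history file, a second script imports the same temp file
Import Records [ With dialog: Off ; "$path" ; Add ; Matching names ]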
There are several ways to achieve this.
To make a "history file", I have found there are several cases out there, so lets take a look.
CASE ONE
Single file: I just want to "keep" a very large file with historical data, because I need to erase all data in my Main file.
In this case, you should create a "clone" table (in the same file or in another file; it's the same). Then change any calculation field to the type of the calculation result (number, text, date, and so on). Remove any auto-entered value or calculation from any field (like auto number, auto-creation date, etc.). You will have a "plain table" with no calculations or auto-entered data.
Then add a field to control duplicate data. If you have, let's say, a unique invoice number for each record, you can use that. But if you do not have a unique field that identifies the record, then you have to create one...
To create such a field, I recommend adding a new field to the clone table, set as an auto-entered calculation combining fields into something unique, like this: invoiceNumber & "-" & lineNumber & "-" & date.
On the clone table, make sure validation is set to "Always", with empty values not allowed and the value required to be unique.
Once you have set up the clone table, you can import your records, making sure that the auto-enter option is on. You can do it as many times as you like; new records will be added and no duplicates.
If you want, you can make a script that moves all the current records to the historical table before deleting them.
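A rough sketch of such a move-and-delete script, assuming the clone table is called MainHistory and lives in the same file, "MySolution.fmp12" (placeholder names):
Go to Layout [ "Main" (Main) ]
Show All Records
Go to Layout [ "MainHistory" (MainHistory) ]
Import Records [ With dialog: Off ; "MySolution.fmp12" ; Add ; Matching names ]
# keep "Perform auto-enter options" on, so the unique-key validation rejects duplicates
Go to Layout [ "Main" (Main) ]
Delete All Records [ With dialog: Off ]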
NOTE:
This technique works fine when the data you are trying to keep does not change over time. This means that once the record is created, it has no further changes.
CASE TWO
A historical table must be created but some fields are updated.
In the beginning I thought historical data never changes. In some cases I found this is not so, like when I want to track historical invoices but at the same time keep track of whether they are paid or not...
In this case you may use the same technique above, but instead of importing data, you must update the data based on the "unique" fields that identify the record.
Hope this technique helps
FileMaker's FieldNames() function, along with GetField(), can give you a list of field names and then their values.
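As a hedged illustration of that idea (the target field Archive::Snapshot and the layout-based field list are assumptions), a loop like this collects every field name and value from the current record into one return-delimited block:
Set Variable [ $fields ; Value: FieldNames ( Get ( FileName ) ; Get ( LayoutName ) ) ]
Set Variable [ $i ; Value: 0 ]
Loop
  Set Variable [ $i ; Value: $i + 1 ]
  Exit Loop If [ $i > ValueCount ( $fields ) ]
  Set Variable [ $name ; Value: GetValue ( $fields ; $i ) ]
  Set Variable [ $result ; Value: $result & $name & ": " & GetField ( $name ) & ¶ ]
End Loop
Set Field [ Archive::Snapshot ; $result ]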