GTRR in first table or GTRR in alternative Table - filemaker

I have two tables, one for current machines and one for archived machines. I have a page with a button that performs Go to Related Record and points to the machines table; if I move that record into the archive table, the button no longer works.
Can I create a script which basically tries to go to the related record in the machines table and, if nothing is there, goes to the related record in the archive table?

There are several ways to do what you want, including checking for related records before you go to one:
If [ Count ( current machines::index ) > 0 ]
    Go to Related Record [ From table: "current machines" ]
Else If [ Count ( archived machines::index ) > 0 ]
    Go to Related Record [ From table: "archived machines" ]
Else
    Show Custom Dialog [ "No related record found" ]
End If
or just going to a related record and seeing if anything was found:
Set Error Capture [ On ]   # don't show an error message to the user
Go to Related Record [ From table: "current machines" ]
If [ Get ( LastError ) ≠ 0 ]   # GTRR failed: no related record in current machines
    Go to Related Record [ From table: "archived machines" ]
End If
Set Error Capture [ Off ]

Does Single Record Buffering work if I do not use the full key?

I have a standard table in ABAP, buffering is activated, type Single Records. Yet every SELECT goes to the DB.
The table has fields [MANDT, A, B], but the SELECTs use only MANDT and A. Could this be the reason why the buffer is bypassed?
As the documentation for table buffering says, "Single Record" buffering only works when you provide the full primary key.
If this table is under your control and there are no other applications which might prefer a different buffering strategy, then you can change its buffering type to "Generic Area" and enter the number of key fields you want to be part of the buffering key (2, in this case). If the table is very small, you might even consider enabling "Full Buffering", which loads the whole table into the application server's memory as soon as a single row is requested.
Alternatively, if all your accesses to the table happen within the same session, you could consider building your own in-memory buffering on the ABAP level. If you need your own buffering across sessions, you could build a buffer using a shared memory class. But this will of course become problematic when there are write operations to the table which are beyond your control, or write operations on other application servers.
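A session-level buffer on the ABAP level can be as simple as a static class holding a hashed internal table. The following is a minimal sketch, assuming a hypothetical table ZTABLE with key fields MANDT and A:

```abap
CLASS lcl_buffer DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS get_row
      IMPORTING iv_a          TYPE ztable-a
      RETURNING VALUE(rs_row) TYPE ztable.
  PRIVATE SECTION.
    " CLASS-DATA lives for the duration of the session
    CLASS-DATA gt_cache TYPE HASHED TABLE OF ztable WITH UNIQUE KEY a.
ENDCLASS.

CLASS lcl_buffer IMPLEMENTATION.
  METHOD get_row.
    " Serve from the cache when possible ...
    READ TABLE gt_cache INTO rs_row WITH TABLE KEY a = iv_a.
    IF sy-subrc <> 0.
      " ... otherwise read from the DB once and remember the row
      SELECT SINGLE * FROM ztable INTO rs_row WHERE a = iv_a.
      IF sy-subrc = 0.
        INSERT rs_row INTO TABLE gt_cache.
      ENDIF.
    ENDIF.
  ENDMETHOD.
ENDCLASS.
```

Note the caveat from above still applies: this cache does not notice writes from other sessions or servers.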
TLDR:
Does Single Record Buffering work if I do not use the full key?
No
Explanation follows:
How can you know what SAP is doing? The documentation is normally very accurate, assuming you check the correct version of it. The 7.5.4 documentation says:
When using single record buffering, any ABAP SQL statements must
respect the full primary key to prevent them from bypassing table
buffering.
In other words, if a table is single-record buffered, any attempt to read without the full key goes to the DB.
There is also another trap in older releases, where projections (selecting individual fields) ignore the buffer.
Run a test trace on SELECT fieldA FROM table1 with the full key to verify the buffer is used (explained below).
How to test what is actually happening.
Find a time when few people are using the application server.
Have a window open with your test code. (Use Eclipse :) )
In another window, have ST05 open.
Reset the table buffer (you are in a dev system, right?!) by entering /$TAB into the command field.
In ST05, start a trace for SQL and buffer.
Run your test program, which accesses records using different techniques, or repeats an access to test the buffering, e.g.
DATA: ls_t001 TYPE t001.
* first access: loads the buffer from the DB
select single * from t001 into ls_t001
  where bukrs = '0001'.
* second access: should be served from the buffer
select single * from t001 into ls_t001
  where bukrs = '0001'.
* projection: may bypass the buffer in older releases
select single butxt from t001 into ls_t001-butxt
  where bukrs = '0001'.
Stop the trace and display/analyze it.
You can filter the trace output to show only the table in question.
The Blue lines are Buffer accesses.
The yellow lines are SQL db accesses.
Here you can see how the single record access resulted in 57 records being read/loaded, i.e. with full or generic table buffering, all records for my client were loaded.
Careful: don't be tempted to use SE16 as your test tool. It has a gotcha for people testing buffering: SE16 uses SELECT ... BYPASSING BUFFER.
To complete this topic, a look at AL12 is useful: there you can inspect the various buffer contents.
Don't forget other clients when analyzing.

Getting ID fields from the primary table into the linked table via Form

As an amateur coder for some years, I have generally used subforms when dealing with linked tables to make the transfer of the ID field from primary to sub nice and simple.
However, in my latest project the main form is a continuous form with a list of delivery runs (Date, RunName, RunCompleted, etc.). Linked to this primary table is a delivery list containing (SKU of product, Qty, etc.). I use a simple relationship between the two tables.
Now, on the main (RUNS) form, at the end of each row is a button that opens the DELIVERIES form and displays all records with a matching RunID.
This is fine for displaying pre-existing data but when I want to add new records I have been using the following code attached to the OnCurrent event:
Me.RunID = DLookup("[RunID]", "tbl_BCCRuns", "RunID = " & Forms![frm_BCC_Runs_list]![RunID])
I have also used:
Forms![frm_BCC_Deliveries].Controls![RunID] = Forms![tbl_BCCRuns].Controls![RunID]
(Note: above done from memory and exact code may be incorrect but that's not the problem at hand)
Now... Both these options give me what I need however...
I find that as I am working on the database, or if you open certain forms in the right order (a bug I need to identify and fix, clearly), you can open the DELIVERIES form without the filter (to view all deliveries, for argument's sake), and the top entry (usually the oldest record) suddenly adopts the RunID of the selected record back in the main form.
Now, my question is this, and the answer may be a simple "no" and that's fine, I'll move on...
Is there a better way, one I am not familiar with or just don't know about due to my inconsistent Access progress, to transfer IDs to a form without risking contamination from improper use? Or do I just have to bite the bullet and make sure there is no possible way for that to happen?
In effort to alleviate the issue, I have created a Display Only form for viewing the deliveries but there are still times when I need to access the live historical data to modify other fields without wanting to modify the RUNID.
Any pointers greatly appreciated...
Since you only want to pull the RunID if the form is on a new record row, do a check to verify this is a new record.
If Me.NewRecord Then
Me.RunID = DLookup("[RunID]", "tbl_BCCRuns", "RunID = " & Forms![frm_BCC_Runs_list]![RunID])
End If
You could also consider a technique to synchronize parent and child forms when both are subforms on a main form (the main form does not have to be bound): https://www.fmsinc.com/MicrosoftAccess/Forms/Synchronize/LinkedSubforms.asp
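Another common pattern, sketched below assuming a button named cmdOpenDeliveries (the names are illustrative), is to pass the ID through OpenArgs and stamp it in the child form's BeforeInsert event. BeforeInsert only fires when the user actually starts typing in a new row, so opening the DELIVERIES form unfiltered can never contaminate existing records:

```vba
' On the RUNS form: open DELIVERIES filtered, passing the ID along
Private Sub cmdOpenDeliveries_Click()
    DoCmd.OpenForm "frm_BCC_Deliveries", _
        WhereCondition:="RunID = " & Me.RunID, _
        OpenArgs:=Me.RunID
End Sub

' On the DELIVERIES form: stamp the ID only on genuinely new rows
Private Sub Form_BeforeInsert(Cancel As Integer)
    If Len(Me.OpenArgs & vbNullString) > 0 Then
        Me.RunID = Me.OpenArgs
    End If
End Sub
```

This also removes the dependency on frm_BCC_Runs_list being open, since the child form gets everything it needs from OpenArgs.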

Message when performing Find (script step) in FileMaker

On OS X I have a Customer table with the fields Name, TaxCode, and Address defined with the "required" attribute (the other fields are not). In a layout without any of these required fields, I execute a script with a Perform Find [Restore] step which looks for records containing the value 1 in another field of the same table.
I have traced the script execution and when the script reaches the Perform Find step the following Dialog pops:
"Name" is defined to require a value. Allow this field to remain empty?
Revert Record No Yes
If I click Yes then comes an identical message for each of the remaining "required" fields in the table. Eventually the script finishes as expected, but of course this manual intervention makes the query cumbersome and unacceptable.
What is happening? and what can I do?
In your layout, there is a record in an open (uncommitted) state that doesn't meet your validation requirements. When you try to enter Find mode, FileMaker tries to commit that record, and the commit fails.
Try using Get ( RecordOpenState ) to see what you get in the layout before you enter Find mode: 0 is committed, while 1 and 2 are records not committed.
Also, make sure you do not have a "New Record/Request" script step before the "Enter Find Mode" step; it could be the reason for your trouble.
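If you cannot immediately locate the offending record, a defensive sketch (step names as in recent FileMaker versions) is to commit explicitly, and revert when the commit fails, before performing the find:

```
Set Error Capture [ On ]
Commit Records/Requests [ With dialog: Off ]
If [ Get ( LastError ) ≠ 0 ]
    # Validation failed; discard the open record so the find can proceed
    Revert Record/Request [ With dialog: Off ]
End If
Perform Find [ Restore ]
```

Reverting silently discards the user's uncommitted edits, so only use this where that is acceptable.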
I checked the contents of the Customer table and found a record with all fields empty. Once this record was deleted, the problem disappeared.
I cannot remember or explain how the empty record came about; probably some trouble during the creation process. What still surprises me is that FileMaker verifies the correctness of fields NOT involved in the actual Find.
Thanks for the suggestion.

Lotus Notes application Document count and disk space

Using Lotus Notes 8.5.2, I made a backup of my mail application in order to preserve everything in a specific folder before deleting its contents from my main application. The backup is a local copy, created by going to File --> Application --> New Copy, setting the Server to Local, and giving it a title and file name that I save in a folder. All of this works okay.
Once I have that, I go into All Documents and delete everything except the contents of the folder(s) I want this application to preserve. When finished, I can select all and see approximately 800 documents.
However, there are a couple of other things I have noticed. First, the document count: right-click on the newly created application, go to Properties, and select the "i" tab, which shows Disk Space and a document count. That document count doesn't match what is shown when you open the application and go to All Documents, which does match the 800 I had after deleting all but the contents I wanted to preserve. Instead, the application says it has almost double that amount (1500+), with a fairly large file size.
I know about the unread document count, and in this particular application I checked "Don't maintain unread marks" on the last property tab. There is no red number in the application, but neither the document count nor the file size changed when that was selected. Compacting the application makes no difference.
I'm concerned that although I've trimmed down what I want to preserve in this Lotus Notes application, there's a lot of excess baggage with it. Also, since the document count appears to be inflated, I suspect that the file size is too.
How do you make a backup copy of a Lotus Notes application, then keep only what you want & have the Document Count and File Size reflect what you have actually preserved? Would appreciate any help or advice.
Thanks!
This question might really belong on ServerFault or SuperUser, because it's more of an admin or user question than a development question, but I can give you an answer from a developer angle...
Open your mailbox in Domino Designer, and look at the selection formula for $All view. It should look something like this:
SELECT @IsNotMember("A"; ExcludeFromView) & IsMailStationery != 1 & Form != "Group" & Form != "Person"
That should tell you first of all that indeed, "All Documents" doesn't really mean all documents. If you take a closer look, you'll see that three types of documents are not included in All Documents.
Stationery documents
Person and Group documents (i.e., synchronized contacts)
Any other docs that any IBM, 3rd party, or local developer has decided to mark with an "A" in the ExcludeFromView field. (I think that repeat calendar appointment info probably falls into this category.)
One or more of those things is accounting for the difference in your document count.
If you want, you can create a view with the inverse of that selection formula by reversing each comparison and changing the Ands to Ors:
SELECT @IsMember("A"; ExcludeFromView) | (IsMailStationery = 1) | (Form = "Group" | Form = "Person")
Or for that matter, you can get the same result just taking the original formula and surrounding it with parens and prefixing it with a logical not.
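That negated variant of the original selection formula would look something like this:

```
SELECT !(@IsNotMember("A"; ExcludeFromView) & IsMailStationery != 1 & Form != "Group" & Form != "Person")
```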
Either way, that view should show you everything that's not in AllDocuments, and you can delete anything there that you don't want.
For a procedure that doesn't involve mucking around with Domino Designer, I would suggest making a local replica instead of a local copy, and using the selective replication option to replicate only documents from specific folders (Space Savers under More Options). But that answer belongs on ServerFault or SuperUser so if you have any questions about it please enter a new question there.

How to determine which SSAS Cube is processing now?

There is a problem where several users can process the same cube simultaneously, and as a result the cube processing fails. So I need to check whether a certain cube is being processed at the current moment.
I don't think you can prevent a cube from being processed if someone else is already processing it. What you can do to "help" is run an MDX query to check the last time the cube was processed:
SELECT CUBE_NAME, LAST_DATA_UPDATE FROM $System.MDSCHEMA_CUBES
or check the sysprocesses table on the related SQL Server instance to see if it is running:
select spid, ecid, blocked, cmd, loginame, db_name(dbid) Db, nt_username,
       net_library, hostname, physical_io, login_time, last_batch, cpu,
       status, open_tran, program_name
from master.dbo.sysprocesses
where spid > 50
  and loginame <> 'sa'
  and program_name like '%Analysis%'
order by physical_io desc
go
Use this code to select running sessions (execute this against the OLAP instance):
select *
from $system.discover_Sessions
where session_Status = 1
And this code to cancel a running process. Replace the SPID value with the running SESSION_SPID, as in the example:
<Cancel xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <SPID>92436</SPID>
  <CancelAssociated>1</CancelAssociated>
</Cancel>
Probably a better approach than the ones already listed would be to use SQL Server Profiler to watch activity on the Analysis Server. As stated already, the currently popular answer has two flaws: the first option only shows the LAST time the cube was processed, and the second option only shows that something is running, not what is running; besides, what if your cube was not processed from a SQL Server but from a different data source?
Utilizing SQL Server Profiler will tell you not only if something is processing but also details of what is processing. Most of the Events you can filter out. Watch Progress Report Current events if you want real time information... It's usually too much of a fire-hose of data to get real info out of it, but you'll know well that at least a process is going on. Watch Progress Report Begin and End events only to get better information like what is currently being processed, even down to the partition levels. Other events with good information include Command Begin/End and Query Begin/End.
You will see a process running in Task Manager called "MSDARCH" if a cube is processing. Not sure how you can tell which one, though.