How do I tell FileMaker which table to paste a value into, based on the value? Is there a way to have FileMaker paste a value into a table without hard-coding the table name?
Using imported transaction data, I determine which ledger (table) the transaction should be posted to. But I can't seem to get the script to then post it into the right table based on the value.
(Screenshot of the script omitted.)
Thanks for the thoughtful suggestions!
I found a way to do what I had in mind; #AndreasT, your suggestion helped.
// Calculation for the Set Field By Name target field: returns the field name as text
Case (
    Imported_Transactions::Debit_To = "Expenses_All" ; "Expenses_All::Description" ;
    Imported_Transactions::Debit_To = "Liability_CrCard_BofA" ; "Liability_CrCard_BofA::Description" ;
    Imported_Transactions::Debit_To = "Liability_CrCard_CitiBusiness" ; "Liability_CrCard_CitiBusiness::Description"
)
Using the Case function, I was able to get FileMaker to put the data in the right table. It took some doing, but putting each TableName::Field in quotes made it work.
Ultimately, though, I found it easier to just use one table and field descriptors to store my data. Simplicity makes it easier to produce reports.
You could use the conditional If script step to determine which table to insert data into based on your input, together with the Set Field By Name script step:
Set Field By Name [ "tableName::fieldName" ; value ]
I would love to have a stored procedure which returns a selection that is editable in phpMyAdmin. For example, something as simple as
SELECT * FROM students
written from the console will let you edit the records as long as each row is unique (i.e., the table has a primary key). However, the same statement inside a procedure won't allow it!
The point is that I have a selection that I will be using quite often, and it's a bit more complex than the one above. If I copy it and paste it into the console, it's fine, but not when it's run from a procedure!
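To make the issue concrete, here is a minimal sketch of the kind of procedure I mean (assuming MySQL syntax; the procedure name and the students table are just placeholders):
DELIMITER //
CREATE PROCEDURE list_students()   -- hypothetical name
BEGIN
    -- Editable when the SELECT is run directly from the console,
    -- but read-only when the same rows come back from CALL list_students();
    SELECT * FROM students;
END //
DELIMITER ;

CALL list_students();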
I would appreciate any ideas to work around this limitation.
Thank you in advance!
Thanks to those who read my question. I just figured out that a very simple way to solve this issue is to create a table with my output selection. Then, from the console, I can select and edit whatever I need from that temporary table.
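In case it helps anyone, a minimal sketch of that workaround (assuming MySQL; the table names and the id column are placeholders):
-- Materialize the selection so its rows can be edited from the console
DROP TABLE IF EXISTS students_report;
CREATE TABLE students_report AS
SELECT *            -- in practice this is the more complex selection
FROM students;

-- phpMyAdmin allows inline editing once the table has a primary key
ALTER TABLE students_report ADD PRIMARY KEY (id);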
Thanks for your time!
I've been tasked with finding a way to migrate data into a DB2 AS400 database. When the data is entered (currently manually) on the front end, the system is doing some calculations and inserting the results in a table.
My understanding is that it's using a trigger to do so. I don't know very much about this stuff, but I have written code to directly insert values into that same table. Is there a way for me to figure out what trigger is being fired when users enter data manually?
I've looked in QSYS2/SYSTRIGGERS and, besides it not making much sense to me, I see no triggers belonging to the schema that contains my table.
Any help here would be awesome, as I am stuck.
SELECT *
FROM QSYS2.SYSTRIGGERS
WHERE TABSCHEMA = 'MYSCHEMA'
AND TABNAME = 'MYTABLE'
Should work fine.
If you'd prefer to use a 5250 command line, the Display File Description (DSPFD) command will show you the triggers on a file (table):
DSPFD FILE(MYSCHMA/MYTABLE) TYPE(*TRG)
Lastly, trigger information is available via the IBM i Navigator GUI, either the older fat-client version or the newer web-based one.
Simply put, the problem occurs when I want to edit selected rows and then apply the changes. I'm sure it worked some time ago. I tried re-downloading the Postgres driver in the preferences (yes, I use Postgres). Has anyone faced the same issue? Did anyone solve it?
PS. Running on 142.4861.1.
I found the read-only checkbox in the connection preferences; it was not set. Toggling it didn't help, and upgrading and resetting didn't help either.
In my case, in DataGrip 2020.1 (the SQL was run from an open file against a single table, so the SELECT worked as expected, but when I tried to edit, an "unresolved table reference" error appeared), specifying the schema in the query helped: SELECT * FROM users; was changed to SELECT * FROM schemadb.users;, and that fixed it. There is probably a bug; I had tried all the methods mentioned above.
Try synchronizing the database connection. That helped me with a MySQL connection.
What actually helped was toggling the Auto-commit checkbox in the console; after that, everything ran flawlessly.
The only thing that actually worked for me, after trying everything above multiple times, was deleting each DB connection and creating a new one from scratch.
This can be due to the default settings; make sure your Transaction Mode settings are correct. (Screenshot of the settings omitted.)
I had the same issue with DataGrip 2020.2. I followed all the methods above, but for me the fix was simply to delete the connection and create a new one manually (not by duplicating it). That works!
Setting and then clearing Read-only in the Data Source Properties worked for me.
What worked for me was removing a field alias - going from this:
SELECT
l.MSKU Item_SKU,
l.Supplier,
l.ASIN,
l.title,
l.Buy_Price
FROM listings l
WHERE l.Buy_Price IS NULL
ORDER BY l.Supplier, l.listingID desc;
to this:
SELECT
l.MSKU,
l.Supplier,
l.ASIN,
l.title,
l.Buy_Price
FROM listings l
WHERE l.Buy_Price IS NULL
ORDER BY l.Supplier, l.listingID desc;
That was all it took for me to be able to edit the results of the query.
If your query uses a field alias (renaming a column instead of using the actual column name), then DataGrip makes the result set read-only.
Solution:
Rewrite the query using the field names as they appear in the table and rerun it. Then you will be able to edit the rows.
Example
Rewrite this:
select id,interest_recalcualated_on, interest_recalculation_enabled alias from m_loan;
..to this:
select id,interest_recalcualated_on, interest_recalculation_enabled from m_loan;
Nothing worked. I had to update DataGrip from 2017.2 to 2018.3
I had to open the project by navigating to: /home/user/.DataGrip2017.2/config/projects/my_project/
I lost all my scripts for this project, as I did not want to import the config from the old version of DataGrip. So I'll probably need to downgrade, get the scripts, and upgrade again.
I hit the same issue, not with Postgres but with MySQL, in PhpStorm 2019.1, when two schemas were available on the same DB connection. My query select * from users where full_name like '%handy%'; produced a result table that couldn't be edited, even though the console reported it was querying the stage schema. A more specific query, select * from stage.users where full_name like '%handy%';, using the fully qualified table name, gave a result table that could be edited inline.
The only way I was able to get around this issue was to remove the database connection details for the given connection and recreate them from scratch. While this was happening to me on this specific connection, editing was working fine on other connections, which suggests the problem might be related to specific parameters around the connection in question.
For me, when I changed my SELECT query back to * (the wildcard that selects all columns) instead of specific columns, edit and save started working again. This is strange! It may be worth reporting a bug to IntelliJ. I'm using DataGrip 2020.1 on macOS Catalina 10.15.3.
I thought I had the same problem but just realized I needed to submit changes (with a button or command + enter) for them to be applied.
This answer is 4 years late though. Maybe the bug was already fixed haha, but this may help someone else.
The problem for me was obvious (well, obvious after identifying it): using a MySQL keyword as a field name (yes, I know it's not a good idea, but that's how it is sometimes). So this caused the error:
select id,name,value from mytable;
The correct way to write this, of course, is with backticks:
select id,name,`value` from mytable;
and this solved it for me.
It seems that DataGrip doesn't let you change data in the results grid if you select specific columns with joins.
So if you run select table1.column1, table2.column2 from <table1> inner join <table2> and try to change a value in column1, you will get a "This table is read only" warning.
Only if you do select * from <table> can you edit the cell values.
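For example (hypothetical tables, just to illustrate the difference):
-- Read-only result grid: specific columns selected across a join
select o.id, c.name
from orders o
inner join customers c on c.id = o.customer_id;

-- Editable result grid: all columns from a single table
select * from orders;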
Try this:
In the IDE settings (Ctrl+Alt+S), go to Database | General.
Select the Open results in new tab checkbox and click OK.
Re-run your query, and you'll be able to edit and commit the changes in the new tabs.
I have a (very simple and standard) UPDATE statement which works fine either directly in Query Analyser, or executed as a stored procedure in Query Analyser.
UPDATE A
SET A.field1 = B.col1,
    A.field2 = B.col2
FROM tblA AS A
INNER JOIN tblB AS B
    ON A.pk1 = B.pk1 AND A.pk2 = B.pk2
The problem is that when I execute the same stored proc via a Microsoft Access ADP (by double-clicking on the sproc name or using the Run option), it says "query ran successfully but did not return records" AND does NOT update the records when I inspect the tables directly.
Before anyone even says "the syntax of MS Access is different from SQL Server T-SQL", remember that with an ADP everything happens on the server and one is actually passing through to T-SQL.
Any bright ideas from any ADP gurus out there?
Gotcha. Responding to my own question for the benefit of anyone else.
Tools / Options / Advanced / Client-Server Settings / Default max records is set at 10,000 (presumably this is the default). Change this to 0 for unlimited.
My table had 100,000+ rows, and whichever set of 10,000 it was updating was difficult to find (among a sea of 90,000+ un-updated rows). Hence the update did not work fully as expected.
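If anyone else hits this, a rough way to check how much of the table actually got updated (reusing the column names from the statement above; note this simple comparison ignores NULLs):
-- Count rows where the update has not (yet) been applied
SELECT COUNT(*) AS rows_not_yet_updated
FROM tblA AS A
INNER JOIN tblB AS B
    ON A.pk1 = B.pk1 AND A.pk2 = B.pk2
WHERE A.field1 <> B.col1
   OR A.field2 <> B.col2;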
Try and see whether the query gets executed on the SQL Server using SQL profiler.
Also, I think you might need to close the linked table & re-open it to see the updated records.
Does that work?
Run the query with SQL Profiler running. Before you start the trace, add in all the error events. This will give you any errors that SQL Server is generating that the Access ADP might not be showing correctly (or at all).
Feel free to post them here.
Just as a reference, here's a paper I wrote on update queries that discusses some of the issues associated with when they fail.
http://www.fmsinc.com/microsoftaccess/query/snytax/update-query.html
I seem to remember that I always got the "didn't return any rows" message and had to simply turn off the messaging. It's because it isn't returning any rows!
As for the other issue: sometimes there's a primary key problem. Does the table being updated have a primary key in SQL Server? If so, check the view of the table in Access; sometimes that link doesn't come through. It's been a while, so I could be wrong, but I think you may need to look at the design view of the table while in Access and add the primary key there.
EDIT: Additional thought:
In your debugging, try throwing in PRINT statements to see what the values of your inputs are. Is it actually picking up the data from the table as you expect when you execute from Access?
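A minimal sketch of what that could look like inside the stored procedure (T-SQL, reusing the table and column names from the statement above):
-- Report how many rows the join actually matches before running the UPDATE
DECLARE @matched INT;

SELECT @matched = COUNT(*)
FROM tblA AS A
INNER JOIN tblB AS B
    ON A.pk1 = B.pk1 AND A.pk2 = B.pk2;

PRINT 'Rows matched by the join: ' + CAST(@matched AS VARCHAR(20));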