Updating a PostgreSQL database from changes in FileMaker - postgresql

I am able to import records into FileMaker using the Actual Technologies trial drivers, but I do not know how to update the PostgreSQL database when I make changes in FileMaker.
Ideally I would like FileMaker to remain the front end and PostgreSQL to be the back end, in case they ever want to move away from FileMaker.
Is there real-time, dynamic updating over ODBC connections?

After some troubleshooting and biting the bullet, I bought the Open Source Databases driver from Actual Technologies and it seems to be working.
It is not very intuitive, but I learned that I can import records from PostgreSQL using the "Open Source Driver", and I can then set up a relationship with the "ESS Driver" (File / Manage Database / Relationships) to use PostgreSQL as a backend, meaning it updates in real time.
I would like to make a tutorial in the future, because it took a few hours of trial and error.
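For what it's worth, live editing through ESS seems to work most smoothly when the PostgreSQL table exposes a stable unique key that FileMaker can use to identify rows. A minimal sketch of such a table (the table and column names are placeholders, not from the original setup):

    -- Hypothetical example table for use as a FileMaker ESS source
    CREATE TABLE contacts (
        contact_id  serial PRIMARY KEY,        -- stable unique key FileMaker can use to identify rows
        full_name   text NOT NULL,
        email       text,
        updated_at  timestamptz DEFAULT now()  -- handy for spotting edits made through FileMaker
    );

Once the ODBC DSN points at this database, the table can be added under File / Manage Database / Relationships, and edits made in FileMaker layouts should be written back to PostgreSQL over the same connection.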

Related

Real-time sync between Oracle DB (source) and MongoDB (destination)

Is it possible to have real-time sync between a heavy Oracle database and MongoDB? Has anyone tried this?
I saw a site - Keep MongoDB and Oracle in sync
They mention putting triggers on the Oracle tables; my doubt is whether this will slow the applications already running on the Oracle database. Will this replication cause the applications to slow down or degrade the Oracle database's performance?
The right solution would involve Change Data Capture from Oracle. This does not require triggers on Oracle and thus won't affect performance. There are several tools you can use, such as Striim and Attunity. Striim supports change data capture from Oracle and writing to MongoDB.
https://striim.com
https://attunity.com
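For context, the trigger-based approach mentioned in the question adds an extra write to every transaction on the source table, which is exactly the overhead being worried about. A rough illustration of what such a trigger looks like (ORDERS and ORDERS_CHANGELOG are made-up names):

    -- Illustrative only: trigger-based change capture on a hypothetical ORDERS table
    CREATE OR REPLACE TRIGGER trg_orders_capture
    AFTER INSERT OR UPDATE OR DELETE ON orders
    FOR EACH ROW
    BEGIN
      INSERT INTO orders_changelog (order_id, change_type, changed_at)
      VALUES (NVL(:NEW.order_id, :OLD.order_id),
              CASE WHEN INSERTING THEN 'I' WHEN UPDATING THEN 'U' ELSE 'D' END,
              SYSTIMESTAMP);
    END;
    /

Log-based CDC tools such as Striim and Attunity typically read committed changes from Oracle's redo logs instead, so the capture happens off the transaction path and the source applications do not pay this extra write on every statement.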

Load a PostgreSQL database using CloudConnect

On the side of my Gooddata project, I maintain a small PostgreSQL database that contains a few tables.
I would like to integrate both of my ETL processes using the same tool, and it seems to me that CloudConnect would be the easiest way, since I already have my whole GoodData ETL in it.
Here are the ways I tried to do it without success:
I had a look at the documentation, and it seems to me that the CloverETL components that would enable this (DBOutput, PostGreSQLDataWriter) are not available in CloudConnect.
I managed to connect to the Agile Data Warehousing Service (the database attached to GoodData), but it seems that only the ADS database is able to understand this request:
COPY MyDataBaseTable (field1,field2) FROM LOCAL '${DATA_TMP_DIR}/CIforADS.csv'
Even when I adapt the syntax to PostgreSQL, the dynamic addressing I use here does not seem to work.
Is there any way to proceed that I'm missing? Can anyone think of a workaround?
In general this could be achieved by using the "DBExecute" component, but I'm not sure I understand you correctly - do you want to load data into your own Postgres instance using CloudConnect?
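If that is the goal, it may help to know that COPY ... FROM LOCAL is ADS (Vertica-flavoured) syntax. On a plain PostgreSQL instance the equivalent statement looks more like the sketch below, assuming the SQL is sent over the Postgres connection (for example via DBExecute) and the resulting file path is readable by the PostgreSQL server itself rather than only by the CloudConnect worker, which may also be why the dynamic path does not work:

    -- Hypothetical PostgreSQL equivalent of the ADS COPY above
    COPY MyDataBaseTable (field1, field2)
    FROM '${DATA_TMP_DIR}/CIforADS.csv'   -- CloudConnect substitutes the variable, but PostgreSQL reads this path on the database server
    WITH (FORMAT csv);                    -- PostgreSQL has no FROM LOCAL clause

If the file only exists on the CloudConnect side, a client-side load (psql's \copy, or COPY ... FROM STDIN driven by the client) is needed instead.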

MongoDB - Importing a db from local machine to MongoLabs

I have a decent-sized database on my local machine that has a lot of important data that cannot be re-made easily (locally-tested user profile information, blog posts, that sort of thing). It's around 50 MB in size.
I'm getting close to making my app live, and I want to bring this database to MongoLabs. I know how to connect to MongoLabs and set up a new database there, but I can't work out (if it's even possible) how to import a database from my local machine to MongoLabs, nor can I find any documentation discussing this.
Questions are:
Is this possible to do?
How do I do it?
If you open your database at mongolab.com and go to the Tools tab, you should see some helpful commands for migrating your data to your new database.
This support article also has more details:
https://support.mongolab.com/entries/20164381
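The commands on that Tools tab typically boil down to mongodump and mongorestore. A rough sketch, where the host, port, database names and credentials are placeholders to be replaced with the values shown on your MongoLab dashboard:

    # Dump the local database (database name and output path are placeholders)
    mongodump --db mylocaldb --out ./dump

    # Restore the dump into the MongoLab database using its connection details
    mongorestore --host ds012345.mongolab.com --port 12345 \
        --db remotedb --username dbuser --password dbpass ./dump/mylocaldb

Exact flags vary a little between tool versions, so the commands generated on the Tools tab for your specific database are the safest reference.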

PostgreSQL Manager

I have been working with Microsoft SQL Server since 6.5, along with other databases like Oracle, MySQL and SQLite. I equally appreciate or hate all of these DBMSs for one reason or another.
On our forthcoming project, we are considering Postgres for the back end. I have already started playing with it, and it is pretty interesting to me.
I have always heard good comments about the Postgres database, but I don't like the admin studio at all. When creating a new table, I hate having to click the Add button over and over to create columns in pgAdmin.
Are there any "studios" for the Postgres database that provide:
a more organized table creation process (spreadsheet-like)
a graphical view designer
What's wrong with plain SQL? Writing plain SQL goes much faster than click-wait-click-wait-type-click-wait-ok-wait. You could use any tool for this, pgAdmin as well.
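To illustrate the point, a whole table - columns, keys and defaults - is a single statement in plain SQL, with no repeated clicking (the table below is just an example):

    -- Example only: the entire table definition in one statement
    CREATE TABLE customer (
        customer_id  serial PRIMARY KEY,
        name         text NOT NULL,
        email        text UNIQUE,
        created_at   timestamptz NOT NULL DEFAULT now()
    );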
OpenOffice Base can also connect to PostgreSQL and works like MS Access. And speaking of MS Access, it can also connect to PostgreSQL to create tables, views, etc.
See this official list of Administration/Development tools.
EMS SQL Manager seems to be a good option.
For a graphical view of SQL queries, maybe SQLeo can help.

Synchronisation between SQL Server 2008 Express and VFP tables

I'm looking for advice and suggestions on how to synchronise data between two databases.
The first database is a SQL Server 2008 Express instance that runs on disconnected laptops (no network or internet access). The second (main) database is VFP 9.0 and runs on a server.
When users connect their laptops to the network, I want the synchronisation process to run.
Other than the different database engines, I have the following items to take into account:
The tables don't necessarily have the same structure
The primary keys are not the same (GUID in the SQL Server and often a combination of character fields in VFP)
Synchronisation of the tables must be done in a certain order to respect the parent-child relationships
On some inserts on the SQL Server side, a new primary key must be generated and synchronised to the VFP table
A bunch of validations must be made, and some feedback from the user is sometimes needed
Not all records need to be synchronised
Some records on the SQL Server need to be deleted after the synchronisation
Need to take into account deleted records from both sides
Minimal modifications need to be done on the VFP database
There are probably other points I'm forgetting now, but I think you get the idea of the challenge I face. My guess right now is that I will need to build a custom synchronisation module, but I want your input before I go on, in case I have overlooked some options, and to get some tips on how to approach this.
I took a quick look at the Microsoft Sync Framework, but with all the restrictions I have and the fact that there is no VFP client already built (AFAIK), I don't think it will be of great help.
Thanks in advance for your feedback.
Update: The laptop application is a C# WinForm application and is using SQL Server 2008 Express.
The complexity of the situation and requirements leads me to believe you need to write a Visual FoxPro application. Visual FoxPro connects to SQL Server 2008 data easily. The complexity of the code lies in matching the requirements and identifying the data that needs to be synched, not in the syntax. Visual FoxPro's strength is in its data manipulation language and its ability to connect to almost any data source (native DBFs, ODBC, ADO, and XML).
SQL Server can read VFP 9 data via the VFP 9 OLE DB driver. You could write T-SQL stored procedures to get to the VFP data. Not sure how it would recognize the laptop being connected to the network though.
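A rough sketch of that T-SQL route, assuming the VFP 9 OLE DB provider (VFPOLEDB) is installed on the SQL Server machine and ad hoc distributed queries are enabled; the folder, column and table names are placeholders. Note also that, as far as I know, the VFP provider is 32-bit only, which limits this to 32-bit SQL Server instances:

    -- One-time setup (requires appropriate server permissions)
    EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
    EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

    -- Read a table from a VFP data folder; the path and names are placeholders
    SELECT cust_id, cust_name
    FROM OPENROWSET('VFPOLEDB',
                    'C:\vfpdata\';'';'',
                    'SELECT cust_id, cust_name FROM customers');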
Another approach is to use SQL Server XML Diffgrams. I am not an expert by any stretch of the imagination on this approach, but it would be something you can research.
Since my expertise is with Visual FoxPro I would find it way easier to go the other way though, but that is just me. You have to go with the skillset of the resources you have for the project.
VFP reads and writes SQL Server data via a connection (DSN, ConnectionString) and any technique involving SQL Passthrough (SQLConnect(), SQLExec() and SQLDisconnect()), CursorAdapters, Remote Views, or a combination of the three.
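A minimal sketch of that SQL passthrough route, where the DSN name, query and cursor name are placeholders:

    * Hypothetical example: connect to the laptop's SQL Server Express instance via an ODBC DSN
    lnHandle = SQLCONNECT("LaptopSQLExpress")
    IF lnHandle > 0
        * Pull the rows that still need to be synchronised into a local cursor
        SQLEXEC(lnHandle, "SELECT * FROM dbo.Orders WHERE Synced = 0", "csrOrders")
        * ... validate and write to the VFP tables here ...
        SQLDISCONNECT(lnHandle)
    ENDIF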
A Visual FoxPro program can also recognize Windows events like connecting to a network. The application could be installed on each laptop, running and watching for the Windows event. Once the event is raised, the application can attempt to connect to the SQL Server database (it may be connecting to a network where the SQL Server is not available, or to a different network).
Once connected it runs the logic to check and synchronize the databases.
It sounds like you don't have a lot of control over the application writing to the VFP 9 data. If you do have control over that application, you might consider changing it to write to a SQL Server Express instance and then using SQL Server replication to manage the synchronization. It is not a trivial task though, and SQL Server replication, while getting better with each release, does cause hair loss in DBAs. It is definitely a lot of work going this route.
Rick Schummer
Visual FoxPro MVP
I would encourage you to take another look at the MS Sync Framework. We have a situation where we want to synchronize occasionally connected C# client apps with our Java/Oracle backend. You can use the Sync Framework providers for the C# client and implement your own custom subclass of KnowledgeSyncProvider for the backend. This will get you halfway there, and show you a good pattern to apply for the rest.