I host my database on Azure. I would like to search data in a table in that database. I am trying to use B4i, and their tech support said I need to use REST APIs. I am pretty sure I need to use OData. I have the auth token, but I am not sure if this is even possible.
To query Azure SQL with an API, you need to add a layer between the database and the consumer. As mentioned in this question, OData is a specification that can be implemented fairly easily, as there are plenty of libraries that will take care of the bulk of the code for you.
As far as where to host the API, you have several options within Azure, the most common being App Services, Azure Functions, and Logic Apps.
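As a rough illustration of such a layer (a plain REST endpoint rather than a full OData implementation), here is a minimal sketch using Flask and pyodbc; the connection string, table, and column names are placeholders for your own:

    # Minimal sketch of an API layer over Azure SQL. Assumes Flask and
    # pyodbc are installed and the Azure SQL firewall allows the caller.
    # Connection string, table, and column names are placeholders.
    import pyodbc
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    CONN_STR = (
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:yourserver.database.windows.net,1433;"
        "Database=yourdb;Uid=youruser;Pwd=yourpassword;Encrypt=yes;"
    )

    @app.route("/products")
    def search_products():
        # e.g. GET /products?name=widget
        name = request.args.get("name", "")
        with pyodbc.connect(CONN_STR) as conn:
            cursor = conn.cursor()
            # Parameterized query to avoid SQL injection
            cursor.execute(
                "SELECT Id, Name FROM Products WHERE Name LIKE ?",
                (f"%{name}%",),
            )
            rows = [{"id": r.Id, "name": r.Name} for r in cursor.fetchall()]
        return jsonify(rows)

    if __name__ == "__main__":
        app.run()

Hosted on App Services or wrapped in an Azure Function, an endpoint like this can then be called from a client such as B4i over HTTPS with your auth token.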
I have a working app I made using Amplify (with AppSync API and Cognito). I'd like to make another app which is different, but shares some data with my first project (same product, but different targets, usages and security rules).
Is there a clean way to use Amplify for this new project, telling the GraphQL API to fetch some data from the DynamoDB of my first Amplify project?
This data will change often and will be heavy, so I'm not keen on any synchronization solution.
I have thought about these solutions, but I'm not experienced enough to tell whether any of them is good:
Not using Amplify but SAM for this new project (but then I would lose the whole build pipeline provided by Amplify)
Using Amplify for the hosting and the auth, but configuring AppSync with SAM and plugging it into my existing data source
Maybe CloudFormation is the answer, but I don't see how to interact with it directly from within Amplify
There is an article from Amazon about microservice architectures that might be useful for you:
https://aws.amazon.com/blogs/mobile/appsync-microservices/
I'm not sure whether Amplify supports adding multiple APIs to one project; if it does, you could add the existing API and pull only that API into the new project.
IMO the easiest approach would be to create a query endpoint that fetches the data from your other data source using a Lambda function.
So, in that case, you would edit your schema to something like this:
    type Query {
      externalData: [ExternalData] @function(name: "getExternalData")
    }
Then you need to add the Lambda function getExternalData, which will be responsible for querying the data as needed.
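As a hypothetical sketch (the handler below is not from the article), the Python-runtime version of that function could look like this, assuming the first project's DynamoDB table name is supplied through an environment variable:

    # Hypothetical sketch of the getExternalData Lambda behind the
    # @function directive. Assumes a Python runtime and that the first
    # project's DynamoDB table name is set in EXTERNAL_TABLE_NAME.
    import os
    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(os.environ["EXTERNAL_TABLE_NAME"])

    def handler(event, context):
        # AppSync invokes the function with the GraphQL request details
        # in `event`; here we simply scan the shared table and return
        # the items, which AppSync maps onto [ExternalData].
        response = table.scan()
        return response.get("Items", [])

A scan is shown only for brevity; for heavy, frequently changing data you would query on a key instead.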
The article above has more in-depth details about this architecture.
I have a web application that has to be linked to a graph database (Neo4j). Is it possible to read or write data to Neo4j using Appery?
I have chosen Appery because I am a beginner when it comes to databases, Appery seems easy to use with REST APIs, and there is a free trial.
Feedback would be very helpful. Thanks in advance.
Edit: I am aware that Neo4j uses Cypher queries. I would like to know whether Appery supports Cypher as well.
Side note: the reason I am asking here without trying it out is that I don't have an active DB, and my application is private due to my company's security policy.
You can do that as long as the Neo4j database has a REST API. If it does, then you can make calls to it from an Appery app (from Server Code or API Express). Hope this helps.
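Neo4j does expose Cypher over a transactional HTTP endpoint, so the call is an ordinary REST request. A sketch of the request shape (shown in Python for brevity; Appery Server Code is JavaScript, but the URL, auth, and JSON payload are the same; the endpoint path below is the Neo4j 4.x+ form, and host, credentials, and query are placeholders):

    # Sketch of running a Cypher query through Neo4j's transactional
    # HTTP endpoint. The path varies by Neo4j version (this is the
    # 4.x+ form); host, credentials, and the query are placeholders.
    import requests

    NEO4J_URL = "http://localhost:7474/db/neo4j/tx/commit"

    payload = {
        "statements": [
            {"statement": "MATCH (p:Person) RETURN p.name LIMIT 10"}
        ]
    }

    resp = requests.post(NEO4J_URL, json=payload, auth=("neo4j", "password"))
    resp.raise_for_status()
    for result in resp.json()["results"]:
        for row in result["data"]:
            print(row["row"])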
I need to connect Salesforce to an external database we have, and constantly keep both the database and Salesforce updated in as close to real time as we can get. I have tried Google searching for possible solutions, but nearly all of them are outdated by over a year. Any ideas?
Thank you!
Without knowing your exact scenario, it is difficult to give you a proper answer.
However, off the top of my head, I would suggest two Salesforce products.
Salesforce Connect
https://www.salesforce.com/products/platform/products/salesforce-connect/
Salesforce Connect allows you to connect to various data sources, such as MySQL, Microsoft SQL Server, and Oracle, and turn the tables/objects of that data source into SObjects. There are limitations, so it would be better to talk to a Certified Architect about such an implementation.
Heroku Connect
https://www.heroku.com/connect
Heroku Connect allows you to sync a Heroku data source with a Salesforce object. The sync is not immediate, but there are quite a few customisations inside the product to make it as "live" as possible. Again, there are limitations, so it would be better to talk to a Certified Architect about such an implementation.
Salesforce Connect has limitations. It's good for presenting data via the interface, but if you need to act on the data and report on it, it might not be the best bet.
For close-to-real-time hand-coded sync, look at the Streaming API, or use Salesforce Platform Events.
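As an illustration of the Platform Events route, here is a hedged sketch of pushing an external change into Salesforce by publishing an event through the REST API; the event object Record_Change__e and its Payload__c field are hypothetical names you would define in Setup first:

    # Hypothetical sketch: publish a Platform Event via the Salesforce
    # REST API so an external change reaches Salesforce in near real
    # time. Record_Change__e / Payload__c are made-up names; the
    # instance URL and OAuth token are placeholders.
    import requests

    INSTANCE = "https://yourinstance.my.salesforce.com"
    ACCESS_TOKEN = "00D...your_oauth_token"

    resp = requests.post(
        f"{INSTANCE}/services/data/v57.0/sobjects/Record_Change__e/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"Payload__c": '{"table": "orders", "id": 42, "op": "update"}'},
    )
    resp.raise_for_status()

A trigger or flow subscribed to the event on the Salesforce side then applies the change.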
If you want to use an ETL tool, my organization has had decent luck with DBAmp, a SQL add-on product that is fairly inexpensive compared to a lot of ETL tools ($1,625 annually): http://www.forceamp.com/ We're able to replicate the entire SF database offline in SQL with DBAmp, push changes to the offline SQL copy, and upsert changes. It's also a good backup solution via an offline full data copy. We got very good support from them as well when we encountered challenges.
Hope this helps.
Not sure if you are syncing one object or multiple objects, but there are a few options available to you.
You can try the Salesforce-provided feature, Salesforce Connect, which allows you to view and update data from your external source in Salesforce, but there are limitations with reporting and other considerations to weigh.
If you make use of Heroku, Heroku Connect is your best bet.
You can also use a middleware ESB solution like MuleSoft, which can orchestrate keeping data in sync across multiple data sources and do batch loads; but depending on how often your data changes, keep an eye on the API limits for inbound calls to Salesforce.
You can roll your own solution, using Outbound Messages in a workflow to send changes from Salesforce to your homegrown service, which writes to your database (or triggers that initiate an Apex class that calls out, but that is more cumbersome, since you have to build the custom error handling and retry logic you get for free with Outbound Messages); your homegrown solution then writes back to Salesforce using the SOAP or REST API, as in the sketch after this list. That would probably take you some time to build, and you would still need to watch API limits depending on how many updates are made on the non-Salesforce side.
You can create a Canvas app which displays data from your DB in Salesforce as a tab and hook it up via SSO so users are automatically logged in. But again, there would be no reporting or other Salesforce features to take advantage of.
But I really think you should spend some time determining which system is your source of truth, because that determines how the data should be synced. You should also investigate whether you really need the sync to be real time or near real time, or whether you can manage with something like an hourly true-up on the system that is not the source of truth.
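For the roll-your-own option above, here is a sketch of the write-back half using the REST API's upsert-by-external-ID call; the object, external ID field (External_DB_Id__c), and token are hypothetical placeholders:

    # Hypothetical sketch of writing changes back to Salesforce with
    # the REST API. PATCH against an external ID field performs an
    # insert-or-update, which keeps retries idempotent.
    import requests

    INSTANCE = "https://yourinstance.my.salesforce.com"
    ACCESS_TOKEN = "00D...your_oauth_token"

    def upsert_account(external_id: str, fields: dict) -> None:
        resp = requests.patch(
            f"{INSTANCE}/services/data/v57.0/sobjects/Account/"
            f"External_DB_Id__c/{external_id}",
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            json=fields,
        )
        resp.raise_for_status()

    upsert_account("db-row-42", {"Name": "Acme Corp", "Phone": "555-0100"})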
I have some tables in a Postgres database that I want to integrate into Salesforce as external objects. I went through some video tutorials and documentation where I was recommended to use Salesforce Connect, which supports providers with OData protocol support. Is it possible to integrate Postgres tables into Salesforce as external objects without Salesforce Connect?
Thanks.
Be careful with the phrase "external objects". To me, those particular words imply the specific implementation of external data access/federation delivered with Salesforce Connect. I don't believe there is any alternative if your goal is to create "real" external objects (named "objectname__x") within Salesforce.
There are, though, Salesforce integration solutions from the likes of Progress, Jitterbit, MuleSoft, Informatica, and others that can be used to access PostgreSQL, with varying degrees of coding required. You won't get "external objects", but you will be able to access data residing off-cloud in a PostgreSQL database from your Salesforce system.
Hope this helps.
Currently, the way to integrate data from external storage (Postgres in your case) without Salesforce Connect is to implement your own synchronization logic using the REST or SOAP API, Apex classes and triggers, and Salesforce Workflows and Flows. You will also need to implement appropriate interfaces on the side of your data storage. The complexity of all these steps depends on the complexity of your existing data model and the infrastructure around it.
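For illustration, a minimal sketch of the polling half of such custom logic, assuming (hypothetically) a Postgres table with an updated_at column to use as a change watermark; table, columns, and connection details are placeholders:

    # Sketch of polling Postgres for rows changed since a watermark.
    # Table, columns, and connection details are placeholders.
    import psycopg2

    conn = psycopg2.connect(
        "dbname=mydb user=myuser password=secret host=localhost"
    )

    def fetch_changes(since):
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, name, updated_at FROM products "
                "WHERE updated_at > %s ORDER BY updated_at",
                (since,),
            )
            return cur.fetchall()

    # Each returned row would then be pushed to Salesforce via the REST
    # or SOAP API (for example, an upsert keyed on an external ID field).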
From MongoLab's documentation, they recommend:
MongoLab databases can be accessed by your application code in two ways.
The first method - the one we strongly recommend - is to connect using one of the MongoDB drivers (as described above). You do not need to use our API if you use the driver.
The second method, which you should use only if you cannot connect via one of the MongoDB drivers, is via MongoLab’s RESTful data API.
Why do they recommend using the driver rather than their REST API? One reason I can think of is portability across different MongoDB providers. Are there any other reasons? Wouldn't it be more beneficial for MongoLab to "vendor lock-in" customers with their API?
The points that @WiredPrairie and @Stennie brought up around security are correct. When you use our REST API, you expose your API key to the client. Currently, anyone with the API key can modify your database. As a result, we only recommend using the REST API with public data, e.g. all the locations for taco trucks in the country.
By writing your own app tier, you can keep credentials to your database from being exposed to the client.
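For example, a minimal sketch of such an app tier, using the MongoDB driver (pymongo) behind a small Flask endpoint; the connection URI, database, and collection names are placeholders:

    # Minimal sketch of an app tier: the MongoDB connection string
    # stays on the server, and clients can only reach the narrow
    # endpoint exposed. URI, database, and collection names are
    # placeholders.
    from flask import Flask, jsonify
    from pymongo import MongoClient

    app = Flask(__name__)
    client = MongoClient("mongodb://user:password@ds012345.mongolab.com:12345/mydb")
    trucks = client["mydb"]["taco_trucks"]

    @app.route("/trucks/<city>")
    def trucks_in_city(city):
        # Only this query shape is reachable from clients; the driver
        # credentials are never exposed.
        docs = trucks.find({"city": city}, {"_id": 0, "name": 1, "location": 1})
        return jsonify(list(docs))

    if __name__ == "__main__":
        app.run()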
If you have any more questions, email us at support@mongolab.com. Happy to help!
-Chris@MongoLab
p.s. thanks @WiredPrairie and @Stennie