We'd like to allow our users to make changes to our ERP (under only very limited conditions) from within Smartsheet. Is there a way to trigger a web service call based on changes to a cell?
It's certainly possible to do what you've described (provided, of course, that your ERP supports inbound web service calls to update data there).
If you're wanting to do this programmatically (i.e., by writing a script in Python or any other supported language), you can use the Smartsheet API to create one or more webhooks that will monitor Smartsheet for the type of changes you specify, and send your integration notifications when those events occur. Your integration would then listen for those inbound event notifications, and when they are received, it would programmatically issue the appropriate web service call(s) to update data in the ERP.
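For illustration, here's a minimal Python sketch of the webhook-creation step using the requests library. The token, sheet ID, and callback URL are placeholders, and a real integration would add error handling:

import requests

SMARTSHEET_TOKEN = "YOUR_API_TOKEN"                     # placeholder
SHEET_ID = 1234567890                                   # placeholder sheet ID
CALLBACK_URL = "https://example.com/smartsheet-events"  # your integration's HTTPS endpoint

headers = {"Authorization": f"Bearer {SMARTSHEET_TOKEN}"}

# Create a webhook that watches the sheet for all event types.
resp = requests.post(
    "https://api.smartsheet.com/2.0/webhooks",
    headers=headers,
    json={
        "name": "ERP sync hook",
        "callbackUrl": CALLBACK_URL,
        "scope": "sheet",
        "scopeObjectId": SHEET_ID,
        "events": ["*.*"],
        "version": 1,
    },
)
webhook_id = resp.json()["result"]["id"]

# Webhooks start out disabled; enabling one triggers a verification
# request that your callback endpoint must echo back before events flow.
requests.put(
    f"https://api.smartsheet.com/2.0/webhooks/{webhook_id}",
    headers=headers,
    json={"enabled": True},
)

Your endpoint would then inspect each incoming event notification and issue the appropriate call(s) to the ERP.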
It's also worth mentioning that, depending on what ERP you're using, it may be possible for you to accomplish your goal without having to write any code. If you're interested in exploring the feasibility of that approach, I'd suggest you check out products like Zapier, Power Automate, etc. to see if they offer a connector for Smartsheet and your ERP. You may also want to check out the Smartsheet Connectors and Integrations page to see if there's a connector for your ERP listed there.
I've been trying to implement a way to download all the changes made by a particular user in Salesforce using a PowerShell script and create a package. The changes could be anything, added or modified: Apex classes, profiles, Accounts, etc., filtered by modified-by user, component ID, timestamp, and so on. Below is the URL that documents the API, but it does not explain any way to do this from a script.
https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_listmetadata.htm
Does anyone know how I can implement this?
Regards,
Kramer
Salesforce orgs other than scratch orgs do not currently provide source tracking, the feature that makes it possible to pinpoint user changes in metadata and extract only those changes. Source tracking is consumed by an SFDX/Metadata API client, like Salesforce DX or CumulusCI (disclaimer: I'm on the CumulusCI team).
I would not try to implement a Metadata API client in PowerShell; instead, harness one of the existing tools to do so.
Salesforce orgs other than scratch orgs don't provide source tracking at present. To identify user changes, you can either
Attempt to extract all metadata and diff it against your version control, which is considerably harder than it sounds and is implemented by a variety of commercial DevOps tools for Salesforce (GearSet, Copado, etc).
Have the user manually add components to a Change Set or Unmanaged Package, and use a Metadata API client as above to retrieve the contents of that package; see the sketch below. (Little-known fact: a Change Set can be retrieved as a package!)
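As a rough sketch of that second option, you can drive an existing Metadata API client (here the sfdx CLI, assumed to be installed and already authenticated) from a script; the Change Set name and org alias below are hypothetical, and the same command is just as easy to invoke from PowerShell:

import subprocess

# Retrieve the contents of a named Change Set / Unmanaged Package through
# the Metadata API via the sfdx CLI (package name and alias are made up).
subprocess.run(
    [
        "sfdx", "force:mdapi:retrieve",
        "-p", "My Change Set",   # packagenames: the Change Set's name
        "-r", "./retrieved",     # directory where the result .zip is saved
        "-u", "my-org-alias",    # authenticated org alias
    ],
    check=True,
)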
To emphasize: DevOps on Salesforce does not work like other platforms. Working with the Metadata API requires a fair amount of time investment and specialization. Harness the existing work of the Salesforce community where you can, but be aware that the task you are laying out may be rather more involved than you think, and it's not necessarily something you can just throw together from off-the-shelf components.
I need to connect Salesforce to an external database we have, and constantly keep both the database and Salesforce updated in as close to real time as we can get. I have tried Google searching possible solutions, but nearly all of them are outdated by over a year. Any ideas?
Thank You!
Without knowing your exact scenario, it is difficult to give you a precise answer.
However, off the top of my head, I would suggest two Salesforce products.
Salesforce Connect
https://www.salesforce.com/products/platform/products/salesforce-connect/
Salesforce Connect allows you to connect to various external data sources (for example MySQL, Microsoft SQL Server, Oracle) and surface the tables/objects of that data source as SObjects. There are limitations, and thus it would be better to talk to a Certified Architect about such an implementation.
Heroku Connect
https://www.heroku.com/connect
Heroku Connect allows you to connect a Heroku data source with a Salesforce Object. The sync is not immediate but there are quite a few customisations inside the product to make the sync as "live" as possible. There are limitations and thus it would be better to talk to a Certified Architect about such an implementation.
Salesforce Connect has limitations. It's good for presenting data via the interface, but if you need to act on the data and report on it, it might not be the best bet.
For close-to-real-time hand-coded sync, look at the Streaming API or at Salesforce Platform Events.
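As a sketch of the Platform Events route, the external side can publish an event over the REST API; the event name, fields, instance URL, and token below are all assumptions for illustration:

import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # assumption
ACCESS_TOKEN = "00D...your_session_token"                # obtained via OAuth

# Publish a hypothetical platform event describing an external DB change;
# a subscriber (flow, trigger, or CometD client) picks it up in near real time.
resp = requests.post(
    f"{INSTANCE_URL}/services/data/v52.0/sobjects/DB_Change__e",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"Record_Id__c": "12345", "Payload__c": '{"status": "updated"}'},
)
resp.raise_for_status()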
If you want to use an ETL tool, my organization has had decent luck with DBAmp, which is a SQL Server add-on product and fairly inexpensive compared to a lot of ETL tools ($1625 annually): http://www.forceamp.com/ We're able to replicate the entire SF database offline in SQL Server with DBAmp, push changes to the offline SQL copy, and upsert changes. It's also a good backup solution via an offline full data copy. We got very good support from them as well when we encountered challenges.
Hope this helps.
Not sure if you are syncing one object or multiple objects but there are a few options that you have.
You can try the Salesforce-provided feature Salesforce Connect, which allows you to view and update data from your external source in Salesforce, but there are limitations around reporting and other considerations to take into account.
If you make use of Heroku, Heroku Connect is your best bet.
You can also use a middleware/ESB solution like MuleSoft, which can orchestrate keeping data in sync across multiple data sources and do batch loads; but depending on how often changes occur, keep an eye on API limits for inbound calls to Salesforce.
You can roll your own solution using Outbound Messages in workflow (or triggers that initiate an Apex class that calls out, but that is more cumbersome: you have to do custom error handling and retry logic, which you get for free with Outbound Messages) to send changes from Salesforce to a homegrown service that writes to your database, and have that service write back to Salesforce using the SOAP or REST API; a minimal listener sketch follows below. That would probably take you some time to build. You would also still need to be aware of API limits, depending on how many updates are made on the non-Salesforce side.
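To make that concrete, here's a minimal sketch of the receiving end in Python/Flask; the endpoint path is arbitrary, and the XML parsing and database write are left out:

from flask import Flask, request

app = Flask(__name__)

# Salesforce expects this SOAP acknowledgement; without it, the outbound
# message is retried for up to 24 hours.
ACK = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">
      <Ack>true</Ack>
    </notificationsResponse>
  </soapenv:Body>
</soapenv:Envelope>"""

@app.route("/salesforce/outbound", methods=["POST"])
def outbound_message():
    soap_body = request.data.decode("utf-8")
    # Parse the notification XML and upsert into your database here.
    return ACK, 200, {"Content-Type": "text/xml"}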
You could create a Canvas App which displays data from your DB in Salesforce as a tab, hooked up via SSO so users are auto-logged-in. But again, there would be no reporting or other Salesforce features that you can take advantage of.
But I really think you should spend some time determining which system is your source of truth, because that will determine how the data should be synced. You should also investigate whether you really need the sync to be real-time or near-real-time, or whether you can manage with something like an hourly true-up on the system that is not the source of truth.
I am very confused about the correct or recommended mechanism for accessing the Google Fusion Tables APIs in Apps Script. There seem to be two methods with examples but no discussion about which is preferred or why. Is one of these interfaces newer and preferred while the other is dying? Is one obsolete or more restricted in what it can do?
Method 1 is the REST API described here
https://developers.google.com/fusiontables/docs/v2/sql-reference#Select
Method 2 is a set of library functions sort of described here under the Apps Script/Google Advanced Services:
https://developers.google.com/apps-script/advanced/fusion-tables
For example, using the REST API to do a SQL query, we end up with something like this:
function runSQL(sql){
  // URL-encode the SQL so spaces and quotes survive the query string.
  var getDataURL = 'https://www.googleapis.com/fusiontables/v1/query?sql=' + encodeURIComponent(sql);
  // getUrlFetchOptions() (defined elsewhere) supplies the OAuth headers.
  var dataResponse = UrlFetchApp.fetch(getDataURL, getUrlFetchOptions()).getContentText();
  return dataResponse;
}
And using the advanced API we use something like this:
var result = FusionTables.Query.sql(sql, { hdrs: false });
The REST API seems much harder to use, requiring complex OAuth and developer keys to be configured in advance and coded into the application, while the Advanced Services API handles all this behind the scenes and makes for simple API calls like the one shown here.
I have seen numerous examples using each of the above with no hint as to why one author chose her mechanism instead of the other.
Your help is greatly appreciated.
The service within Apps Script is a work in progress, so the full functionality of the API might not be fully supported at the moment. As you mentioned, though, the big advantage of the service over the REST API is that you do not have to handle the OAuth flow, as you only need to enable it on your script (as stated here).
The Apps Script "advanced service" implementation still lacks some advanced functionality (like alt=media format queries or multipart / resumable uploads) -- if it actually has those features, it lacks extremely basic documentation of them, to the point that the Apps Script editor autocomplete is unaware of them. The tradeoff of these functionality gaps is that you don't need to handle keys, request building, etc.
So if you're doing simple SQL select / importRows work, the Advanced Service should be able to cover almost all your needs. If you need to delete from your Fusion Tables, you might want to consider setting up the REST API: because DELETE removes one record per query, the better way to "delete" is to download what you want to keep, then re-upload it back via replaceRows.
(This worked for me for a while, but eventually what I was keeping outgrew the Apps Script service's limitations and I began receiving Empty Response errors from the call to replaceRows. My remedy was to perform my record maintenance tasks via the REST API, where I can specify resumable uploads, timeouts, etc., while more "normal" interactions are done through the Advanced Service.)
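For reference, that download-then-replaceRows flow looks roughly like this against the v2 REST API (shown in Python with the requests library; the table ID, filter, and token are placeholders):

import requests

TOKEN = "ya29.your_oauth_token"      # OAuth 2.0 access token (placeholder)
TABLE_ID = "1abc...your_table_id"    # placeholder table ID
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Download the rows you want to keep as CSV (alt=media).
keep = requests.get(
    "https://www.googleapis.com/fusiontables/v2/query",
    headers=headers,
    params={"sql": f"SELECT * FROM {TABLE_ID} WHERE Status = 'keep'",
            "alt": "media"},
)

# 2. Replace the table's rows with the kept CSV via a media upload.
requests.post(
    f"https://www.googleapis.com/upload/fusiontables/v2/tables/{TABLE_ID}/replace",
    headers={**headers, "Content-Type": "application/octet-stream"},
    params={"uploadType": "media"},
    data=keep.text.encode("utf-8"),
)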
We are using Integration Manager to create a batch of monthly invoices. I want to build a replacement that creates a batch in GP and imports the invoices into the batch. After review, the batch will be posted to GP. Is this doable with either of these APIs, and which would you choose?
Integration Manager can use eConnect for its insertion engine. If you are processing a high volume of transactions, you will notice a huge difference between Integration Manager's UI engine and eConnect. When you create a new integration, simply choose the eConnect option and whatever data source you have set up.
Concerning the non-IM APIs, both may be used, and they are situational. Web Services sits on top of eConnect, and it is much slower at integrating because you are passing information between several layers. It does provide a secure link between your SQL server and any outside integration sources, and it is ideal if you need to set something up that allows integrations to happen through middleware such as a billing gateway. If you can build an eConnect process/app that connects to your GP SQL server, this is the fastest way to integrate SOP and receivables transactions. eConnect maintains all the business rules, which helps ensure GP does not break as a result of a patch, and the speed is fast enough to push thousands of records without requiring a custom integration solution.
If you want to get done quickly and do not mind working from the Integration Manager interface, just build your integrations using eConnect. If you have the time to develop a custom integration routine, go for eConnect directly. If you want to leverage WCF technology on top of eConnect, go for Web Services.
The options above are listed in order of the development time they will require, from fastest implementation to slowest.
I'm building a web application that will have access to PeopleSoft's database via jdbc.
Is it possible that I can use PeopleSoft's id/password for my custom application, so users accessing my website will not have to have another username/password?
PeopleSoft stores user details in the table PSOPRDEFN.
You will be able to verify the username against: PSOPRDEFN.OPRID.
The password field is: OPERPSWD.
Unfortunately, the hashing function used for this field, hash(), is available only from within PeopleCode.
If you want single sign-on, you should be able to achieve it by customizing the USERMAINT.GBL component, perhaps in its SavePreChange PeopleCode, to save the password into a second field of your choice using a hashing algorithm that you can also implement from JDBC.
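A sketch of what the verification could then look like from your application's side. The column name Z_WEB_PSWD_HASH and the unsalted SHA-256 scheme are pure assumptions for illustration (in practice you'd want a salted scheme); it's shown with Python's DB-API for brevity, and the same query ports directly to JDBC:

import hashlib

def verify_user(conn, oprid, password):
    # Z_WEB_PSWD_HASH is a hypothetical custom field populated by the
    # SavePreChange customization described above; the hash scheme is
    # an assumption for illustration only.
    digest = hashlib.sha256(password.encode("utf-8")).hexdigest()
    cur = conn.cursor()
    cur.execute(
        "SELECT 1 FROM PSOPRDEFN WHERE OPRID = ? AND Z_WEB_PSWD_HASH = ?",
        (oprid, digest),
    )
    return cur.fetchone() is not None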
If you want to reuse PeopleSoft security, you'd need to connect at a higher level than JDBC straight into the database. You could look at a component interface (codeable in Java) or send a SOAP message into PeopleSoft's Integration Gateway; both methods would authenticate you against PeopleSoft using its own security mechanisms.
The old way was to customize psuser.c to your needs and recompile it as a new DLL for use in your program, assuming you're on a Microsoft platform. As mentioned above, you could have a PeopleSoft developer create a component interface (or use the one that is delivered). You can export wrapper Java or C/C++ code from a CI as a template; this code can then be used in an external program to call the CI. One way or the other, you have to interface with PeopleTools to call their password decryption.
Depending on how dynamic your business is (whether you add lots of employees each day), you could export PSOPRDEFN to another database using Application Messaging. On the send, you could encrypt the passwords however you like. But as you can surmise, this would not be real-time.
One thing I remember doing long ago was having a PeopleSoft tech person develop a page whose sole function was to call my Java class, which obtained user IDs/passwords as needed. Once I had them, I was good to go.
You can use psjoa.jar; that way you can sign on via the app server using the same users and passwords as in the PSOPRDEFN table.
PeopleSoft has an LDAP integration capability, but it has to be configured. If you are accessing via a Java wrapper around a component interface, a special account can be set up in PeopleSoft with access only to the underlying component, but the login/password would have to be passed into the component interface. This can be encrypted or sent over HTTPS.
PeopleSoft also has what it calls "row level" security - the ability to partition data sets so that for example your code could only access employee data within a specific business unit or accounting info for a particular line of business. This is all controlled within the PeopleSoft online security application.