I've been working for a new client for two months on BI reporting in Oracle BI Publisher within Oracle Cloud CRM and HCM. To query the Oracle Cloud database I have to write my SQL in Notepad++ and then copy/paste it into the query text area in BI Publisher (which is a plain text area with no syntax checking or highlighting). I was wondering if there are some experts who might know a way to connect SQL Developer to the Oracle Cloud database so I can run my queries directly without copy/pasting in the browser.
Thank you very much.
There's no direct access to the underlying Oracle Database from your SaaS subscription.
We (the SQL Developer team) are working with a few of the SaaS business owners to make SQL Developer Web available for their subscribers. This would allow you to run queries directly against your database without having to do the copy/paste jump you're doing today in BI Publisher.
I cannot provide guidance on when this will happen or even if your particular services will make it available.
The TL;DR answer to your question is 'No, but we are working on it'
Related
I need to connect Salesforce to an external database we have and keep both the database and Salesforce updated in as close to real time as we can get. I have tried Google searching for possible solutions, but nearly all of them are outdated by over a year. Any ideas?
Thank You!
It is quite difficult to give you a proper answer without knowing your exact scenario.
However, off the top of my head, I would suggest two Salesforce products.
Salesforce Connect
https://www.salesforce.com/products/platform/products/salesforce-connect/
Salesforce Connect allows you to connect to various data sources (for example MySQL, Microsoft SQL Server, Oracle, etc.) and expose the tables/objects of that data source as SObjects. There are limitations, so it would be better to talk to a Certified Architect about such an implementation.
Heroku Connect
https://www.heroku.com/connect
Heroku Connect allows you to sync a Heroku data source with a Salesforce object. The sync is not immediate, but there are quite a few customisations inside the product to make the sync as "live" as possible. There are limitations, so again it would be better to talk to a Certified Architect about such an implementation.
Salesforce Connect has limitations. It's good for presenting data via the interface, but if you need to act on the data or report on it, it might not be the best bet.
For close-to-real-time hand-coded sync, look at the Streaming API or Salesforce Platform Events.
If you want to use an ETL tool, my organization has had decent luck with DBAmp, a SQL Server add-on product that is fairly inexpensive compared to a lot of ETL tools ($1,625 annually): http://www.forceamp.com/ We're able to replicate the entire Salesforce database offline in SQL Server with DBAmp, push changes to the offline SQL copy, and upsert changes. It's also a good backup solution via an offline full data copy. We got very good support from them as well when we encountered challenges.
Hope this helps.
Not sure if you are syncing one object or multiple objects, but you have a few options.
You can try Salesforce Connect, a Salesforce-provided feature which allows you to view and update data from your external source in Salesforce, but there are limitations with reporting and other considerations to keep in mind.
If you make use of Heroku, Heroku Connect is your best bet
You can also use a middleware/ESB solution like MuleSoft, which can orchestrate keeping data in sync across multiple data sources and do batch loads, but depending on how often changes occur you will want to keep an eye on API limits for inbound calls to Salesforce.
You can roll your own solution, using Outbound Messages in workflow (or triggers that initiate an Apex class that calls out, but that is more cumbersome because you have to build the custom error handling and retry logic you get for free with Outbound Messages) to send changes from Salesforce to a homegrown service that writes to your database, and have that service write back to Salesforce using the SOAP or REST API (see the sketch after this list). That would probably take you some time to build, and you would still need to be aware of API limits depending on how many updates are made on the non-Salesforce side.
You can create a Canvas App which displays data from your DB in Salesforce as a tab and hook it up via SSO so users are automatically logged in. But again, there would be no reporting or other Salesforce features you could take advantage of.
But I really think you should spend some time determining which system is your source of truth, because that determines how the data should be synced. You should also investigate whether you really need the sync to be real time or near real time, or whether you can manage with something like an hourly true-up on the system that is not the source of truth.
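For the roll-your-own option above, here is a minimal sketch of the write-back half using the Salesforce REST API. It assumes you already hold a valid OAuth access token and instance URL; the Account object, the external ID field Legacy_Id__c, and the API version are placeholders, not anything your org will necessarily have.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sketch only: pushes one change back to Salesforce with an upsert by external ID.
// Assumptions: a valid OAuth access token and instance URL are already available,
// and Legacy_Id__c is a hypothetical external ID field on Account.
class SalesforceWriteBack
{
    static async Task Main()
    {
        var instanceUrl = "https://yourInstance.my.salesforce.com"; // placeholder
        var accessToken = "REPLACE_WITH_TOKEN";                     // placeholder

        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Add("Authorization", "Bearer " + accessToken);

            // Upsert by external ID: PATCH /services/data/vXX.X/sobjects/<Object>/<ExtIdField>/<value>
            var url = instanceUrl + "/services/data/v52.0/sobjects/Account/Legacy_Id__c/ACME-001";
            var body = new StringContent("{\"Name\":\"Acme Corp\"}", Encoding.UTF8, "application/json");

            var request = new HttpRequestMessage(new HttpMethod("PATCH"), url) { Content = body };
            var response = await http.SendAsync(request);

            // 201 = record created, 204 = existing record updated.
            Console.WriteLine((int)response.StatusCode);
        }
    }
}

Keep in mind that every call like this counts against the org's API limits mentioned above.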
A guy built an Oracle-based web forms application and set up an Oracle database on my computer three years ago.
Now I want to put my database online
so that I can access it from my two computers.
I do not even know the basics of Oracle.
Could anyone tell me how to do this?
Someone told me I have to do it via a live IP, a dedicated server, or a VPS.
Could anyone give me an idea of where I should start?
Should I buy a dedicated server or a VPS, or is there another way to do this?
Ideally, I'd like to use Azure table storage as the provider, but SQL Azure will also work. Everything I've dug up so far is over a year old and uses deprecated approaches, i.e. outdated code samples, SDKs, and IDEs.
As the title states, this would be applied to an MVC2 app running in Azure. Examples, code, links, etc. do not necessarily have to be for MVC. Anything related to a .Net 4.0 web app using Forms Authentication on Azure will do.
Microsoft originally released a set of sample providers with the PDC08 SDK - but these definitely are not recommended for commercial use.
Recently this project has produced some new ones - http://azureproviders.codeplex.com/ - I'd recommend going with that one as it is "live code" - you might also be able to contribute something back to it.
If you do use these providers, please be aware that Azure charges per transaction - at a base rate of $0.01 per 10,000 transactions - and that the logic within these providers can cause "quite a few" transactions to occur. So if your site is busy and has a lot of membership activity, it could work out quite expensive to operate.
If you are using SQL Azure membership, then the membership SQL is standard - http://support.microsoft.com/kb/2006191 - the only difference in the ASP.NET SQL scripts is in the session storage (since session state uses SQL Agent to clear expired sessions, and SQL Agent is not supported on SQL Azure).
Personally, I've used Table storage for test/demo sites, but for anything "real" I've moved towards SQL Azure - it's easier to query, to run reports against, to back up, etc.
Unfortunately, unless you roll your own provider, the only sample I have seen is the outdated one you mentioned. For user authentication (RoleProvider), it is not too bad (i.e. no bugs I have heard about). However, for Session state, it has some issues. I don't think it does any sort of encryption, so the passwords might be in plaintext. Worst-case scenario, you could at least use it as a starting point for your own.
A quick look around and I can't even find the 'Additional Samples' anymore. They might have been lost when Code Gallery did an update a while back. I know it is still used in http://phluffyfotos.codeplex.com, so you could pull it from the source there at least.
I would not use ATS (Azure Table Storage) Forms authentication because of the associated transaction cost if your site is going to have a lot of authentication requests (even token authorization requires a check against ATS).
I would use Forms Authentication against SQL Azure with the standard SqlMembershipProvider.
It works just fine. I've manually migrated the necessary aspnet tables and stored procs over to SQL Azure from a local SQL Server instance without problems. Just update the aspnet_schemaversions table to have this content:
Feature          CompatibleSchemaVersion  IsCurrentVersion
common           1                        1
membership       1                        1
personalization  1                        1
profile          1                        1
role manager     1                        1
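Once those rows are in place, the provider is used through the normal ASP.NET Membership and Forms Authentication APIs. A minimal sketch, assuming web.config registers the SqlMembershipProvider with a connection string that points at the SQL Azure database holding the aspnet_* tables (and sets requiresQuestionAndAnswer="false"):

using System.Web.Security;

// Minimal login/registration sketch using the standard SqlMembershipProvider.
// Assumes web.config wires the provider to a SQL Azure connection string and
// sets requiresQuestionAndAnswer="false".
public static class LoginHelper
{
    public static bool SignIn(string userName, string password, bool rememberMe)
    {
        // Validates the credentials against the aspnet_Membership tables.
        if (!Membership.ValidateUser(userName, password))
            return false;

        // Issues the Forms Authentication cookie exactly as it would on-premises.
        FormsAuthentication.SetAuthCookie(userName, rememberMe);
        return true;
    }

    public static MembershipUser Register(string userName, string password, string email)
    {
        MembershipCreateStatus status;
        MembershipUser user = Membership.CreateUser(
            userName, password, email,
            null,   // password question (not used)
            null,   // password answer (not used)
            true,   // approved immediately
            null,   // provider generates the user key
            out status);
        return status == MembershipCreateStatus.Success ? user : null;
    }
}

Nothing in this code is Azure-specific, which is the point: only the connection string changes.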
I'm new to Microsoft Entity Framework and wonder if it's possible to use this framework with DB2 on an iSeries AS/400. Are there any problems at all when working with this kind of "legacy system" and EF?
You can use Entity Framework to connect to an ISeries DB2 database one of three ways:
1. Purchase the license for IBM's DB2 Connect product. The license is around $12,000, which is outrageous. Also, there is not enough good documentation on how the DB2 Connect product actually works, how it installs, or its possible benefits. I contacted one of their resellers to get a test install and it was not an intuitive process, so we never purchased the product. Likewise, there don't appear to be any demos. I don't understand how IBM can have one of the best servers available yet not bend over backwards to make their product accessible to Microsoft developers.
That said, if you research this topic you will find plenty of information saying that you can use Entity Framework with their ADO.NET data server provider. I went down this path, but I will warn you that the data server provider only works when DB2 Connect is installed. This is confusing because IBM advertises the ability but rarely shows the direct correlation between the two products, so you're often left thinking it will work without DB2 Connect, which it will not at this time. (A quick sanity check for the provider is sketched after this list.)
2. Purchase a third-party data provider designed for Entity Framework. Progress Software's DataDirect is super easy to use. I don't like how their server licensing works, though, because their server license is per core processor on your app server. This is flawed because it assumes you only have one app server, when in reality people regularly need several app servers for load balancing. I would prefer they just license per iSeries server. However, you can purchase licenses for each individual user if you decide to do that.
3. Write your own data provider. This is possible but there are obvious drawbacks.
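Whichever provider you end up with, a quick way to verify the IBM side before layering Entity Framework on top is a plain ADO.NET query through DbProviderFactories. The invariant name "IBM.Data.DB2" and the connection-string keys below are my assumptions about IBM's provider (it is only registered once DB2 Connect is installed), and the host, port, and credentials are placeholders.

using System;
using System.Data.Common;

// Sanity-check sketch: confirm IBM's ADO.NET provider is installed and can reach
// the iSeries before generating an EF model against it.
// Assumption: "IBM.Data.DB2" is the provider invariant name registered by DB2 Connect.
class Db2ProviderCheck
{
    static void Main()
    {
        DbProviderFactory factory = DbProviderFactories.GetFactory("IBM.Data.DB2");

        using (DbConnection conn = factory.CreateConnection())
        {
            conn.ConnectionString =
                "Server=myiseries.example.com:446;Database=MYLIB;UID=MYUSER;PWD=secret;";
            conn.Open();

            using (DbCommand cmd = conn.CreateCommand())
            {
                // SYSIBM.SYSDUMMY1 is DB2's built-in one-row dummy table.
                cmd.CommandText = "SELECT CURRENT TIMESTAMP FROM SYSIBM.SYSDUMMY1";
                Console.WriteLine("Connected, server time: {0}", cmd.ExecuteScalar());
            }
        }
    }
}

Going through DbProviderFactories keeps the check provider-agnostic, so the same code works if you switch to a third-party provider with a different invariant name.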
It sort of looks like it is - see here:
http://publib.boulder.ibm.com/infocenter/db2luw/v9r5/topic/com.ibm.swg.im.dbclient.adonet.doc/doc/c0054118.html?resultof=%22%65%6e%74%69%74%79%22%20%22%65%6e%74%69%74%69%22%20%22%66%72%61%6d%65%77%6f%72%6b%22%20
but quite honestly, I'm not 100% sure - especially not if it supports EF 4.0 (yet).
Or check out the IBM DB2Connect site and search from there...
I'm looking for advice and suggestions on how to synchronise data between two databases.
The first database is a SQL Server 2008 Express database that runs on disconnected laptops (no network or internet access). The second (main) database is a VFP 9.0 database that runs on a server.
When users connect their laptops to the network, I want the synchronisation process to run.
Other than the different database engines, I have the following items to take into account:
The tables don't necessarily have the same structure
The primary keys are not the same (a GUID in SQL Server and often a combination of character fields in VFP)
Synchronisation of the tables must be done in a certain order to respect the parent-child relationships
On some inserts on the SQL Server side, a new primary key must be generated and synchronised to the VFP table
A bunch of validations must be made, and some feedback from the user is sometimes needed
Not all records need to be synchronised
Some records on the SQL Server side need to be deleted after the synchronisation
Deleted records from both sides need to be taken into account
Modifications to the VFP database must be kept to a minimum
There are probably other points I'm forgetting now, but I think you get the idea of the challenge I face. My guess right now is that I will need to build a custom synchronisation module, but I want your input before I go on, in case I overlooked some options, and to get some tips on how to approach this.
I looked rapidly at Microsoft Sync Framework, but with all the restrictions I have and the fact that there is no VFP client already built (AFAIK), I don't think it will be of great help.
Thanks in advance for your feedback.
Update: The laptop application is a C# WinForm application and is using SQL Server 2008 Express.
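To make the discussion concrete, this is the rough skeleton I have in mind for the custom module on the C# side; the connection strings, table names, and columns below are placeholders, and the VFP side is reached through the VFP OLE DB provider.

using System.Data.OleDb;
using System.Data.SqlClient;

// Rough skeleton only: pull pending rows from the local SQL Server Express database
// and push them to the VFP tables on the server via the VFP OLE DB provider.
// Connection strings, table names, and columns are placeholders.
class SyncSketch
{
    static void Main()
    {
        using (var sql = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=LaptopDb;Integrated Security=True"))
        using (var vfp = new OleDbConnection(@"Provider=VFPOLEDB.1;Data Source=\\server\share\maindata\"))
        {
            sql.Open();
            vfp.Open();

            // 1. Read rows flagged as pending on the laptop, parents before children.
            var pending = new SqlCommand(
                "SELECT Id, Name FROM dbo.Customer WHERE SyncPending = 1", sql);

            using (var reader = pending.ExecuteReader())
            {
                while (reader.Read())
                {
                    // 2. Push each row into the corresponding VFP table.
                    //    Key mapping/generation and validation would happen here.
                    using (var insert = new OleDbCommand(
                        "INSERT INTO customer (cust_id, cust_name) VALUES (?, ?)", vfp))
                    {
                        insert.Parameters.AddWithValue("p1", reader.GetGuid(0).ToString());
                        insert.Parameters.AddWithValue("p2", reader.GetString(1));
                        insert.ExecuteNonQuery();
                    }
                }
            }
        }
    }
}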
The complexity of the situation and requirements leads me to believe you need to write a Visual FoxPro application. Visual FoxPro connects to SQL Server 2008 data easily. The complexity of the code lies in matching the requirements and identifying the data that needs to be synched, not in the syntax. Visual FoxPro's strength is in its data manipulation language and its ability to connect to almost any data source (native DBFs, ODBC, ADO, and XML).
SQL Server can read VFP 9 data via the VFP 9 OLE DB driver. You could write T-SQL stored procedures to get to the VFP data. Not sure how it would recognize the laptop being connected to the network though.
Another approach is to use SQL Server XML Diffgrams. I am not an expert by any stretch of the imagination on this approach, but it would be something you can research.
Since my expertise is with Visual FoxPro I would find it way easier to go the other way though, but that is just me. You have to go with the skillset of the resources you have for the project.
VFP reads and writes SQL Server data via a connection (DSN, ConnectionString) and any technique involving SQL Passthrough (SQLConnect(), SQLExec() and SQLDisconnect()), CursorAdapters, Remote Views, or a combination of the three.
A Visual FoxPro program can also recognize Windows events like connecting to a network. The application could be installed on each laptop and left running to recognize the Windows event. Once the event is raised, the application can attempt to connect to the SQL Server database (it is possible the laptop is connecting to a network where the SQL Server is not available, or to a different network).
Once connected it runs the logic to check and synchronize the databases.
Sounds like you don't have a lot of control over the application writing to the VFP 9 data on the server. If you do have control over that application, you might consider changing it to write to a SQL Server instance instead, and then you can use SQL Server replication to manage the synchronization. Not a trivial task, though, and SQL Server replication, while getting better with each release, does cause hair loss in DBAs. Definitely a lot of work going this route.
Rick Schummer
Visual FoxPro MVP
I would encourage you to take another look at the Microsoft Sync Framework. We have a situation where we want to synchronize occasionally connected C# client apps with our Java/Oracle backend. You can use the Sync Framework providers for the C# client and implement your own custom subclass of KnowledgeSyncProvider for the backend. This will get you halfway there and show you a good pattern to apply for the rest.