I'm trying to fetch all events from the entire system using the REST API, to synchronize them with our own application. So far I have been extracting events for every user by calling the REST API against each user's own calendar file.
For example:
Fetch johndoe.nsf/api/calendar/events
Fetch jasonmartin.nsf/api/calendar/events
Fetch jeanmoore.nsf/api/calendar/events
etc.
It works with a low number of users, but I need to do it for around 2,500 users, which kills my system.
Is there any central database from which I can extract this data?
I tried the resource reservation database, but all I got was an empty response.
No, there is no central database of calendar events. There couldn't be. Notes and Domino is a distributed environment. Information can be spread over dozens of servers.
But you could write a Java or C application that runs on the Domino server and aggregates the information from all the users' calendars into one central database; that application would probably run faster than your remote calls through the REST API. You would still have to make REST API calls into that central database, though, and the sum of the activity would be greater than what you are dealing with now.
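For reference, here is a minimal sketch of the per-user fetching the question describes, done in Python over a shared session with a small worker pool so the server isn't hit with 2,500 back-to-back serial requests. The host, credentials, and user list are hypothetical, and it assumes the calendar service is enabled on each mail database:

```python
# Sketch: concurrent per-user calendar fetches over the Domino REST API.
# Host, credentials, and the user list are placeholders.
import requests
from concurrent.futures import ThreadPoolExecutor

BASE = "https://domino.example.com"              # hypothetical server
USERS = ["johndoe", "jasonmartin", "jeanmoore"]  # in practice ~2,500 entries

session = requests.Session()                     # reuse one authenticated session
session.auth = ("sync-user", "secret")           # hypothetical credentials

def fetch_events(user):
    # Same URL shape as in the question; add date-range parameters
    # (per the Domino Access Services docs) to bound the result set.
    url = f"{BASE}/{user}.nsf/api/calendar/events"
    resp = session.get(url, timeout=30)
    resp.raise_for_status()
    return user, resp.json()

# A small worker pool spreads the load instead of hammering the server.
with ThreadPoolExecutor(max_workers=8) as pool:
    for user, data in pool.map(fetch_events, USERS):
        print(user, len(data.get("events", [])))
```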
Maybe my iCal freeware tool could help you:
http://abdata.ch/publish-ibm-domino-calendar-entries-in-icalendar-format/
We have a Java application which consumes the Salesforce partner.wsdl. We log in to the Salesforce instance, then we get the metadata for all the objects and cache it. As the number of Salesforce objects grows, the first call spends more and more time getting and caching that metadata.
What is the best way to reduce this time, even as more objects are introduced in Salesforce?
Is there any SOAP API call I can make to get metadata for only one object and its dependencies?
Do we have to rely only on describeSObject to get this information?
Cache the SF responses, flush the cache once a day, not with every request?
Look into the REST API, either as a complete replacement or just to take advantage of "If-Modified-Since". This header also works per object.
Experiment with queries on the EntityDefinition table to learn the names of only the objects you're interested in (you probably don't care about Apex classes, custom settings, *share and *history tables...). For example: https://stackoverflow.com/a/64276053/313628
Then yes - describe just those, using REST or SOAP's describeSObject. If you have many objects, the network roundtrips might be an issue; you'd need to debug the app to see where it spends most of its time. Combat that by requesting up to 100 objects at a time, maybe issuing multiple requests (async processing? threads?) and combining the results later.
Does it have to be the partner WSDL? You could "preload" objects in your app using the enterprise WSDL and combine some of the techniques listed above - see the sketch below.
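As a rough illustration of combining the EntityDefinition query with conditional describes, here is a Python sketch against the REST API (the instance URL, token, SOQL filter, and cache helper are all placeholders; a Java version would follow the same shape):

```python
# Sketch: trim the describe workload via EntityDefinition + If-Modified-Since.
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
TOKEN = "00D...session_token"                        # placeholder OAuth token
API = "v56.0"
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Learn the names of only the objects you care about.
soql = ("SELECT QualifiedApiName FROM EntityDefinition "
        "WHERE IsCustomizable = true LIMIT 2000")    # tune the filter
r = requests.get(f"{INSTANCE}/services/data/{API}/query/",
                 headers=headers, params={"q": soql})
r.raise_for_status()
names = [rec["QualifiedApiName"] for rec in r.json()["records"]]

# 2. Describe each one, skipping objects unchanged since the last sync.
last_sync = "Wed, 01 Mar 2023 00:00:00 GMT"          # from your own cache
for name in names:
    resp = requests.get(
        f"{INSTANCE}/services/data/{API}/sobjects/{name}/describe/",
        headers={**headers, "If-Modified-Since": last_sync})
    if resp.status_code == 304:
        continue                     # cached describe is still valid
    update_cache(name, resp.json())  # update_cache is your own helper
```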
I am asking for advice on possibly better solutions for part of the project I'm working on. I'll first give some background and then my current thoughts.
Background
Our clients can use my company's products to generate potentially large data sets for use in their industry. When the data sets are generated, the clients will file a processing request with us.
We want to send the clients a summary email which contains some statistical charts as well as sampling points from the data sets so they can do some initial quality control work. If the data sets are of bad quality, they don't need to file any request.
One problem is that the charts and sampling points can be too large to send in an email. The charts and sampling points we want to include are pictures. Although we can use a low-quality format such as JPEG to save space, we cannot control how many data sets are included in a summary email, so the total size could still exceed the normal email size limit.
In terms of technologies, we are mainly developing in Python on Ubuntu 14.04.
Goals of the Solution
In general, we want to present a report-like document that the clients can use for initial QA. The report may contain external links but does not need to be very interactive; in other words, a static report should be fine.
We want to reduce the steps our clients must take to read the report. For example, if the report can simply be an email, the user only needs to (1) log in and (2) open the email. If they use an email client, they may skip (1) and just open and read it.
We also want to minimize the burden of maintaining extra user accounts, for both us and our clients. For example, a solution that requires us to register a new user account would still be acceptable, but not ranked very high.
Security is important because our clients don't want their reports to be read by unauthorized third parties.
We want the process automated, so the solution should provide a programming interface through which we can drive the report sending/sharing.
Performance is NOT a critical issue. Our user base is not large, at most in the hundreds, and they don't generate data frequently, at most once a week. We don't need real-time responses; even a delay of a few hours is acceptable.
My Current Thoughts on a Solution
Possible solution #1: an in-house web service. I can set up a server machine and develop our own web service. We put the reports into our database, and the clients can then query them over the Internet.
Possible solution #2: Amazon Web Services. AWS is quite mature, but I'm not sure whether it would be expensive, since so far we just want to share a report with our remote clients, which doesn't seem like a big enough job to warrant AWS.
Possible solution #3: Google Drive. I know Google Drive provides an API for uploading and sharing programmatically, but I think we would need to register a dedicated Google account to use it.
Any better solutions??
You could use AWS S3 and CloudFront. Files can easily be loaded into S3 using the AWS SDKs and API. You can then use the API to generate secure links to the files that can only be opened for a specific time, and optionally only from a specific IP.
Files on S3 can also be cleaned up automatically after a specific time, if needed, using lifecycle rules.
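For example, a lifecycle rule that expires report objects after 30 days might look like this with Python and boto3 (the bucket name, prefix, and retention period are placeholders):

```python
# Sketch: expire uploaded reports automatically after 30 days.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-client-reports",            # placeholder bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-old-reports",
            "Filter": {"Prefix": "reports/"},   # only objects under reports/
            "Status": "Enabled",
            "Expiration": {"Days": 30},         # delete after 30 days
        }]
    },
)
```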
Storage and transfer prices are fairly cheap with AWS, and remember that the S3 storage cost is quoted per month, so if you only keep an object for a few days, you only pay for those days.
S3: http://aws.amazon.com/s3/pricing
Cloudfront: https://aws.amazon.com/cloudfront/pricing/
Here's a list of the SDKs for AWS:
https://aws.amazon.com/tools/#sdk
Or you can use their command-line tools for Windows batch or PowerShell scripting:
https://aws.amazon.com/tools/#cli
Here's some info on how the private content URLs are created:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/PrivateContent.html
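As a concrete illustration of the upload-then-share flow, here is a minimal Python sketch using boto3 and an S3 presigned URL (the bucket, key, and expiry are placeholders; CloudFront signed URLs, per the link above, work similarly but require a distribution and signing key):

```python
# Sketch: upload a report, then hand out a time-limited download link.
import boto3

s3 = boto3.client("s3")
s3.upload_file("weekly_report.pdf",                    # local file
               "example-client-reports",               # placeholder bucket
               "reports/client-42/weekly_report.pdf")  # placeholder key

url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-client-reports",
            "Key": "reports/client-42/weekly_report.pdf"},
    ExpiresIn=7 * 24 * 3600,   # the link stops working after one week
)
print(url)  # email this link to the client
```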
I would suggest building this service using a mix of your #1 and #2 options: do the processing yourself, and leverage AWS S3, which is quite cheap, for transferring the data.
Example: 100 GB costs approximately $3 per month.
AWS S3 is also beneficial for disaster recovery: if anything happens to your local environment, your data will still be safe in S3.
For security, you can leverage data encryption and signed URLs in AWS S3.
We have a range of web applications here that allow users to download selected data, mainly environmental information, from a number of databases and online services. We can track users visiting web pages using tools like Piwik or Google Analytics. We also want to track the amount of resources or data that they use, possibly also applying limits to record downloads.
If this were a single-DB system, we could track the rows delivered within the DB. Here, however, we have an SOA with a range of sources and sinks. What I envisage is a service that can be messaged by other systems to register or track the amount of a resource used.
e.g. User Andrew was sent 125 MB of water quality data.
The central data metering service tracks usage messages from a variety of sources, produces reports and where appropriate applies caps or billing limits.
This service might be expanded to include processing as well as data download.
I would consider this a fairly common requirement, but I can't find much existing software for it - perhaps because I am not using the correct terminology.
So my questions:
What would you call this service - what keywords will lead me to existing systems?
What solutions already exist in this area - in particular FOSS or cloud based systems?
Could something like Google Analytics be persuaded to operate in this fashion?
It would be possible to do this with the Measurement Protocol from Google Universal Analytics, in conjunction with the User ID feature in Analytics and one or more custom dimensions.
The Measurement Protocol is a language-agnostic, vaguely REST-like protocol (insofar as you send a bunch of parameters to an endpoint) for sending tracking data to the Google servers.
User ID is a feature to recognize authenticated users across devices and multiple visits.
If the various parts of your setup send HTTP calls built to the Measurement Protocol and include the User ID to recognize the user, a value for the file size (a custom metric, rather than a dimension, if you want sums and averages), and maybe a custom dimension for the file name, you can send all this to your Analytics account and build a custom report for downloads.
Note that the User ID is an internal id used to link together visits by the same user from multiple devices - it is not something that shows up in the reports, so it will not let you report on individual users in the Analytics interface (if you want that, you need to include another id as a custom dimension, and you have to check the Google TOS for what kind of id is allowed). Plus, you'd need a dedicated data view in GA for sessions with a User ID, which will not show unauthenticated users.
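As a rough sketch, a Measurement Protocol hit for a download event might look like this in Python (the property ID is a placeholder, and the cm1/cd1 slot indexes must match whatever custom metric and dimension you actually define in the GA admin):

```python
# Sketch: report a download event to Universal Analytics
# via the Measurement Protocol.
import requests

def track_download(user_id, file_name, size_bytes):
    payload = {
        "v": "1",               # protocol version
        "tid": "UA-XXXXXX-1",   # placeholder property ID
        "uid": user_id,         # authenticated User ID
        "t": "event",           # hit type
        "ec": "download",       # event category
        "ea": "data-export",    # event action
        "el": file_name,        # event label
        "cm1": str(size_bytes), # custom metric 1: bytes delivered
        "cd1": file_name,       # custom dimension 1: file name
    }
    requests.post("https://www.google-analytics.com/collect", data=payload)

track_download("andrew", "water_quality.csv", 125 * 1024 * 1024)
```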
I would like my iOS application to check whether or not its server requests are going through, in order to register the number of users in a database based on which version of the API they're using.
I would then use these statistics to choose an appropriate time to roll out later versions of the application, and notify the users of the new release.
Would it be possible to get an example of the code required to perform these kinds of analytics?
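For illustration only, one common approach is to have the app send its API version with every request and tally the hits server-side. A minimal sketch, assuming a hypothetical Python/Flask backend and an X-API-Version header set by the iOS client:

```python
# Sketch: count requests per API version on the server.
# The Flask app, header name, and schema are all hypothetical.
import sqlite3
from flask import Flask, request

app = Flask(__name__)
db = sqlite3.connect("api_usage.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS usage "
           "(version TEXT PRIMARY KEY, hits INTEGER)")

@app.before_request
def count_version():
    # The iOS client would set this header on every request it makes.
    version = request.headers.get("X-API-Version", "unknown")
    db.execute("INSERT INTO usage (version, hits) VALUES (?, 1) "
               "ON CONFLICT(version) DO UPDATE SET hits = hits + 1",
               (version,))
    db.commit()
```

Querying the usage table then tells you how many requests (and, with an extra user-id column, how many users) are still on each API version.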
I will build a dashboard system for my apps, where a page will have several widgets that draw charts, tables and glyphs representing potentially unrelated data.
The client will be HTML5, and I can require a modern web browser.
My big problem is which backend to use for this. I want to store "tables" for use in the charts and update the widgets in real time.
For example, an invoicing widget will show how much money has been collected today. The "table" will have a row for each invoice's total:
inv = 1; total = 50
Total: 50
and the widget will draw that. When new data is pushed:
inv = 2; total = 100
Total: 150
The widget will show the running total to the end user in real time.
The data is private to each user's company. Eventually I will need to purge data that is too old (i.e., I only need to keep as much data as is necessary for the end user to properly evaluate the information - for example, only one month of invoicing totals).
I'm thinking of using something like http://www.firebase.com/ or http://pusher.com/, but I suspect they only solve the "notify in realtime" part of the equation. As far as I understand, they don't let me get past data (i.e., if the data is updated over the weekend and the user then opens his dashboard to see what happened).
Then I saw http://derbyjs.com/ and the possibility of using MongoDB.
I wonder which backend/platform would bring me closest to building this system. I have experience with Python/Django/.NET/Postgres but would accept something else if it better fits this kind of app behavior.
Firebase offers both the "notify in realtime" part that you mention and persistent data storage. Take a look at the tutorial, which walks you through building a real-time persisted chat app (the past chat messages are stored in Firebase and are sent back to the client every time you reload). You can do much more complicated things, like the real-time charts/widgets you mention, as well.
The big limitation with Firebase right now is that we're in closed beta and the data is currently unprotected (anybody can read and write your data). The security features are coming soon though.
Some other backend platforms you may want to evaluate are Meteor and Simperium. Firebase and Simperium are cloud services where your data is stored in the cloud and you don't have to manage any servers of your own, while Meteor and DerbyJS are platforms that you install and run on your own server.
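To make the persistence point concrete, here is a minimal Python sketch against Firebase's REST interface (the app URL is a placeholder; a real dashboard would use the JavaScript client for live pushes, but the same stored data is what lets a user who was away all weekend still see everything on Monday):

```python
# Sketch: push invoice rows into Firebase and read back the stored data.
import requests

BASE = "https://example-dashboard.firebaseio.com"  # placeholder app URL

# Append a new row; Firebase assigns it a unique push id.
requests.post(f"{BASE}/invoices.json", json={"inv": 2, "total": 100})

# Read back everything stored under /invoices, past weekends included.
rows = requests.get(f"{BASE}/invoices.json").json()
print(sum(row["total"] for row in rows.values()))  # running total
```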
I would recommend SignalR. It's amazing, and you can do almost anything with it. Check it out: www.signalr.net - and if you have any problems, simply go to www.jabbr.net, where you will find a very helpful community. I implemented a Facebook-style notification mechanism together with real-time monitoring and a small chat on the same web site.