Best way to use API in app that has limited use? [closed] - swift

My situation is that I am building an app (in Swift) that pulls data from an API and displays it in a table view (say, up to 500 cells). The problem is the API: it is limited to 200 calls/day and 6k/month, and one request returns 100 pieces of data, so displaying 500 cells costs 5 call credits.
I am stuck on how to use this API efficiently. Currently, each time the user refreshes the table view it costs 5 credits, so after 40 refreshes the daily cap is reached.
The only solution I have thought of is to have a script in JS/Ruby/Python that pulls the data every x minutes or x hours and saves it to a Firebase database or Firebase Cloud Storage, and then have my app pull the data from Firebase.
My other idea was to run the script on a server and pull the data from there.
Are there any simpler alternatives that I am missing?

To avoid over-consuming the quota, why not run the API calls yourself and save the results to your own DB? Create a custom API specific to your app that pulls from your own storage; this way you control the interval and frequency of how often you hit the premium API.
You can set up a job to auto-update your personal DB with the premium data every x amount of time, updating existing entries and adding new ones as you see fit, while on the client side users pull the same premium data you've already fetched. That is how I would go about it, because without that control you'll find yourself facing a major scaling issue.
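A minimal sketch of that polling job in Python, assuming the `requests` library; the endpoint, API key, cache file, and paging parameters are all hypothetical placeholders:

```python
import json
import time

import requests

API_URL = "https://api.example.com/items"  # hypothetical premium endpoint
API_KEY = "YOUR_API_KEY"                   # placeholder credential
CACHE_FILE = "cached_items.json"           # file your own backend serves to the app
PAGE_SIZE = 100                            # one request returns 100 items
TOTAL_ITEMS = 500                          # 500 cells -> 5 requests per pull
POLL_SECONDS = 60 * 60                     # hourly: 24 pulls x 5 requests = 120 calls/day,
                                           # safely under the 200/day and 6k/month caps

def pull_all_pages():
    """Fetch every page from the premium API (5 requests for 500 items)."""
    items = []
    for page in range(TOTAL_ITEMS // PAGE_SIZE):
        resp = requests.get(
            API_URL,
            params={"page": page, "limit": PAGE_SIZE},
            headers={"Authorization": f"Bearer {API_KEY}"},
            timeout=10,
        )
        resp.raise_for_status()
        items.extend(resp.json())
    return items

while True:
    try:
        data = pull_all_pages()
        with open(CACHE_FILE, "w") as f:
            json.dump(data, f)  # or push to Firebase here instead
    except requests.RequestException as e:
        print("pull failed, keeping previous cache:", e)
    time.sleep(POLL_SECONDS)
```

With an hourly interval, the app can refresh as often as users like without ever touching the premium quota.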

Related

How to sync a mobile app offline state with a remote database? [closed]

I am building a mobile app using Flutter. All user data is stored online in a MySQL database, so the app needs an internet connection for almost every user interaction (there is a backend REST API).
Users have to be able to create lists of tasks and update or delete every task and list, and so on. But from the user's perspective, needing an internet connection for every simple operation like adding or deleting a task is a bad experience. I need a way to support these operations even without a connection to the backend, and to apply the changes later when it becomes possible. What is the best practice for handling this case?
How do I keep the app behaving normally even without an internet connection, and sync all the changes the user has made with the backend once the internet is available again?
For example, when the user creates a new list, the app expects to receive the new list's object (with an id) from the backend. Later this id is used for every backend call concerning this list, like adding a task to it.
What you can do is use a state management approach such as Provider or Bloc, keep a local copy of your database (or just the lists you need) in that state, apply all changes to it while offline, and push those changes to the server once the device is connected to the internet again.
Read about Flutter state management here.
You can also check when the device is connected to the internet with the connectivity and data_connection_checker packages.
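The id problem from the question (a new list only gets its real id after a sync) is usually handled with temporary local ids plus a queue of pending operations. A language-agnostic sketch of the idea, written here in Python rather than Dart; `api` stands for a hypothetical thin client over the REST backend, and all names are illustrative:

```python
import uuid

pending_ops = []          # operations queued while offline
local_to_server_id = {}   # maps temporary local ids -> real server ids

def create_list_offline(name):
    """Create a list locally under a temporary id; queue the op for later sync."""
    temp_id = f"tmp-{uuid.uuid4()}"
    pending_ops.append({"op": "create_list", "temp_id": temp_id, "name": name})
    return temp_id

def add_task_offline(list_id, title):
    # list_id may still be a temporary id at this point; it is resolved at sync time
    pending_ops.append({"op": "add_task", "list_id": list_id, "title": title})

def sync(api):
    """Replay queued ops in order once online, resolving temp ids as the server assigns real ones."""
    while pending_ops:
        op = pending_ops.pop(0)
        if op["op"] == "create_list":
            server_id = api.create_list(op["name"])  # backend returns the real id
            local_to_server_id[op["temp_id"]] = server_id
        elif op["op"] == "add_task":
            real_id = local_to_server_id.get(op["list_id"], op["list_id"])
            api.add_task(real_id, op["title"])
```

Replaying the queue in creation order guarantees a list's real id exists before any of its tasks are sent.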

What is the best way to get the maximum TPS a server can support? [closed]

I have 20 REST APIs built using Jersey and Apache HttpClient. I want to know the maximum TPS my server can withstand, using the JMeter tool. What is the best way to achieve this kind of performance-testing goal?
First of all, build a Test Plan. I believe it should have at least 20 HTTP Request samplers to cover all your endpoints, plus an HTTP Header Manager to send the correct Content-Type header. See the Testing SOAP/REST Web Services Using JMeter article for details.
Once you have the Test Plan, run it with 1-2 virtual users to check that it does what it is supposed to do. Inspect request and response details using the View Results Tree listener. Modify the requests if needed.
Configure your Thread Group(s) so that the load increases gradually, i.e. provide a reasonable ramp-up time.
Once you're happy with the test behaviour, disable the View Results Tree listener and run your test in non-GUI mode (see the command below).
Analyze the results using, for example, the HTML Reporting Dashboard. The value that interests you lives in the Hits Per Second graph.
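For reference, the standard non-GUI invocation that also generates the HTML dashboard looks like this (the .jmx, .jtl, and report paths are placeholders):

```
jmeter -n -t testplan.jmx -l results.jtl -e -o report-folder
```

Here -n means non-GUI mode, -t names the test plan, -l the results log, and -e/-o generate the dashboard report into the given (empty) folder after the run.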

Best database for a Statistics System [closed]

I need to build a statistics system, but I don't know if MongoDB would be the best solution. The system needs to track a couple of things and then display the information. As an example of a similar thing: a site where every user who visits for the first time adds a row with information about themselves. The system needs to store the data as fast as possible and, for example, create a chart of the growth of users viewing the page with Google Chrome. Also, if a user visits again, a field in that user's existing row is updated (say a field called "Days").
The system needs to handle 200,000 new visits a day (new records), 20,000,000 repeat visits a day (updates), and 800,000,000 DB records in total. It also needs to output the data fast - for example, creating a chart of how many users visit each day from England using Google Chrome, etc.
So what would be the best DB to handle this data? Would MongoDB handle this fine?
Thanks!
MongoDB allows atomic updates and scales very well; that's exactly what it's designed for. But keep two things in mind: beware of disk space, which can run out very quickly, and if you need quick stats (region coverage, traffic sources, etc.), you have to precompute them. The fastest way is to build a simple daemon for this that keeps all the numbers in memory and saves them hourly/daily.
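For the "Days" field from the question, the atomic update is a single upsert; a minimal sketch using PyMongo, with illustrative database, collection, and field names:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
visits = client.stats.visitors  # illustrative database/collection names

def record_visit(user_id, browser, country):
    # One atomic operation: inserts the row on the first visit,
    # increments the "days" counter on every return visit.
    visits.update_one(
        {"user_id": user_id},
        {
            "$inc": {"days": 1},  # bump the return-visit counter
            "$setOnInsert": {"browser": browser, "country": country},  # set once, on insert only
        },
        upsert=True,
    )

record_visit("u123", "Chrome", "England")
```

Because the whole update is one server-side operation, 20M of these a day is mostly a question of hardware and sharding, not application logic.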
Redis is a very good choice for this, provided you have a lot of RAM or a strategy to shard the data over multiple nodes. It's good because:
it is in-memory, so you can do real-time analytics (I think bit.ly's real-time stats use it). In fact, it was originally created for that.
it is very, very fast and can do hundreds of thousands of updates a second with ease.
it has atomic operations.
it has sorted sets, which are great for time series.
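A sketch of that counter/sorted-set pattern with the redis-py client; the key names are made up for illustration:

```python
from datetime import date

import redis

r = redis.Redis()  # assumes a local Redis instance

def track_visit(country, browser):
    day = date.today().isoformat()
    pipe = r.pipeline()  # batch the updates into one atomic round trip
    pipe.incr(f"visits:{day}")                         # total visits for the day
    pipe.zincrby(f"visits:{day}:country", 1, country)  # per-country counts in a sorted set
    pipe.zincrby(f"visits:{day}:browser", 1, browser)  # per-browser counts
    pipe.execute()

track_visit("England", "Chrome")

# top 10 countries for today, highest count first
top = r.zrevrange(f"visits:{date.today().isoformat()}:country", 0, 9, withscores=True)
print(top)
```

The sorted sets double as the precomputed stats: drawing the "visits from England using Chrome per day" chart is just a series of key reads, with no scan over raw records.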
RDM Workgroup is a database management system for desktop and server environments that also offers in-memory speed.
You can also use its persistence feature, where you manage data in memory and then transfer it to disk when the application shuts down, so there is no data loss.
It is based on the network model with an intuitive interface, so it scales well and should be able to handle the large load of new visitors you are expecting.

What are useful parameters to store when tracking page views? [closed]

I want to implement a simple in-house table that tracks user page views on my website. Without targeting a specific hypothesis, what data is useful to store? Eventually I'll use it to build graphs or decision trees to learn more about our user base. The pages are static (no JavaScript).
Things I can think of:
URL accessed
HTTP refer[r]er
HTTP Accept Language
Browser-agent
Session id
User id (if logged in)
Time visited
It depends on how public your site is. If your site requires authentication you can gather more controlled statistics, because you can trace each visitor's history. If it does not require authentication, you are limited to the information provided by the server variables: HTTP_USER_AGENT, REMOTE_USER, REMOTE_ADDR, REMOTE_HOST, REMOTE_PORT, and HTTP_COOKIE.
I have implemented something like this for a non-public site; each time a user logs on, the information I store looks like:
User Key
Remote host IP
Date Logon
Last Request Datetime
Total time connected (minutes)
Last Request Minutes
Event/Action performed
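If the site happens to be served by a Python WSGI app, all of the fields discussed above are available in the request environ; a minimal logging sketch, where save_record is a placeholder for the real table insert:

```python
import time

def track_pageview(environ, session_id=None, user_id=None):
    """Collect the tracking fields discussed above from the WSGI environ."""
    record = {
        "url": environ.get("PATH_INFO"),
        "referer": environ.get("HTTP_REFERER"),  # the header name is misspelled by spec
        "accept_language": environ.get("HTTP_ACCEPT_LANGUAGE"),
        "user_agent": environ.get("HTTP_USER_AGENT"),
        "remote_addr": environ.get("REMOTE_ADDR"),
        "session_id": session_id,
        "user_id": user_id,
        "visited_at": time.time(),
    }
    save_record(record)

def save_record(record):
    print(record)  # stand-in for a real DB insert
```

The same variable names apply in any CGI-style environment, so the sketch maps directly onto the server variables listed above.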
Sounds like a good start.
I'd be inclined to store the visitor's IP address and, derived from that via a geo-IP lookup, the visitor's location.
You could also consider reverse-DNSing the IP to get an idea of the ISP your user is on; you might never use it, but then again it could be useful if you get a report of downstream caching causing problems.
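The geo-IP step needs a lookup database or service, but the reverse-DNS part is a one-liner in Python's standard library:

```python
import socket

def isp_hint(ip):
    """Reverse-DNS the visitor IP; the hostname often reveals the ISP."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return None  # no PTR record for this address

print(isp_hint("8.8.8.8"))  # e.g. "dns.google"
```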

Almost live forex currency rates [closed]

I need to get live forex exchange rates for my personal application. I know that there's no free service that has this data available for download. I've been using Yahoo Finance, but I've just found out that it has a delay of 15 minutes or so. Is there any way I could get fresher rates somewhere? Say, 5-minute old instead of 15?
Many forex brokers offer free "informers" that autoload data at intervals of seconds, so maybe there are a few that allow this data to be downloaded at longer intervals, without their informers, strictly for personal use?
TrueFX has free real-time (multiple updates per second) forex quotes, but only for a limited number of pairs: http://webrates.truefx.com/rates/connect.html?f=html
They also have free downloadable tick data for the same pairs, going back to May 2009: http://truefx.com/?page=downloads
You can get real-time quotes for a larger selection of pairs from FXCM: http://rates.fxcm.com/RatesXML
Real-time rates for about 40 currency pairs are available here: http://1forge.com/forex-data-api, e.g. https://1forge.com/forex-quotes/quotes
FXCM also has free downloadable tick data going back to 2007, but you need to create a demo account and use a COM-based Windows API called Order2Go to retrieve it.
They have promised to make the same tick data available in CSV format for free sometime this year, here: http://www.forexcodesource.com/index.php/Category:Historical_Data
Here is a list of equity/FX data providers; however, they are not free.
http://finviz.com/store/market-data-providers.ashx
If you're trying to keep everything free, then you'll probably have to hack something together.
For example, in MT4 there is a DDE hook that you can use to broadcast the quotes. You'll need a Windows box (or VM) running MT4 and an app listening to the DDE server that forwards the quotes to your Linux server via a TCP socket, or even HTTP. The lag should be less than a second if done right.
Here's the .NET library I use to receive the DDE quotes.
http://www.4xlab.net/cs/forums/136/ShowPost.aspx
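The forwarding side is MT4/Windows specific, but the receiving end on the Linux server can be a plain TCP listener; a minimal Python sketch, assuming the forwarder sends one comma-separated quote per line (the line format and port are assumptions):

```python
import socketserver

class QuoteHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # each line is assumed to look like "EURUSD,1.0834,1.0836\n"
        for line in self.rfile:
            symbol, bid, ask = line.decode().strip().split(",")
            print(symbol, bid, ask)  # store in a DB or broadcast instead

server = socketserver.TCPServer(("0.0.0.0", 5555), QuoteHandler)
server.serve_forever()
```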
Also, if you are looking for historical tick data, then this is a great source.
http://ratedata.gaincapital.com/
Download MetaTrader from any broker and write an expert advisor to log all the data you want to a file, then have another process read that file. If you really want to get fancy, you can call C functions from MT4 code; it's not that hard to write some C code that stores the data in a database instead of logging it to a file.
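A sketch of the "another process reads the file" half in Python, assuming the expert advisor appends one quote per line to a log file (the file name is a placeholder):

```python
import time

def follow(path):
    """Yield lines as they are appended to the log file, tail -f style."""
    with open(path) as f:
        f.seek(0, 2)  # jump to the end of the file; only new quotes matter
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.2)  # nothing new yet; poll again shortly
                continue
            yield line.rstrip()

for quote in follow("quotes.log"):  # hypothetical log written by the EA
    print(quote)  # parse and store in your DB here
```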