I'm in the middle of using the Uber API (mobile) to get the estimated fare for the China region.
I'm using Baidu Maps to get the coordinates for a trip from 龙居大厦 (Longju Building)
to 虹桥火车站-地铁站 (Hongqiao Railway Station metro station). So my request to get the estimated price becomes:
https://api.uber.com.cn/v1/estimates/price?start_latitude=31.263004&start_longitude=121.565063&end_latitude=31.192377&end_longitude=121.334137&server_token=R52QOQyjVVqgCqds1hVxTtyT7YSRjNZY6qP4Dcnz
Response:
[
  {
    "currency_code": "CNY",
    "display_name": "People's Uber +",
    "distance": 17.61,
    "duration": 2880,
    "estimate": "CN¥74-99",
    "high_estimate": 99,
    "localized_display_name": "People's Uber +",
    "low_estimate": 74,
    "minimum": 13,
    "product_id": "c9ded892-05bf-4efb-8056-3301bc65a3e7",
    "surge_multiplier": 1
  },
  {
    "currency_code": "CNY",
    "display_name": "Shared Ride",
    "distance": 17.61,
    "duration": 2880,
    "estimate": "CN¥75.84",
    "high_estimate": 76,
    "localized_display_name": "Shared Ride",
    "low_estimate": 75,
    "minimum": null,
    "product_id": "74d2f8af-5027-4d42-960e-bc879f8ea54b",
    "surge_multiplier": 1
  },
  ... (uberX, Uber Sedan, and UberXL entries omitted)
]
The problem is that the estimated price doesn't match the fare in the real app (screenshot link added: Uber real app fare screenshot). Can someone please help me clear up the doubts below?
1. Is my request correct for getting the estimated price?
2. Why does the fare in the Uber API response differ from the real app?
3. How does Uber calculate the single price shown with "& UP" from the high and low estimates?
Thanks in advance.
1) The difference between the two estimates can be explained by the fact that we use different signals when providing an estimate (/v1/estimates/price) vs. providing an upfront fare id (/v1/requests/estimate).
To request a fare_id (which has the most accurate fare), see the docs for /v1/requests/estimate: https://developer.uber.com/docs/riders/references/api/v1.2/requests-estimate-post
The v1 price estimates endpoint does not always reflect promotions that may be available in the local market. This is resolved in the v1.2 endpoint:
https://developer.uber.com/docs/riders/references/api/v1.2/estimates-price-get
I would recommend upgrading to v1.2 to resolve the issue with /v1/estimates/price, or, if this is for an authenticated user, providing an upfront fare with /v1.2/requests/estimate.
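For reference, a minimal sketch of the v1.2 price-estimate call in Python (assuming the requests library, a valid server token, and the standard api.uber.com host; adjust the host for your region while the .cn API remains available):

import requests

SERVER_TOKEN = "YOUR_SERVER_TOKEN"  # placeholder; use your own server token

resp = requests.get(
    "https://api.uber.com/v1.2/estimates/price",
    headers={
        "Authorization": "Token " + SERVER_TOKEN,
        "Accept-Language": "en_US",
    },
    params={
        "start_latitude": 31.263004,
        "start_longitude": 121.565063,
        "end_latitude": 31.192377,
        "end_longitude": 121.334137,
    },
)
resp.raise_for_status()

for product in resp.json()["prices"]:
    # Each product carries low_estimate/high_estimate plus a display
    # string such as "CN¥74-99".
    print(product["display_name"], product["estimate"])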
I also wanted to make sure you know that we are shutting down the .cn API on November 29th.
I'm trying to query data from the OSM Overpass API. Specifically, I'm trying to determine the count of amenities of a given type around a point (using the 'around' syntax). When running this for many locations (lat/lons) I'm running into a TooManyRequests error.
I've tried to work around it by adding sleep pauses and playing with the timeout setting and retry time, but I keep running into the same issue. I'm trying to find a way to adapt the query so that it returns just the count of amenities (of the specified type) around each point, rather than the full JSON of nodes, which is more data-intensive. My current script is as follows:
import time

import overpy

# Running Overpass query for each point
# (df is a pandas DataFrame, defined elsewhere, with city/state_name/radius_m/lat/lng columns)
results = {}
for n in range(0, 200):
    name = df.loc[n]['city']
    state = df.loc[n]['state_name']
    rad = df.loc[n]['radius_m']
    lat = df.loc[n]['lat']
    lon = df.loc[n]['lng']

    # Overpass query for amenities
    start_time = time.time()
    api = overpy.Overpass(max_retry_count=None, retry_timeout=2)
    r = api.query(f"""
        [out:json][timeout:180];
        (
          node["amenity"="charging_station"](around:{rad}, {lat}, {lon});
        );
        out;
    """)
    print("query time for " + str(name) + ", number " + str(n) + " = " + str(time.time() - start_time))
    results[name] = len(r.nodes)
    time.sleep(2)
Any help from other Overpass users is much appreciated!
Thanks
In general, you can use out count; to return a count from an Overpass API query.
It's hard to say without knowing how your data is specifically structured, but you might have better luck using area filters to look at specific cities or regions.
Here is an example that returns the count of all nodes tagged as charging station in Portland, Oregon:
/* charging stations in portland */
area[name="Oregon"]->.state;
area[name="Portland"]->.city;
(
  node["amenity"="charging_station"](area.state)(area.city);
);
out count;
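Applied to your script, a count-only query could look like the following minimal sketch (assuming the public overpass-api.de endpoint and the JSON shape of count output; overpy may not parse count-only results, so this uses the requests library directly):

import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"  # assumed public endpoint

def count_amenities(rad, lat, lon, amenity="charging_station"):
    # 'out count;' makes the server return a single count element
    # instead of the full node list, so far less data crosses the wire.
    query = f"""
    [out:json][timeout:180];
    node["amenity"="{amenity}"](around:{rad},{lat},{lon});
    out count;
    """
    resp = requests.post(OVERPASS_URL, data={"data": query})
    resp.raise_for_status()
    # The count arrives as a single element whose tags hold the totals.
    return int(resp.json()["elements"][0]["tags"]["total"])

# In your loop, in place of the overpy query:
# results[name] = count_amenities(rad, lat, lon)
# time.sleep(2)  # still worth pacing requests to avoid TooManyRequests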
I'm estimating last-mile delivery costs in a large urban network using by-route distances. I have over 8,000 customer agents and over 100 retail store agents plotted on a GIS map using lat/long coordinates. Each customer receives deliveries from its nearest store (by route). The goal is to get two distance measures in this network for each store:
d0_bar: the average distance from a store to all of its assigned customers
d1_bar: the average distance between all customers common to a single store
I've written a startup function with a simple for-each loop to assign each customer to a store based on by-route distance (customers have a parameter, "customer.pStore", of Store type). This function also adds each customer, in turn, to the store agent's collection of customers ("store.colCusts", an ArrayList with Customer-type elements).
Next, I have a function that iterates through the store agent population, calculates the two average distance measures above (d0_bar & d1_bar), and writes the results to a txt file (see code below). The code works, fortunately. However, with such a massive dataset, iterating through all customers/stores and retrieving distances via the openstreetmap.org API takes forever. It's been initializing ("Please wait...") for about 12 hours. What can I do to make this code more efficient? Or is there a better way in AnyLogic to get these two distance measures for each store in my network?
Thanks in advance.
//for each store, record all customers assigned to it
for (Store store : stores)
{
    distancesStore.print(store.storeCode + "," + store.colCusts.size() + "," + store.colCusts.size() * (store.colCusts.size() - 1) / 2 + ",");

    //calculates average distance from store j to customer nodes that belong to store j
    double sumFirstDistByStore = 0.0;
    int h = 0;
    while (h < store.colCusts.size())
    {
        sumFirstDistByStore += store.distanceByRoute(store.colCusts.get(h));
        h++;
    }
    distancesStore.print((sumFirstDistByStore / store.colCusts.size()) / 1609.34 + ",");

    //calculates average of distances between all customer nodes belonging to store j
    double custDistSumPerStore = 0.0;
    int loopLimit = store.colCusts.size();
    int i = 0;
    while (i < loopLimit - 1)
    {
        int j = 1;
        while (j < loopLimit)
        {
            custDistSumPerStore += store.colCusts.get(i).distanceByRoute(store.colCusts.get(j));
            j++;
        }
        i++;
    }
    distancesStore.print((custDistSumPerStore / (loopLimit * (loopLimit - 1) / 2)) / 1609.34);
    distancesStore.println();
}
Firstly, a few simple comments:
Have you tried timing a single distanceByRoute call? E.g., run store.distanceByRoute(store.colCusts.get(0)); just to see how long a single call takes on your system. Routing is generally pretty slow, but it would be good to know what the speed limit is.
The first simple change is to use Java parallelism. Instead of this:
for (Store store : stores)
{ ...
use this:
stores.parallelStream().forEach(store -> {
...
});
This will process the stores in parallel using the standard Java Streams API.
It also looks like the second loop, where the average distance between customers is calculated, doesn't take account of mirroring, i.e. that distance a->b is equal to b->a. Hence, for example, 4 customers require only 6 calculations: 1->2, 1->3, 1->4, 2->3, 2->4, 3->4. Whereas with 4 customers your second while loop performs 9 calculations: i=0, j in {1,2,3}; i=1, j in {1,2,3}; i=2, j in {1,2,3}, which double-counts some pairs and even includes zero-distance i==j pairs, so it seems wrong unless I am misunderstanding your intention. Starting the inner loop at int j = i + 1; instead of int j = 1; yields exactly the 6 unique pairs.
Generally, for long-running operations it is a good idea to include some traceln calls to show progress, with associated timing.
Please have a look at the above and post your results. With more information, additional performance improvements may be possible.
For asset allocation in R using PortfolioAnalytics, is there a way to set risk to a constant number and then optimize portfolio returns? For example, to keep VaR always at 5% (conservative), how do the weights of 8 assets change in the portfolio to maximize return? In contrast, how do the weights change compared to a risky (VaR = 20%) portfolio? In the PortfolioAnalytics package, we can only set minimizing risk as an objective, not set risk to a constant number. (This is different from Equal Risk Contribution.)
Is this what you are looking for?
add.objective(portfolio = base_pf, type = 'risk', name = 'VaR', target = 0.05)
or
add.objective(portfolio = base_pf, type = 'risk', name = 'VaR', target = 0.2)
I'm using the geocoder gem in Rails to get latitude & longitude for addresses and then display them in OpenStreetMap.
When I search for my old address:
>> results = Geocoder.search("1000 Mount Curve Ave E, Altadena, CA 91001")
I get:
>> results.first.coordinates
=> [34.1976645, -118.1278219]
(Screenshot: Mount Curve address discrepancy)
Those coordinates are perhaps a thousand feet off (see image). The accurate coordinates from Google Maps are [34.200503, -118.1310407].
I've tried another address, and it was much farther off, perhaps a mile (1346 E Woodbury Rd, Pasadena, CA 91104).
I've tried yet another address, and it was pretty much dead accurate (922 E Brockton Ave, Redlands, CA 92374).
Does anyone know what might be causing these inaccuracies and how to get accurate results consistently?
Because of the inaccuracies and/or limitations of OSM/Nominatim, I switched to the Google Maps service. At the top of my controller I added:
require 'google_maps_service'
And in my controller's show action I ended up with this, which yields accurate results:
source = Property.find(params[:id])
@property = PropertyDecorator.new(source)
gmaps = GoogleMapsService::Client.new(key: '[removed, put in your own key here]')
results = gmaps.geocode("#{@property.address1} #{@property.address2}")
if results[0] == nil
  @lat = 0
  @lng = 0
else
  @lat = results[0][:geometry][:location][:lat]
  @lng = results[0][:geometry][:location][:lng]
end
I've been looking at this tutorial, http://www.codediesel.com/php/reading-google-analytics-data-from-php/, but am struggling, as I only want to retrieve the total number of unique visitors, which is a metric.
It appears that you have to use a 'dimension' to get a 'metric' when using their API.
Any ideas on how I can avoid using a dimension, as all I want is the total?
Thanks
A dimension is not mandatory.
Use the metric ga:visitors, and do specify the start-date and end-date to get data for the desired range (both are required).
You will get the desired total. (P.S. The calculation of ga:visitors has been changed to return the number of unique visitors across the date range.)
$analytics = new Google_Service_Analytics($client);
// Note: this must be the numeric view (profile) ID, not the UA-XXXXXXXX-X tracking ID
$profileId = '00000000';
$yesterday = $startDate = $endDate = date("Y-m-d", time() - 60 * 60 * 24);
$metrics = 'visitors';
$results = $analytics->data_ga->get('ga:' . $profileId, $startDate, $endDate, 'ga:' . $metrics);
Requires: https://github.com/google/google-api-php-client