Amadeus Hotel Search returning empty array on some cities

I'm having an issue with my Amadeus API integration; here is an example request:
https://test.api.amadeus.com/v2/shopping/hotel-offers?cityCode=MAD&roomQuantity=1&adults=2&radius=5&radiusUnit=KM&paymentPolicy=NONE&includeClosed=false&bestRateOnly=true&view=FULL&sort=NONE
In this example, I'm trying to find hotels for the IATA code MAD, which can refer either to Barajas Airport or to Madrid itself. The request returns a 200 (OK) response, but the data is empty, as if there were no hotels.
Another example of this happening is AMS (Amsterdam or its airport).
I thought it could be a clash between cities and airports that share the same IATA code, but BCN (Barcelona and El Prat Airport) works fine.
Has anyone else faced this issue?

I just tried your example in test and it works for me:
https://test.api.amadeus.com/v2/shopping/hotel-offers?cityCode=MAD&roomQuantity=1&adults=2&radius=5&radiusUnit=KM&paymentPolicy=NONE&includeClosed=false&bestRateOnly=true&view=FULL&sort=NONE
"type": "hotel",
"hotelId": "BWMAD200",
"chainCode": "BW",
"dupeId": "700009576",
"name": "BEST WESTERN HOTEL LOS CONDES",
...
Two things to keep in mind:
You are using the test environment, where the data set is limited (though enough to prototype). If you want access to the full data set, you will have to move to production.
You are doing a hotel search, so it may simply be that no rooms were available at the moment of your request for the provided parameters (this ties back to the previous point about the limited data set). You can play with the radius and the roomQuantity to find more hotels, as in the sketch below.
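A minimal Python sketch of that suggestion, assuming self-service API keys for the test environment (the token endpoint below is the standard Amadeus client-credentials flow; the exact shape of the offers in the response may vary):

import requests

# Placeholder credentials for the Amadeus test environment.
CLIENT_ID = "<API_KEY>"
CLIENT_SECRET = "<API_SECRET>"
BASE = "https://test.api.amadeus.com"

# Obtain an OAuth2 token via the client-credentials flow.
token = requests.post(
    f"{BASE}/v1/security/oauth2/token",
    data={"grant_type": "client_credentials",
          "client_id": CLIENT_ID,
          "client_secret": CLIENT_SECRET},
).json()["access_token"]

# Widen the radius step by step until the limited test data set yields offers.
for radius in (5, 20, 50, 100):
    resp = requests.get(
        f"{BASE}/v2/shopping/hotel-offers",
        headers={"Authorization": f"Bearer {token}"},
        params={"cityCode": "MAD", "roomQuantity": 1, "adults": 2,
                "radius": radius, "radiusUnit": "KM", "bestRateOnly": True},
    )
    offers = resp.json().get("data", [])
    if offers:
        print(f"radius={radius} km -> {len(offers)} offer(s)")
        break
else:
    print("No offers at any radius; the test inventory may simply be empty.")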

I'm seeing this too.
I cannot find a combination of fields that returns any data for any of the MAD or NYC city codes.
LON is the only city code that I can get search results from, but only if I specify dates earlier than April.
Searching for MAD without dates
Query:
https://test.api.amadeus.com/v2/shopping/hotel-offers?cityCode=MAD&roomQuantity=1&adults=2&radius=5&radiusUnit=KM&paymentPolicy=NONE&includeClosed=false&bestRateOnly=true&view=FULL&sort=NONE
Results:
{
  "data": [],
  "meta": {
    "links": {
      "next": "https://test.api.amadeus.com/v2/shopping/hotel-offers?adults=2&bestRateOnly=true&cityCode=MAD&includeClosed=false&paymentPolicy=NONE&radius=5&radiusUnit=KM&roomQuantity=1&sort=NONE&view=FULL&page[offset]=H0227D1ADVO9_100"
    }
  }
}
Searching for LON in April
Query:
https://test.api.amadeus.com/v2/shopping/hotel-offers?cityCode=LON&checkInDate=2020-04-04&checkOutDate=2020-04-09&roomQuantity=1&adults=2&radius=5&radiusUnit=KM&paymentPolicy=NONE&includeClosed=false&bestRateOnly=true&view=FULL&sort=NONE
Results:
{
  "data": [],
  "meta": {
    "links": {
      "next": "https://test.api.amadeus.com/v2/shopping/hotel-offers?adults=2&bestRateOnly=true&checkInDate=2020-04-04&checkOutDate=2020-04-09&cityCode=LON&includeClosed=false&paymentPolicy=NONE&radius=5&radiusUnit=KM&roomQuantity=1&sort=NONE&view=FULL&page[offset]=IHNILC3OTZSM_100"
    }
  }
}
Searching for LON in March
Query:
https://test.api.amadeus.com/v2/shopping/hotel-offers?cityCode=LON&checkInDate=2020-03-04&checkOutDate=2020-03-09&roomQuantity=1&adults=2&radius=5&radiusUnit=KM&paymentPolicy=NONE&includeClosed=false&bestRateOnly=true&view=FULL&sort=NONE
Results:
Success
I agree that it's not very clear which searches we should expect to be successful. Is there a date limitation that I'm not aware of? Which cities are supported?
The only information I've seen about what limitations to expect from hotel searches in the test environment seems to come from your test dataset repository:
The content of Hotel Search comes directly from the hotel providers, so the content might change dynamically. For your test, use big cities like LON (London) or NYC (New-York).

How do I perform aggregate queries using SumoLogic APIs

I am trying to perform aggregate queries using SumoLogic APIs as mentioned here.
Something like:
_view = <some_view> | where sourceCategory matches \"something\" | sum(field) by sourceCategory
This works just fine in the Sumo GUI. I get a field in the result called "_sum", which gives me the desired value.
However, the same doesn't work when I do it using the Sumo APIs. If I create a job with this body:
{
  "query": "_view = <some_view> | where sourceCategory matches \"something\" | sum(field) by sourceCategory",
  "from": "start_timestamp",
  "to": "end_timestamp",
  "timeZone": "some_timezone"
}
I call the "v1/search/jobs" POST method with the above body and I do GET "v1/search/jobs/{job_id}" till the state is "DONE GATHERING RESULTS". Then I do "v1/search/jobs/{job_id}/messages". I was expecting to see aggregated values in the result, but instead I see something similar to:
{
  "fields": [
    {
      "name": "_messageid",
      "fieldType": "long",
      "keyField": false
    }, ...
  ],
  "messages": [
    {
      "map": {
        "_receipttime": "1359407350899",
        "_size": "549",
        "_sourcecategory": "service",
        "_sourceid": "1640",
        "the_field_i_mentioned": "not-aggregated-value",
        "_messagecount": "2044"
      }
    }, ...
  ]
}
Thanks for going through my question. Any advice or workarounds are appreciated. I don't really want to iterate manually through all the items and calculate the sum; I'd prefer to do it on the SumoLogic side itself. Thanks again!
Explanation
As in the user interface, in the API for log searches you get both the raw results (also referred to as messages) and the aggregate results (also referred to as records).
(Obviously, the latter are only returned if there's any aggregation in the query. In your case there is.)
Actual suggestion
Then I do "v1/search/jobs/{job_id}/messages"
Try /records instead.
See the docs for "Paging through the records found by a Search Job"
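For example, a minimal Python sketch of the whole flow (the endpoint and credentials are placeholders; adjust the host to your deployment, and note that the Search Job API is stateful, so one session is reused to keep its cookies):

import time
import requests

API = "https://api.sumologic.com/api/v1"  # placeholder; use your deployment's host

session = requests.Session()
session.auth = ("<accessId>", "<accessKey>")

# Create the search job with the aggregation query.
job = session.post(f"{API}/search/jobs", json={
    "query": '_view = <some_view> | where sourceCategory matches "something" '
             '| sum(field) by sourceCategory',
    "from": "2023-01-01T00:00:00",
    "to": "2023-01-02T00:00:00",
    "timeZone": "UTC",
}).json()
job_id = job["id"]

# Poll until the job has gathered all results.
while session.get(f"{API}/search/jobs/{job_id}").json()["state"] != "DONE GATHERING RESULTS":
    time.sleep(2)

# Fetch the aggregate *records* instead of the raw *messages*.
records = session.get(f"{API}/search/jobs/{job_id}/records",
                      params={"offset": 0, "limit": 100}).json()
for rec in records["records"]:
    print(rec["map"])  # each map should contain the "_sum" field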
Disclaimer: I am currently employed by Sumo Logic.

What data source ID to use for Google Fit REST heart rate query?

I'm trying to retrieve aggregate daily heart rate summary data using the Google Fit REST API, but I'm struggling because either I'm missing something or the documentation seems to be very incomplete. I've successfully managed to retrieve aggregate daily step count by following one of the few available examples:
Request URL
https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate
Request body
{
  "aggregateBy": [{
    "dataTypeName": "com.google.step_count.delta",
    "dataSourceId": "derived:com.google.step_count.delta:com.google.android.gms:estimated_steps"
  }],
  "bucketByTime": { "durationMillis": 86400000 },
  "startTimeMillis": 1438705622000,
  "endTimeMillis": 1439310422000
}
I can't find any example for reading heart rate, so I'm trying to modify this for heart rate. I found this list of data types, which includes com.google.heart_rate.summary, but there isn't any information on what the dataSourceId should be. I tried just omitting it, but I get this error:
no default datasource found for: com.google.heart_rate.summary
Does anybody know what I need to use for dataSourceId, or have a link to any decent documentation on data sources?
For resting heart rate, I use this:
"derived:com.google.heart_rate.bpm:com.google.android.gms:resting_heart_rate<-merge_heart_rate_bpm"
For heart rate or BPM, I use this:
"derived:com.google.heart_rate.bpm:com.google.android.gms:merge_heart_rate_bpm"
For completeness, I have included the datasources that I am using below for various readings:
DATA_SOURCE = {
    "steps": "derived:com.google.step_count.delta:com.google.android.gms:merge_step_deltas",
    "dist": "derived:com.google.distance.delta:com.google.android.gms:from_steps<-merge_step_deltas",
    "bpm": "derived:com.google.heart_rate.bpm:com.google.android.gms:merge_heart_rate_bpm",
    "rhr": "derived:com.google.heart_rate.bpm:com.google.android.gms:resting_heart_rate<-merge_heart_rate_bpm",
    "sleep": "derived:com.google.sleep.segment:com.google.android.gms:sleep_from_activity<-raw:com.google.activity.segment:com.heytap.wearable.health:stream_sleep",
    "cal": "derived:com.google.calories.expended:com.google.android.gms:from_activities",
    "move": "derived:com.google.active_minutes:com.google.android.gms:from_steps<-estimated_steps",
    "points": "derived:com.google.heart_minutes:com.google.android.gms:merge_heart_minutes",
    "weight": "derived:com.google.weight:com.google.android.gms:merge_weight"
}
Depending on the data source, it will sometimes provide an array of points. You can then choose to take the sum, mean, median, etc. of all the points in the array accordingly.
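Putting that together, a minimal Python sketch of the aggregate request for heart rate (OAuth token acquisition is omitted, and the exact shape of the returned points may vary by source; aggregated heart-rate points typically carry fpVal values):

import requests

access_token = "<OAUTH2_ACCESS_TOKEN>"  # assumed to have the fitness.heart_rate.read scope

body = {
    "aggregateBy": [{
        "dataTypeName": "com.google.heart_rate.bpm",
        "dataSourceId": "derived:com.google.heart_rate.bpm:"
                        "com.google.android.gms:merge_heart_rate_bpm",
    }],
    "bucketByTime": {"durationMillis": 86400000},  # one bucket per day
    "startTimeMillis": 1438705622000,
    "endTimeMillis": 1439310422000,
}

resp = requests.post(
    "https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate",
    headers={"Authorization": f"Bearer {access_token}"},
    json=body,
)

for bucket in resp.json().get("bucket", []):
    for dataset in bucket["dataset"]:
        for point in dataset["point"]:
            print([v.get("fpVal") for v in point["value"]])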
You can list the data sources available for a given data type, for example:
Method
GET
Request URL
https://www.googleapis.com/fitness/v1/users/me/dataSources?dataTypeName=com.google.heart_rate.summary
Depending on what you're trying to achieve, you'll probably find a source either for com.google.heart_rate.summary or com.google.heart_rate.bpm to meet your needs, including merged sources.
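For instance, a short Python sketch of that listing call (reusing the token assumption from the sketch above); each dataStreamId it prints is a candidate dataSourceId:

import requests

resp = requests.get(
    "https://www.googleapis.com/fitness/v1/users/me/dataSources",
    headers={"Authorization": f"Bearer {access_token}"},
    params={"dataTypeName": "com.google.heart_rate.summary"},
)
for src in resp.json().get("dataSource", []):
    print(src["dataStreamId"])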

Google DLP - Displaying the Region using InfoTypes.list()

After integrating the Google DLP API, ListInfoTypes() currently returns the name, description, and supported types of the infoTypes present in the infoType reference. Is it possible to also obtain the region for the infoTypes, like "Australia" or "Argentina", as a separate field?
Currently this is my output:
"name": "AUSTRALIA_MEDICARE_NUMBER",
"displayName": "Australia medicare number",
"supportedBy": [
"INSPECT"
],
"description": "A 9-digit Australian Medicare account
I need the region as well, for example Region: "Australia", for every other infoType.
I also came across locations.infoTypes.list(), but I'm not sure which location I should enter in the filter to get any values.
Looking at the REST API there doesn't appear to be identifying data that can be formally used to determine the region. If we look at the InfoTypeDescription JSON structure found here:
https://cloud.google.com/dlp/docs/reference/rest/v2/ListInfoTypesResponse#InfoTypeDescription
we see that "name" is described as an "internal name of the InfoType". I wondered if we could depend on the structure of the string ... perhaps (.*)_.* as a regular expression grouping. While this might work, it shouldn't be relied upon without investigating more samples, and the docs don't describe the structure.
If you really need a solution, my recommendation would be to dump ALL the InfoTypes and then manually group the "name" fields into the regions of interest to you. You could then store this as CSV or JSON and have a reference piece of data that you could use in your app and regenerate as needed.
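A minimal Python sketch of that dump-and-group approach (the region list here is hand-maintained and hypothetical, since, as noted, the name structure isn't guaranteed):

import requests

access_token = "<OAUTH2_ACCESS_TOKEN>"  # assumed to carry a cloud-platform scope

# Dump all infoTypes via the same infoTypes.list endpoint referenced above.
resp = requests.get(
    "https://dlp.googleapis.com/v2/infoTypes",
    headers={"Authorization": f"Bearer {access_token}"},
)

# Hypothetical grouping: bucket names by their first underscore-delimited
# token and keep only tokens that look like region prefixes.
KNOWN_REGIONS = {"AUSTRALIA", "ARGENTINA", "BRAZIL", "CANADA", "FRANCE",
                 "GERMANY", "INDIA", "JAPAN", "UK", "US"}  # extend as needed

by_region = {}
for info in resp.json().get("infoTypes", []):
    prefix = info["name"].split("_", 1)[0]
    region = prefix if prefix in KNOWN_REGIONS else "GLOBAL/OTHER"
    by_region.setdefault(region, []).append(info["name"])

for region, names in sorted(by_region.items()):
    print(region, len(names))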
It's a great feature request that I'll forward to the team. In the short term, you can parse the name: infoTypes that are regional include the region in their name.

Can you list multiple features within the same Schema.org "LocationFeatureSpecification"?

I am working on Schema.org Resort schema for a ton of resorts on a travel website and am trying to find the most efficient ways of filling out the schema with regards to amenities.
The current code looks something like this:
"amenityFeature": [
{
"#type":"http://schema.org/LocationFeatureSpecification",
"name":"Spa",
"value":"true"
},
{
"#type":"http://schema.org/LocationFeatureSpecification",
"name":"Internet Access",
"value":"true"
},
{
"#type":"http://schema.org/LocationFeatureSpecification",
"name":"Tennis Courts",
"value":"true"
}
]
My question is, can I write it like this instead to shorten lines of code:
{
  "@type": "http://schema.org/LocationFeatureSpecification",
  "name": [
    "Spa", "Internet Access", "Tennis Courts"
  ],
  "value": "true"
}
When I test it in Google’s Structured Data Testing Tool, it doesn’t give any errors either way. (Screenshots of the short and long versions omitted.)
If I do it the short way, I want to make sure all those items are getting listed as amenities and not just different names for the same amenity. Otherwise, I'll go the long route.
No, each LocationFeatureSpecification represents one feature:
Specifies a location feature by providing a structured value representing a feature of an accommodation as a property-value pair of varying degrees of formality.
Your second snippet would represent one feature with multiple names.

Bing Maps Ignores Spatial Filter?

I have a good grasp on Bing's REST service, but, I'm really stumped on this one.
What I'm attempting to do is get a grocery store ($filter=5400) within a polygon located in a Florida census tract ($spatialFilter), but the results are from Massachusetts!
The URL is (I didn't supply a Bing key for obvious reasons :-)
http://spatial.virtualearth.net/REST/v1/data/f22876ec257b474b82fe2ffcb8393150/NavteqNA/NavteqPOIs?$format=json&$top=1&$filter=EntityTypeID%20Eq%20%275400%27&$spatialFilter=intersection(POLYGON%20((-81.190439%2028.590798999999997,%20-81.193080999999992%2028.590759,%20-81.196646%2028.590698999999997,%20-81.198315999999991%2028.590671,%20-81.204715%2028.590566,%20-81.204828999999989%2028.590767,%20-81.20603899999999%2028.592836,%20-81.206306%2028.593291999999998,%20-81.206443999999991%2028.593528,%20-81.207657%2028.593486,%20-81.207929%2028.595012999999998,%20-81.20795%2028.594935,%20-81.207956%2028.594918,%20-81.208027%2028.594707,%20-81.208052999999992%2028.594631999999997,%20-81.20811599999999%2028.594452,%20-81.208207%2028.594196999999998,%20-81.208302%2028.593913999999998,%20-81.208364%2028.593733999999998,%20-81.208396999999991%2028.593638,%20-81.208413999999991%2028.593586,%20-81.208429999999993%2028.593541,%20-81.208523%2028.593269,%20-81.208565%2028.593144,%20-81.208615999999992%2028.592997,%20-81.208655999999991%2028.592879,%20-81.208713%2028.592713,%20-81.20877%2028.592523999999997,%20-81.208806%2028.592405,%20-81.208844%2028.592271999999998,%20-81.208923%2028.592004,%20-81.208951%2028.591872,%20-81.208981%2028.591738,%20-81.209%2028.591641,%20-81.209008%2028.591566999999998,%20-81.209032999999991%2028.591364,%20-81.209049999999991%2028.59114,%20-81.209049%2028.591048999999998,%20-81.209049%2028.590875999999998,%20-81.209042%2028.590608,%20-81.209042%2028.590595,%20-81.209027999999989%2028.590414,%20-81.208998999999991%2028.590194,%20-81.20894%2028.589881,%20-81.208924%2028.589817,%20-81.20886%2028.589558,%20-81.208777%2028.589311,%20-81.208744%2028.589212999999997,%20-81.208588999999989%2028.588699,%20-81.208544%2028.588565,%20-81.208461%2028.588319,%20-81.208423%2028.588206999999997,%20-81.208311%2028.587871999999997,%20-81.208274%2028.587761,%20-81.208201%2028.587557999999998,%20-81.208074%2028.587204,%20-81.207997999999989%2028.586944,%20-81.207973%2028.5868559999999&key=<BING_KEY>
What I'm getting back is not what I'd expect:
{
  "d": {
    "results": [
      {
        "__metadata": {
          "uri": "https://spatial.virtualearth.net/REST/v1/data/f22876ec257b474b82fe2ffcb8393150/NavteqNA/NavteqPOIs('1001002038')"
        },
        "EntityID": "1001002038",
        "Name": "Nosso Brazil",
        "DisplayName": "Nosso Brazil",
        "AddressLine": "25 Boston Post Rd",
        "Locality": "Marlborough",
        "AdminDistrict2": "Middlesex",
        "AdminDistrict": "Massachusetts",
        "PostalCode": "01752",
        "CountryRegion": "USA",
        "Latitude": 42.35173,
        "Longitude": -71.52983,
        "Phone": "508-3032424",
        "EntityTypeID": "5400"
      }
    ]
  }
}
From my estimation, Bing is returning the first grocery store (EntityTypeID 5400) it finds and completely ignoring the $spatialFilter parameter. Can anyone determine how to return something other than this result, i.e. a grocery store within the defined polygon in Florida?
There are a bunch of issues with your query URL.
The first issue is that the spatial filter shouldn't start with $. As such the URL is falling back on the standard filter and grabbing the first result in the world that matches that filter value.
The second issue is that the spatial filter is not supported on the NavteqNA, NavteqEU, and FourthCoffeeSample data sources. The reason for this is that these data sources are significantly larger than the largest custom data source, and performing these types of complex queries on such large data sources would be really slow, so this type of query has been disabled for them. This is also why the Query URL samples in the documentation don't use these data sources.
The third issue is that the polygon string is incomplete. It appears that once a Bing Maps key is added to that URL, the total length of the URL reaches 2,083 characters, which is the limit supported by some browsers. This is likely why your polygon text is cut off. A couple of tips to help prevent this: reduce the number of decimal places in your string. Five decimal places gives an accuracy of roughly a meter, which is likely accurate enough for your application. Some of your numbers have 15 decimal places, so this is potentially 10 characters per number that you could eliminate.
If you have your well known text for the polygon already in code, you can use a simple regular expression to find and replace this. Use the following pattern:
([0-9]\.[0-9]{5})([0-9]+)
and replace it with
$1
This will remove all digits after 5 decimal places. You can further optimize the URL by removing the spaces after the commas, as they are not needed. By doing these two things you could cut the length of the URL roughly in half.
Since the polygon is cut off, the well-known text is invalid. To be valid, the polygon must end with the same coordinate it starts with.
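A minimal Python sketch of that cleanup, assuming the full polygon WKT is available in code (the short polygon below is a placeholder):

import re

# Placeholder ring using a few coordinates from the query above.
wkt = ("POLYGON ((-81.190439 28.590798999999997, "
       "-81.193080999999992 28.590759, -81.196646 28.590698999999997))")

# Keep at most 5 decimal places, then drop the spaces after commas.
wkt = re.sub(r"([0-9]\.[0-9]{5})[0-9]+", r"\1", wkt)
wkt = wkt.replace(", ", ",")

# A valid WKT ring must be closed: first and last coordinate equal.
coords = wkt[wkt.index("((") + 2 : wkt.rindex("))")].split(",")
if coords[0] != coords[-1]:
    coords.append(coords[0])
wkt = "POLYGON ((" + ",".join(coords) + "))"

print(wkt)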