Query to retrieve stock quote variation from a single day - REST

I'm quite new to YQL, and I've found the query to retrieve a single quote for a stock
select * from yahoo.finance.quote where symbol = "YHOO"
and another query to get the same information over a date range
select * from yahoo.finance.historicaldata where symbol = "YHOO" and startDate = "2016-09-01" and endDate = "2016-09-22"
What I could not figure out is: how can I retrieve quotes from a full day of trading?
I'm currently using the Yahoo Finance app and noticed it provides a nice chart of the intraday price variation, so I presume there is a way to achieve this.
I also tried reading the YQL tables repository, but neither of the tables I am using gives any (at least explicit) clue about how to pass an hour range.

You can retrieve the complete quotes of a day by querying the Yahoo Finance API endpoint directly (not via YQL), which returns a list in JSON format.
The end point is http://chartapi.finance.yahoo.com/instrument/1.0/$symbol/chartdata;type=$type;range=$range/json/, where:
$symbol is the stock ticker symbol, e.g. AAPL for Apple or BAS.DE for BASF traded at Xetra
$type is the type of the query; you can query for quote, sma, close, volume
$range is the desired range of latest trading days: 1d, 5d, 10d, 15d
An example query would be
http://chartapi.finance.yahoo.com/instrument/1.0/aapl/chartdata;type=quote;range=1d/json/
which gives you all quotes for AAPL from the last trading day.
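For illustration, here is a tiny Python sketch that just assembles that URL from the three parameters; note that this endpoint has since been retired (see the update below), so treat it as historical:

def chartapi_url(symbol: str, type_: str = "quote", range_: str = "1d") -> str:
    # Build the (now retired) chartapi URL from symbol, type and range.
    return (f"http://chartapi.finance.yahoo.com/instrument/1.0/{symbol}"
            f"/chartdata;type={type_};range={range_}/json/")

print(chartapi_url("aapl"))  # reproduces the example query above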
As far as I know, you can only query for the quotes up to the last 15 days. I have not yet found a way to query for some other day further in the past.
Just my self-centric hint: check out my PHP package YahooFinanceQuery on GitHub, which implements the above query and parses the returned JSON to filter the results.

As an update/extension to my previous answer I found a new API endpoint to download daily quotes.
Yahoo changed their API endpoints in early 2017.
The new endpoint is:
https://query1.finance.yahoo.com/v8/finance/chart/{$symbol}?range={$range}&interval={$interval}, where:
$symbol is the stock ticker symbol, e.g. AAPL for Apple
$range is the desired range of the query, allowed parameters are [1d, 5d, 1mo, 3mo, 6mo, 1y, 2y, 5y, 10y, ytd, max]
$interval is the desired interval of the quote, e.g. every 5 minutes, allowed parameters are [1m, 2m, 5m, 15m, 30m, 60m, 90m, 1h, 1d, 5d, 1wk, 1mo, 3mo]
An example would be: https://query1.finance.yahoo.com/v8/finance/chart/AAPL?range=10d&interval=1m, where you receive OHLCV quotes for the AAPL stock from the last 10 trading days at a 1-minute interval. All in a nice JSON format.
Not all $range parameters will return results at the specified $interval; the API returns the nearest possible combination instead. For example, the "max" range will return all quotes at a "1mo" interval.
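If it helps, here is a minimal Python sketch (using the requests package) that calls the endpoint and walks the JSON. The chart.result[0] / timestamp / indicators.quote[0] layout shown below is what the endpoint commonly returns, but treat it as an assumption and inspect the actual response:

import requests

symbol, range_, interval = "AAPL", "10d", "1m"
url = f"https://query1.finance.yahoo.com/v8/finance/chart/{symbol}"
resp = requests.get(
    url,
    params={"range": range_, "interval": interval},
    headers={"User-Agent": "Mozilla/5.0"},  # the endpoint tends to reject requests without a UA
    timeout=10,
)
resp.raise_for_status()

result = resp.json()["chart"]["result"][0]
timestamps = result["timestamp"]          # Unix timestamps, one per bar
quote = result["indicators"]["quote"][0]  # parallel open/high/low/close/volume arrays

for ts, open_, close in zip(timestamps, quote["open"], quote["close"]):
    print(ts, open_, close)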

Related

Powerapps Filter Collection By Today's Date

Good day all,
I am trying to filter today's results from a SQL table into a collection in Power Apps. The column "dt" is a column of datetime type in SQL.
This is my powerapps filter:
ClearCollect(myCollectionName, Filter(myDatasource, Text(dt,"dd/mm/yyyy") = Text(Now(),"dd/mm/yyyy" )));
It seems like the collection is still empty even though there is data for today in SQL. May I know if my approach is the correct way to filter?
Short answer: the data is likely being shifted based on the client time zone. To fix it, you can apply the time zone offset to the values coming from the SQL table, something along the lines of:
ClearCollect(
    myCollectionName,
    Filter(
        myDatasource,
        Text(DateAdd(dt, TimeZoneOffset(dt), Minutes), "dd/mm/yyyy") =
            Text(Now(), "dd/mm/yyyy")))
Long(er) answer: the datetime type in SQL Server represents an absolute value of date and time. For example, the value '2021-12-23 09:30:00' represents 9:30 in the morning of the 23rd day of December, 2021, in any part of the world. The date/time type in Power Apps, however, represents a point in time, typically referring to the local time where the app is being executed (or created). For example, if I selected that value while in the US Pacific Time Zone (UTC-08:00), it would represent the same point in time as someone in London (UTC+00:00) selecting 2021-12-23 17:30:00. Since the two types represent different concepts, you can get mismatches like the one you are facing. To fix this, you can either use a type in SQL Server that has the same semantics as Power Apps (for example, 'datetimeoffset'), or adjust the value when it is transferred between SQL and Power Apps.
The blog post at https://powerapps.microsoft.com/en-us/blog/working-with-datetime-values-in-sql explains in more detail how to work with date/time values in SQL and Power Apps.
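Not Power Apps, but a quick Python sketch of the underlying mismatch may help; it shows how a UTC timestamp shortly after midnight still falls on the previous calendar day for a user in the US Pacific time zone, which is exactly why a text comparison of the two dates can fail:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A row stamped shortly after midnight UTC on the 23rd...
stored_utc = datetime(2021, 12, 23, 5, 0, tzinfo=timezone.utc)

# ...is still the 22nd for a user in the US Pacific time zone.
local = stored_utc.astimezone(ZoneInfo("America/Los_Angeles"))
print(stored_utc.date())  # 2021-12-23
print(local.date())       # 2021-12-22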

Using REST to filter Share Point date field

I need to filter by the date on which an item was received in a SharePoint list, using REST. I have it partially working, but I have a problem. I'm using a Datepicker to select the date to filter on, but when the date is picked and converted to an acceptable format using toISOString(), I get an "exact" date (e.g. 2018-06-15T00:00:00.000Z). The issue is that the date portion (2018-06-15) matches OK, but not the time portion, because the datepicker will always give a different time (HH:MM:SS) than the SharePoint list entry.
This is what I'm using:
... $filter=Date_Received eq datetime'2018-05-11T04:00:00.000Z'
I have several items that were entered on 6/15/2018, but I get no results, because the time they were entered was different (e.g. 2018-06-15T05:00:00Z). Is there a way to use something like substringof to filter dates, or does anyone have a workaround?
Thanks in advance.
Dates in SharePoint are stored in GMT. So the times recorded are London times.
Option 1 - Use ranges:
$filter=Date_Received gt datetime'2018-05-10T20:00:00.000Z' and Date_Received lt datetime'2018-05-11T20:00:00.000Z'
If you are not looking for files near midnight then:
$filter=Date_Received gt datetime'2018-05-11T00:00:00.000Z' and Date_Received lt datetime'2018-05-11T23:59:59.000Z'
Option 2 - Use the SharePoint 2010 REST API and date functions:
/sites/yourSite/_vti_bin/listdata.svc/yourList?$filter=year(Date_Received) eq 2018 and month(Date_Received) eq 5 and day(Date_Received) eq 11
This still has an issue for dates around midnight due to GMT.
Option 3 - Use SharePoint 2010 REST API with a calculated column:
Add a Calculated column named Date_Received_Text as: =TEXT(Date_Received,"yyyy-mm-dd")
/sites/yourSite/_vti_bin/listdata.svc/yourList?$filter=Date_Received_Text eq '2018-07-08'
Same midnight issue...
Note: The 2010 API still works in SharePoint Online.
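In case it helps to see Option 1 wired into an actual request, here is a minimal Python sketch using the requests package (shown with ge/le so the boundary times are included). The site URL and list name are placeholders, and authentication is assumed to already be handled (cookies, OAuth token, etc., depending on your environment):

import requests

site = "https://yourtenant.sharepoint.com/sites/yourSite"  # placeholder
day = "2018-06-15"

# Bracket the whole day instead of matching a single instant.
url = (
    f"{site}/_api/web/lists/getbytitle('yourList')/items"
    f"?$filter=Date_Received ge datetime'{day}T00:00:00.000Z' "
    f"and Date_Received le datetime'{day}T23:59:59.000Z'"
)

session = requests.Session()  # assumed to carry valid credentials
response = session.get(url, headers={"Accept": "application/json;odata=verbose"})
response.raise_for_status()
items = response.json()["d"]["results"]
print(len(items), "items received on", day)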

Elasticsearch: queries with dynamic "Date range" filter, using value of a date field, with Kibana

I need to visualize the following, on the data of an ELK index, where one row equals one login on a website:
Given a keyword field userID, and a date field date, I would like to see "how many times user userID has logged in, between the date of the login, and x days before".
To solve the second half of the visualization, I need an aggregation that lets me display the count over a "date range", except that I use the dynamic value of date.
Using a "date histogram" does not quite do the job. I need this extract for machine-learning purposes, and the algorithm needs the count not as a daily or weekly histogram, but as a count from the exact date, for each row.
Do you have advice or examples you could give me, regarding this query?

Connecting BigQuery and Google Sheets - DATE parameter issue

Following 1, I started creating a spreadsheet that reads data from BigQuery, but I'm having an issue handling parameters related to date values.
In the first sheet, I created 2 cells with 2 parameters, the start and the end of a date interval, with proper values. Both cells are formatted as "Date" value.
In the second sheet I configured the BigQuery connector. For this example I'm using a public dataset with dates: bigquery-public-data.utility_eu.date_greg
From the BigQuery connector wizard I added:
"STARTDATE" as "PARAMETERS!B1"
"ENDDATE" as "PARAMETERS!B2"
After this configuration, this is the resulting query:
SELECT
date,
date_str,
date_int
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE date > DATE(#STARTDATE) AND date < DATE(#ENDDATE)
LIMIT 10
I'm getting an error directly from the editor with this message:
> Error BigQuery: No matching signature for function DATE for argument types: INT64. Supported signatures: DATE(TIMESTAMP, [STRING]); DATE(DATETIME); DATE(INT64, INT64, INT64) at [8:14]
As far as I can understand, the "date" cells are retrieved as a number, so the direct parse is not working. After a couple of tests, I understood that the given int value is the number I obtain if I change the cell format to "number".
If you convert the cell value from DATE to NUMBER you get these values:
01/05/2019 -> 43586
31/05/2019 -> 43616
What is this number? It is not milliseconds, and it increases by 1 every day. In order to create the proper query that can parse this int, I need to understand what this int is (of course I could treat the cell as "text" and write the timestamp value directly, but I would prefer to keep the native date format so I can use the built-in calendar).
My guess (with simple math) is that this number is the number of days since 30/12/1899, but it is very odd (also, every date BEFORE that day is always 0), so I'm asking you directly how to handle this value. Based on my understanding of when the counter starts (30/12/1899), I created this query, which adds the number retrieved from the cell:
SELECT *
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE
date >= DATE_ADD(DATE("1899-12-30"), INTERVAL #DATAINIZIO DAY)
AND date <= DATE_ADD(DATE("1899-12-30"), INTERVAL #DATAFINE DAY)
It is working... but I think I'm doing a workaround that is not the proper way of doing this.
Also, is there any full documentation for this BigQuery connection provided by Spreadsheets? Besides the presentation in 1, I'm unable to find any specific documentation.
Spreadsheets (Google, Excel, ...) store dates as the number of days elapsed since a starting date, with a fractional part representing the time of day.
From here: "Excel stores dates and times as a number representing the number of days since 1900-Jan-0, plus a fractional portion of a 24 hour day: ddddd.tttttt . This is called a serial date, or serial date-time."
Now, you have two ways to filter by date in your query:
In the query, you can use DATE_ADD to add your number of days (the cell value) to the base date. (Careful: DATE_ADD takes an INT, and the cell value is a float, so it needs a prior CAST.)
(preferred) In your spreadsheet, use TEXT(cell, "yyyy-mm-dd") so you can then use DATE() in the BigQuery query.
I use the second method: though it needs an extra cell (unless you store the date directly as YYYY-MM-DD), it keeps the query cleaner than having a CAST and DATE_ADD in there. It also saves you from the "1904 problem" explained in the link above.
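As a quick illustration of the serial-number arithmetic (a minimal Python sketch, assuming the 1899-12-30 base date the question already worked out):

from datetime import date, timedelta

SERIAL_BASE = date(1899, 12, 30)  # day 0 of the spreadsheet serial calendar

def serial_to_date(serial: int) -> date:
    # Convert a spreadsheet date serial number to a calendar date.
    return SERIAL_BASE + timedelta(days=serial)

def date_to_serial(d: date) -> int:
    # Convert a calendar date back to a spreadsheet serial number.
    return (d - SERIAL_BASE).days

print(serial_to_date(43586))              # 2019-05-01
print(date_to_serial(date(2019, 5, 31)))  # 43616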
What is this number? It is not milliseconds, it increases by 1 every next day.
This is the so-called serial number, which represents the number of days since the "very beginning".
Google Spreadsheets' date calendar starts from 1900-01-01, which is treated as the "very beginning".
In order to create the proper query that can parse this int, I need to understand what is this int
Armed with the above info, you can adjust your date calculations to be in sync with what BigQuery expects.
You mentioned that your fields are already in Date format; maybe you are doing extra parsing in your query.
Try it without the DATE functions.
Also, I found this other doc; it is not specifically about the connection, but it might be helpful: Getting info from Spreadsheets with BigQuery.

How to handle dates in neo4j

I'm a historian of medieval history, and I'm trying to code networks between kings, dukes, popes, etc. over a period of about 50 years (from 1220 to 1270) in medieval Germany. As I'm not a specialist in graph databases, I'm looking for a way to handle dates and date ranges.
Is there any way to attach a date range to an edge, so that the edge, which represents a relationship, disappears after e.g. 3 years?
Is there any way to query for relationships whose date tag falls within a given date range?
The common way to deal with dates in Neo4j is storing them either as a string representation or as milliseconds since the epoch (i.e. milliseconds passed since Jan 01 1970).
The first approach makes the graph more easily readable; the latter allows you to do math, e.g. calculate deltas.
In your case I'd store two properties called validFrom and validTo on the relationships. Your queries then need to make sure you're looking at the correct time interval.
E.g. to find the king(s) in charge of France from Jan 01 1220 to Dec 31st 1221 you do:
MATCH (c:Country{name:'France'})-[r:HAS_KING]->(king)
WHERE r.validFrom >= -23667123600000 and r.validTo <= -23604051600000
RETURN king, r.validFrom, r.validTo
Addendum: since Neo4j 3.0 there's the APOC library, which provides a couple of functions for converting timestamps to/from human-readable date strings.
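If you go the millis-since-epoch route, here is a small Python sketch for turning calendar dates into the validFrom/validTo values used in the query above (the dates are treated as proleptic Gregorian, midnight UTC, so the numbers will differ slightly from the ones in the example, which appear to include a time zone offset):

from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def to_millis(year: int, month: int, day: int) -> int:
    # Epoch milliseconds for midnight UTC of the given (proleptic Gregorian) date.
    dt = datetime(year, month, day, tzinfo=timezone.utc)
    return int((dt - EPOCH).total_seconds() * 1000)

valid_from = to_millis(1220, 1, 1)
valid_to = to_millis(1221, 12, 31)
print(valid_from, valid_to)  # roughly the -23667... / -23604... values used above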
You can also store the dates in their number representation in the following format: YYYYMMDD
In your case 12200101 would be Jan 1st 1220 and 12701231 would be Dec 31st 1270.
It's a useful and readable format and you can perform range searches like:
MATCH (h:HistoricEvent)
WHERE h.date >= 12200101 AND h.date < 12701231
RETURN h
It would also let you order by dates, if you need to.
As of Neo4j 3.4, the system handles durations and dates; see the official documentation. See more examples here.
An example related to the original question: retrieve the historic events that happened in the last 30 days from now:
WITH duration({days: 30}) AS duration
MATCH (h:HistoricEvent)
WHERE date() - duration < date(h.date)
RETURN h
Another option for dates that keeps the number of nodes/properties you create fairly low is a linked list of years (earliest year of interest to latest year), one of months (1-12), and one of days in a month (1-31). Every "event" in your graph can then be connected to a year, a month, and a day. This way you don't have to create a new node for every new combination of year, month, and day: you just have a single set of years, one of months, and one of days. I scale the numbers to make manipulating them easier, like so:
Years are yyyy*10000
Months are mm*100
Days are dd
so if you run a query such as
match (event)-[:happened]->(t:time)
with event,sum(t.num) as date
return event.name,date
order by date
You will get a list of all events in chronological order, with dates like January 17th, 1904 appearing as 19040117 (yyyymmdd format).
Further, since these are linked lists where, for example,
...-(t0:time {num:19040000})-[:precedes]->(t1:time {num:19050000})-...
ordering is built into the nodes too.
This is, so far, how I have liked to do my event dating.