Parse Static Google Transit Data? - swift

Does anyone have any recommendations on how I could parse this static google transit data (in swift)?
http://web.mta.info/developers/data/nyct/subway/google_transit.zip
It provides data for a number of NYC Subway lines that aren't yet tracked in real time; however, I am having difficulty seeing how I would process this data (as in, iterating through the arrivals of each train at a station).
Any recommendations?
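The GTFS feed in that zip is just a set of CSV text files; the arrivals you're after live in stop_times.txt (joined to trips.txt and stops.txt via trip_id and stop_id). The question asks for Swift, but the parsing logic is language-agnostic; here is a minimal sketch in Python with invented sample rows (the column names come from the GTFS spec, the values are made up):

```python
import csv
import io
from collections import defaultdict

# A few invented rows in the shape of GTFS stop_times.txt
# (field names per the GTFS spec; the values are made up).
stop_times_csv = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
A20111204SAT_0,00:06:00,00:06:00,R01S,1
A20111204SAT_0,00:08:30,00:08:30,R03S,2
A20111204SAT_1,00:12:00,00:12:00,R01S,1
"""

def arrivals_by_stop(csv_text):
    """Group (arrival_time, trip_id) pairs by stop_id, sorted by time."""
    arrivals = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        arrivals[row["stop_id"]].append((row["arrival_time"], row["trip_id"]))
    for stop in arrivals:
        arrivals[stop].sort()  # GTFS times are zero-padded, so string sort works
    return dict(arrivals)

arrivals = arrivals_by_stop(stop_times_csv)
print(arrivals["R01S"])  # arrivals at stop R01S in time order
```

The same shape carries over to Swift: read each file's text, split it into lines, and build a dictionary keyed by stop ID whose values are sorted arrays of (arrival, trip) pairs.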

Related

Firebase analytics - Unity - time spent on a level

Is there any way to get the exact time spent on a certain level in a game via Firebase Analytics? Thank you so much 🙏
I tried to use logEvents.
The best way to do so would be to measure the time on the level within your codebase, then have a dedicated event for level completion, in which you would pass the time spent on the level.
Let's get into the details. I will use Kotlin as an example, but it should be obvious what I'm doing here, and you can see more language examples here.
firebaseAnalytics.setUserProperty("user_id", userId)
firebaseAnalytics.logEvent("level_completed") {
    param("name", levelName)
    param("difficulty", difficulty)
    param("subscription_status", subscriptionStatus)
    param("minutes", minutesSpentOnLevel)
    param("score", score)
}
Now, see how I have a bunch of parameters with the event? These parameters are important, since they will allow you to conduct a more thorough and robust analysis later on and answer more questions. Like: Hey, what is the most difficult level? Do people still have trouble with it when the game difficulty is lower? How many times has this level been rage-quit or lost (for that you'd likely need a level_started event)? What about our paid players, are they having similar trouble on this level as well? How many people have rage-quit the game on this level and never played again? That last one would likely be easier to answer with SQL at this point, taking the latest value of the level name for level_started, grouped by user_id. Or you could have levelName as a UserProperty as well as an EventProperty; then it would be somewhat trivial to answer in the default analytics interface.
Note that you're limited in the number of event parameters you can send per event. The total number of unique parameter names is limited too, as is the number of unique event names you're allowed to have. In our case, the event name would be level_completed. See the limits here.
Because of those limitations, it's important to name your event properties in a somewhat generic way, so that you can efficiently reuse them elsewhere. For this reason, I named the property minutes and not something like minutes_spent_on_the_level. You could then reuse this property to send the minutes the player spent actively playing, minutes spent idling, minutes spent on any info page, minutes spent choosing upgrades, etc. Same idea behind having a name property rather than level_name; it could just as well be id.
You need to carefully and thoughtfully stuff your event with event properties. I normally have a wrapper around the Firebase SDK, in which I enrich events with dimensions that I always want to be there, like user_id or subscription_status, so that I don't have to add them manually every time I send an event. I also usually add some more adequate logging there, since Firebase Analytics' default logging is completely awful. I also do some sanitizing there: lowercasing all values unless I'm passing something case-sensitive like base64 values, making sure I don't have double spaces (so replacing \s+ with a single space), and maybe also adding the user's local timestamp as another parameter. The latter is very helpful for spotting time-cheating users, especially if your game is an idler.
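The wrapper idea above can be sketched roughly like this. This is a hypothetical Python sketch of the enrichment and sanitizing layer only; `sdk_send` stands in for the real Firebase SDK call, and the parameter names mirror the ones used earlier:

```python
import re
import time

class AnalyticsWrapper:
    """Hypothetical wrapper that enriches and sanitizes events before
    handing them to the real analytics SDK (represented by sdk_send)."""

    def __init__(self, sdk_send, user_id, subscription_status):
        self.sdk_send = sdk_send
        self.user_id = user_id
        self.subscription_status = subscription_status

    @staticmethod
    def _sanitize(value, case_sensitive=False):
        # Collapse runs of whitespace into single spaces.
        s = re.sub(r"\s+", " ", value).strip()
        # Lowercase unless the value is case-sensitive (e.g. base64).
        return s if case_sensitive else s.lower()

    def log_event(self, name, params=None, case_sensitive_keys=()):
        params = dict(params or {})
        # Enrich with dimensions we always want present.
        params["user_id"] = self.user_id
        params["subscription_status"] = self.subscription_status
        # The user's local timestamp helps spot time-cheating users.
        params["client_ts"] = int(time.time())
        clean = {
            k: self._sanitize(v, k in case_sensitive_keys)
            if isinstance(v, str) else v
            for k, v in params.items()
        }
        self.sdk_send(name, clean)

# Demo with a fake SDK that just records what would be sent.
sent = []
wrapper = AnalyticsWrapper(lambda n, p: sent.append((n, p)),
                           user_id="u42", subscription_status="trial")
wrapper.log_event("level_completed", {"name": "The  Dark   Cave", "minutes": 7})
```

The point of the design is that every event passes through one chokepoint, so the always-on dimensions and the sanitizing rules can never be forgotten at an individual call site.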
Good. We're halfway there :) Bear with me.
Now you need to go to Firebase and register your eps (event parameters) as cds (custom dimensions and metrics). Note that only registered eps count towards the global cd limit (it's about 50 custom dimensions and 50 custom metrics). You register the cds in the Custom Definitions section of Firebase.
Now you need to know whether this is a dimension or a metric, as well as the scope of your dimension. It's much easier than it sounds. The rule of thumb is: if you want to be able to run mathematical aggregation functions on your dimension, then it's a metric. Otherwise - it's a dimension. So:
firebaseAnalytics.setUserProperty("user_id", userId) <-- dimension
param("name", levelName) <-- dimension
param("difficulty", difficulty) <-- dimension (or can be a metric, depends)
param("subscription_status", subscriptionStatus) <-- dimension (can be a metric too, but even less likely)
param("minutes", minutesSpentOnLevel) <-- metric
param("score", score) <-- metric
Now another important thing to understand is the scope. Because Firebase and GA4 are still essentially in beta and being actively worked on, you only have user or hit scope for dimensions, and only hit scope for metrics. The scope basically indicates how the value persists. In my example, we only need user_id as a user-scoped cd. Because user_id is a user-level dimension, it is set separately from the logEvent function, although I suspect you can do it there too. Haven't tried, though.
Now, we're almost there.
Finally, you don't want to use Firebase to look at your data. It's horrible at data presentation, though it's good at debugging, since that's what it was intended for initially. Because of how horrible it is, it's always advised to link it to GA4, which will let you look at the Firebase values much more efficiently. Note that you will likely need to re-register your custom dimensions from Firebase in GA4, because GA4 can receive multiple data streams, of which Firebase would be just one data source. But GA4's cd limits are very close to Firebase's. OK, let's be frank: GA4's data model is almost exactly copied from Firebase's, but GA4 has much better analytics capabilities.
Good, you've moved to GA4. Now, GA4 is a very raw, not-officially-beta product, just like Firebase Analytics. Because of that, it's advised to first change your data retention to 12 months and to only use the Explorer for analysis, pretty much ignoring the pre-generated reports. They are just not very reliable at this point.
Finally, you may find it easier to just use SQL to get your analysis done. For that, you can easily copy your data from GA4 to a sandbox instance of BigQuery; it's very easy to do, and it is the best, most reliable known method of using GA4 at this moment. I mean, advanced analysts do the export into BQ, then ETL the data from BQ into a proper storage like Snowflake, S3, or Aurora, or whatever you prefer, and then use a proper BI tool on top of that, like Looker, Power BI, Tableau, etc. A lot of people just stay in BQ, though, and that's fine. Lots of BI tools have BQ connectors; it's just that BQ gets expensive quickly if you do a lot of analysis.
Whew, I hope you'll enjoy analyzing your game's data. Data-driven decisions rock in games. Well... They rock everywhere, to be honest.

How to identify values from characteristic data for a walk/run stride Bluetooth sensor

I am developing a workout app with sensor connectivity. I am able to read and get data from a heart-rate sensor, but for stride sensors (walk/run) I am facing a problem mapping the values given by the sensor characteristic.
How will I get speed, cadence, steps per minute, and distance?
I searched on Google but didn't find anything for this.
I am pretty sure I am getting data, but I have difficulty mapping the indices and values for the different parameters.
Check attached code snapshot for data output.
Thanks in Advance...!!
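Assuming the sensor implements the standard Running Speed and Cadence service, the notifications are the RSC Measurement characteristic (UUID 0x2A53), which the Bluetooth SIG spec defines as: a flags byte, instantaneous speed as a uint16 in units of 1/256 m/s, instantaneous cadence as a uint8 in steps per minute, then an optional uint16 stride length (1/100 m) and an optional uint32 total distance (1/10 m), all little-endian. A sketch of the decoding in Python (the byte layout is the same in any language):

```python
import struct

def parse_rsc_measurement(data: bytes):
    """Decode an RSC Measurement (0x2A53) payload per the Bluetooth SIG
    spec: flags, uint16 speed (1/256 m/s), uint8 cadence (steps/min),
    optional uint16 stride length (1/100 m), optional uint32 total
    distance (1/10 m). All fields are little-endian."""
    flags = data[0]
    speed_raw, cadence = struct.unpack_from("<HB", data, 1)
    result = {
        "speed_m_s": speed_raw / 256.0,
        "cadence_spm": cadence,                 # already steps per minute
        "running": bool(flags & 0x04),          # bit 2: walking vs running
    }
    offset = 4
    if flags & 0x01:                            # bit 0: stride length present
        (stride,) = struct.unpack_from("<H", data, offset)
        result["stride_length_m"] = stride / 100.0
        offset += 2
    if flags & 0x02:                            # bit 1: total distance present
        (dist,) = struct.unpack_from("<I", data, offset)
        result["total_distance_m"] = dist / 10.0
    return result

# Example payload: flags=0x03 (stride + distance present),
# speed 640/256 = 2.5 m/s, cadence 160 spm, stride 1.20 m, distance 123.4 m.
pkt = bytes([0x03]) + struct.pack("<HBHI", 640, 160, 120, 1234)
print(parse_rsc_measurement(pkt))
```

So steps per minute is the cadence byte directly, and distance comes from the optional total-distance field whenever its flags bit is set; the optional fields are why the index of each value shifts between packets.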
This question should not be tagged cadence, unless I'm missing something? That tag is for "A global provider of Electronic Design Automation (EDA) software and engineering services."
PS: I can't comment yet, please resolve this and delete this answer.

Waiting for results of asynchronous function call in matlab

So I'm trying to fetch historical stock data from IQFeed. I have a list of symbols I want to fetch data for. The problem is that the IQFeed timeseries function returns data asynchronously, so I can't just use a simple for loop to fetch all the data.
I assume there is a way to do this using an event handler, but looking at the default one, it goes way above my head.
Try using IQML (Matlab connector to IQFeed), which runs in Matlab and connects directly to IQFeed. IQML supports both blocking (synchronous snapshot) and non-blocking (asynchronous streaming) queries.
In answer to the OP question, here's an example of fetching historic IQFeed data synchronously (i.e., blocking) into Matlab using IQML:
>> data = IQML('history', 'symbol','IBM', 'dataType','day')
data =
100×1 struct array with fields:
Symbol
Datestamp
Datenum
High
Low
Open
Close
PeriodVolume
OpenInterest
>> data(1)
ans =
Symbol: 'IBM'
Datestamp: '2017-10-10'
Datenum: 736978
High: 148.95
Low: 147.65
Open: 147.71
Close: 148.5
PeriodVolume: 4032601
OpenInterest: 0
IQML supports the entire IQFeed API, including:
Both blocking (synchronous snapshot) and non-blocking (asynchronous streaming) data queries
Live Level1 top-of-book market data (quotes and trades)
Live Level2 market-depth data
Historic, intra-day and live market data (individual ticks or interval bars)
Fundamental info on assets
Options and futures chains lookup (with latest market data and Greeks)
Symbols and market codes lookup
News headlines, story-counts and complete news stories, with user-specified filters
Ability to attach user-defined Matlab callback functions to IQFeed messages and market events
User-defined custom alerts on streaming market events (news/quotes/interval-bar/regional triggers)
Connection stats and programmatic connect/disconnect
Users can combine all of the above functionality for a full-fledged end-to-end automated trading system using plain Matlab.
IQML works on all recent Matlab/IQFeed releases and platforms (Windows, Linux, Mac).
It is reliable, easy-to-use, and lightning-fast (including optional parallelization). IQML comes with a detailed User Guide packed with usage examples, sample Matlab scripts, and implementation tips.
IQML needs only the core Matlab to run - no toolboxes are required (parallelization uses the Parallel Computing Toolbox, but IQML runs well even without it).
Yair Altman
IQML.net, https://UndocumentedMatlab.com/IQML, https://github.com/altmany/IQML

How to get RAW IMU data from the Google Glass?

I am trying to get RAW acceleration and gyroscope data from the Google Glass IMU module.
I tried the ASensorManager module in the NDK, but it sometimes gives output with some weird bias adjustment. So I tried to read from the driver's virtual file system.
When I try to read in one-shot mode from /sys/bus/iio/devices/iio:device0/, I am getting the raw data, but sometimes it misses some data (compared to the sensor manager output).
When I try to read in burst mode from /sys/bus/iio/devices/iio:device0/, it shows "/dev/iio:device0: Device or resource busy"
So, is there any alternative way to read the raw data without losing any data, or can I configure the sensor manager to give raw data without any bias adjustment?
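For what it's worth, both failure modes described above are inherent to the Linux IIO interface: the one-shot sysfs files are unsynchronized polling reads, so missed samples are expected at high rates, while the buffered /dev/iio:device0 character device allows only one reader at a time, and the Android sensor HAL typically already holds it, hence "Device or resource busy". A minimal sketch of the one-shot read path in Python, where the device path and channel names are assumptions that vary per driver:

```python
import os

# Typical IIO sysfs location; the actual device index and channel
# names depend on the driver, so treat these as assumptions.
IIO_DEV = "/sys/bus/iio/devices/iio:device0"

def read_channel(dev_dir, name):
    """One-shot read of an IIO channel: the raw integer count times the
    driver's scale factor gives the value in SI units (m/s^2 for accel,
    rad/s for gyro), with no HAL bias adjustment applied."""
    with open(os.path.join(dev_dir, f"in_{name}_raw")) as f:
        raw = int(f.read())
    with open(os.path.join(dev_dir, f"in_{name}_scale")) as f:
        scale = float(f.read())
    return raw * scale

# On the device you would poll like this (needs the right permissions):
# sample = {ax: read_channel(IIO_DEV, f"accel_{ax}") for ax in "xyz"}
```

To read the buffered stream without losing samples, you would generally have to stop whatever service currently holds the character device, which on Glass likely means stopping the sensor HAL itself.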

iOS: Reading voltage value from a BTLE device

I have developed my own bluetooth low energy profile in order to read voltages from a bunch of light sensors. The only issue is that on my iPhone I just cannot find the correct way of reading that data from the GATT database.
I connect fine and discover the characteristic without a problem, but just don't know how to read the data correctly.
I am working on a BLE112 and the data should be in the form of 4 bytes. The code I am currently trying to use is:
[sensorZeroCharacteristic.value getBytes:&val length:sizeof(val)];
result = (CGFloat)val;
(val is a uint32_t)
but this gives some really large (yet consistent) numbers, so what am I doing wrong?
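Large but consistent numbers usually mean the bytes are a valid raw ADC count that still needs byte-order handling and scaling, rather than garbage. On iOS you would first make sure `val` is interpreted little-endian (e.g. via `CFSwapInt32LittleToHost`) and then scale the count by the ADC reference voltage. A sketch of that interpretation in Python, where vref and the count range are placeholder assumptions that must match the actual BLE112 firmware configuration:

```python
import struct

def decode_adc(payload: bytes, vref=1.24, max_count=4095):
    """Interpret 4 bytes as a little-endian signed 32-bit ADC count,
    then scale to volts. vref and max_count are ASSUMPTIONS here:
    they must match the ADC reference and resolution configured in
    your BLE112 firmware (BGScript), which this sketch cannot know."""
    (raw,) = struct.unpack("<i", payload)
    return raw * vref / max_count

# A full-scale count should map to the reference voltage.
full_scale = decode_adc(struct.pack("<i", 4095))
```

If the numbers are still wrong after scaling, the next things to check are whether the firmware left-aligns the ADC result within its field and whether the characteristic actually packs one 32-bit value rather than two 16-bit ones.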