What does "requestTime" number mean in Chrome DevTools? - google-chrome-devtools

According to the Chrome DevTools Protocol viewer, the value of requestTime is a baseline in seconds. To understand it, I took this value from a few web pages, and for all of them the value of requestTime was unexpectedly large. For example, one of them was 13133423 seconds. Does anyone know why the value of requestTime is so large? And what does the requestTime value mean?

Quoting the source code:

We want to present a unified timeline to Javascript. Using walltime is problematic, because the clock may skew while resources load. To prevent that skew, we record a single reference walltime when root document navigation begins. All other times are recorded using monotonicallyIncreasingTime(). When a time needs to be presented to Javascript, we build a pseudo-walltime using the following equation (m_requestTime as example):

pseudo time = document wall reference + (m_requestTime - document monotonic reference)

All values from monotonicallyIncreasingTime(), in base::TimeTicks.
More info on monotonicallyIncreasingTime: https://stackoverflow.com/a/39634132
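
In other words, requestTime is an offset on the monotonic clock (often time since system boot), not a Unix timestamp, which is why values like 13133423 seconds look so large. Below is a minimal sketch, not a definitive recipe, of the conversion: it assumes you have captured the wallTime/timestamp reference pair that a Network.requestWillBeSent event reports, plus a timing object from the matching response.

// Sketch: converting a monotonic CDP time (such as timing.requestTime,
// in seconds) into an approximate wall-clock time, using the
// wall/monotonic reference pair from a Network.requestWillBeSent event.
function toWallTime(monotonicSeconds, refWallTimeSeconds, refMonotonicSeconds) {
  // pseudo time = wall reference + (monotonic time - monotonic reference)
  return refWallTimeSeconds + (monotonicSeconds - refMonotonicSeconds);
}

// Hypothetical usage: `event` is a Network.requestWillBeSent payload,
// `timing` comes from the matching Network.responseReceived response.
const wallSeconds = toWallTime(timing.requestTime, event.wallTime, event.timestamp);
console.log(new Date(wallSeconds * 1000).toISOString());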

Related

In the Web Audio API, how to obtain an array (e.g. a Float32Array) from a stream (e.g. a microphone stream) for several seconds

I would like to fill an array from a stream for around ten seconds (I wish to do some processing on the data). So far I can:
(a) obtain the microphone stream using MediaRecorder;
(b) use an analyser and analyser.getFloatTimeDomainData(dataArray) to obtain an array, but it is size-limited to only a little over half a second of data. I can also successfully output the data after processing back onto a stream and to outDestination;
(c) I have also experimented with obtaining a 'chunks' array from MediaRecorder directly, but the problem then is that I can't find any MIME type that would give me a simple array of values, i.e. an uncompressed, sample-by-sample, single-channel set of values: a longer version of 'dataArray' in (b).
I am wondering if I am missing a simple way around this problem?
Solutions I have seen tend to use step (b) and do regular polls, then reassemble a longer array; however, it seems the timing is a bit tricky.
I've also seen suggestions to use audio worklets; I might have to do this, but would prefer a simpler solution!
Or again, if someone knows how to drive MediaRecorder to output the chunks array in a simple one-channel Float32 array format, that would do the trick.
Or maybe I'm missing something simpler?
I have code showing those steps that have been successful and will upload it if anyone requests.
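
If polling the analyser proves too fiddly, one alternative is to let the audio graph hand you every buffer and accumulate them yourself. A minimal sketch of that idea, using the (deprecated but simple) ScriptProcessorNode; all names here are illustrative:

// Inside an async function: accumulate ~10 s of mono Float32 samples.
const ctx = new AudioContext();
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const source = ctx.createMediaStreamSource(stream);
const processor = ctx.createScriptProcessor(4096, 1, 1);

const chunks = [];
let collected = 0;
const target = ctx.sampleRate * 10; // roughly ten seconds of samples

processor.onaudioprocess = (e) => {
  if (collected >= target) return;
  // Copy the data: the underlying buffer is reused between callbacks
  chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)));
  collected += e.inputBuffer.length;
};

source.connect(processor);
processor.connect(ctx.destination); // keeps Chrome pulling audio; output stays silent

// After ~10 seconds, flatten the chunks into one long Float32Array:
function flatten() {
  const all = new Float32Array(collected);
  let offset = 0;
  for (const c of chunks) { all.set(c, offset); offset += c.length; }
  return all;
}

An AudioWorkletNode doing the same accumulation would be the non-deprecated route, at the cost of a separate processor module.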

Time Signature Meta Message in MIDI

I am working on a MIDI project using the mido library in Python. I see in the manual a meta message for time signature with the field notated_32nd_notes_per_beat, which has a default value of 8.
<meta message time_signature numerator=4 denominator=4 clocks_per_click=24 notated_32nd_notes_per_beat=8 time=0>
Which makes sense. However, can I define it like:
<meta message time_signature numerator=4 denominator=4 clocks_per_click=24 notated_32nd_notes_per_beat=32 time=0>
Does this increase the display resolution when shown in score/typesetting software? What is the usage of this field, please?
time_signature (0x58) meta message in MIDI files
The file header specifies the number of ticks per beat, and the tempo messages specify the length of a beat in microseconds. These values are needed to correctly play back the file.
The last field of the time signature message specifies how the tick values in the MIDI file relate to notes in a score. It does not affect at what time events are sent (so a pure playback program will ignore this message), but how notes are displayed.
For example, if the header says there are 100 ticks per beat, and the time signature has the default of 8 32nd notes per beat, then a note-on/note-off pair with a distance of 100 ticks is displayed as a quarter note. If you change the time signature to 32 32nd notes per beat, then a length of 100 ticks corresponds to a whole note.
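
To make the mapping concrete, this is the arithmetic a notation program might apply (a sketch, not any particular program's implementation):

// Length of a note, in 32nd notes, from its tick distance, the header's
// ticks per beat, and the time signature's notated_32nd_notes_per_beat.
function lengthIn32nds(ticks, ticksPerBeat, notated32ndsPerBeat) {
  return (ticks / ticksPerBeat) * notated32ndsPerBeat;
}

lengthIn32nds(100, 100, 8);  // 8/32  -> a quarter note
lengthIn32nds(100, 100, 32); // 32/32 -> a whole note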

Filtering an audio signal and then reading the meter without sending it to master

I'm trying to filter a signal and then analyse the values of the filtered signal using Tone.js / Web-Audio API.
I'm expecting to get values of the filtered signal, but I only get -Infinity, meaning that my connections between the nodes are wrong. I've made a small fiddle demonstrating this; however, in my use case I do not want to send this node to the destination of the context, since I only want to analyse the signal, not hear it.
osc.connect(filter)
filter.connect(gainNode)
gainNode.connect(meter)
console.log(meter.getLevel())
I guess you tested the code in Chrome, because there is a problem with Chrome which causes it not to process anything until it is connected to the destination. When using Tone.js, that means you need to call .toMaster() at the end of your chain. I updated your fiddle to make it work: https://jsfiddle.net/8f7abzoL/.
In Firefox calling .toMaster() is not necessary therefore the following works in Firefox as well: https://jsfiddle.net/yrjgfdtz/.
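
If you need the meter to work in Chrome without audible output, a common workaround is to keep the chain connected to the destination but end it in a zero-gain node. A sketch, assuming the older Tone.js API used in the question (.toMaster(), getLevel()):

// Analyse a filtered signal without hearing it: route the chain through
// a zero-gain node before the master output.
const osc = new Tone.Oscillator(440, 'sine').start();
const filter = new Tone.Filter(800, 'lowpass');
const meter = new Tone.Meter();
const silence = new Tone.Gain(0); // gain of 0: still processed, but inaudible

osc.connect(filter);
filter.connect(meter);
meter.connect(silence);
silence.toMaster();

setInterval(() => console.log(meter.getLevel()), 100);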
After some digging I've found out that I need to have a ScriptProcessorNode, which is apparently no longer recommended, so I'm looking into AudioWorklet nodes.

libspotify C sending zeros at the end of track

I'm using the libspotify SDK, the C library for Win32.
I think I have the right setup; every session callback is registered. I don't understand why I can't receive the call to end_of_track, while music_delivery continues to be called with zero-padded, 22050-frame-long deliveries.
I attempt to start playing by first loading the track with sp_session_load; while it returns SP_ERROR_IS_LOADING, I post a message on my message queue (the synchronization method I've used, via the PostMessage win32 API) in order to call sp_session_load again. As soon as it returns SP_ERROR_OK, I call sp_session_play and music_delivery starts immediately, with correct frames.
I don't know why, at the end of the track, the libspotify runtime then starts sending zero-padded frames instead of calling the end_of_track callback.
In other conditions it works perfectly: I used an sp_track obtained from an album browse, so the track was fully loaded at the moment I loaded it into the current session for playing; with that track it works fine, with end_of_track called correctly. In the failing case, I search for the track using its Spotify URI and get the results; here the track metadata are not yet ready at the play attempt, so I used that kind of "polling" on sp_session_load with PostMessage.
Can anybody help me?
I ran into the same problem, and I think the issue was that I was consuming the data too fast without giving other threads time to do any work, since I was spending all of my time in the music_delivery callback. I found that if I add some throttling and notify the main thread that it can wake up to do some processing, the extra zeros at the end of the track are reduced to one delivery of 22,050 frames (or 500 ms at 44.1 kHz).
Here is an example of what I added to my callback, heavily borrowed from the jukebox.c example provided with the SDK:
/* Not shown in the original snippet: count the frames consumed so far
   (num_frames is the music_delivery callback parameter) */
g_throttle += num_frames;

/* Buffer 1 second of data, then notify the main thread to do some processing */
if (g_throttle > format->sample_rate) {
    pthread_mutex_lock(&g_notify_mutex);
    g_notify_do = 1;
    pthread_cond_signal(&g_notify_cond);
    pthread_mutex_unlock(&g_notify_mutex);

    /* Reset the throttle counter */
    g_throttle = 0;
    return 0;
}
As I said, there were still 22,050 frames of zeros delivered before the track stopped, but I believe libspotify may purposely do this to ensure that the duration calculated from the number of frames received (song_duration_ms = total_frames_delivered / sample_rate * 1000) is greater than or equal to the duration reported by sp_track_duration. In my case, the track I was trying to stream was 172,000 ms in duration; without the extra padding the calculated duration is 171,796 ms, but with the padding it was 172,296 ms.
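The figures are easy to verify: at 44.1 kHz, one delivery of 22,050 zero frames is exactly 500 ms of apparent playback.

// Checking the durations quoted above (44.1 kHz stream):
const paddingMs = 22050 / 44100 * 1000;     // 500 ms per zero delivery
console.log(171796 + paddingMs >= 172000);  // true: padding covers sp_track_duration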
Hope this helps.

Editing Timeline from CCB file in cocos

I did some research into this and couldn't really find anything, so if this is a repetitive question I apologize. Anyway, I have made a CCB file in CocosBuilder and I would like to start the timeline at, for example, one second in, instead of playing from the beginning. Is there a way to do this? Thanks for the help, guys.
Edit: I would like this to be done in the code.
I am using Cocos2d-x version 2.2.1. I think there is no option to play it from a given interval, but you can tweak it yourself to get it done (it's not simple).
You have to go to CCBAnimationManager, where you get mNodeSequences. It is a dictionary where you get the different properties, like rotation, position, etc.
Internally, the AnimationManager reads these values (they are specified in your CCB) and puts them in the runAction queue. So you have to break it up as you want. For example, if you have a 5-minute timeline but want to start from 1 minute, you have to run the first minute's actions without delay, and for the remaining ones properly calculate the tween intervals.
It's a long procedure and needs calculation. If you don't know any other, simpler way, try this. If you do know one, please let us know (post it).