TensorFlow.js: Saving different model instances during training

I'm running TensorFlow.js on Node and I'd like to be able to save a model at a certain point during the training process.
I tried to just copy the actual model to a global variable, but JavaScript objects are copied by reference, so at the end the global variable holds the same model as the last training epoch.
I then tried several JavaScript methods to do a deep clone (including lodash's deep clone), but I get errors on the copied model, such as functions that end up missing (like model.evaluate).
I wonder if the only way I can save a certain checkpoint is directly using model.save(), or if there is any other way to just copy the model object (by value, not reference) to a global or a class property.
Thanks in advance!
** UPDATE **
Right now, the best solution that has worked for me is creating a copy of the model:
const copyModel = (model) => {
  const copy = tf.sequential();
  model.layers.forEach(layer => {
    copy.add(layer);
  });
  copy.compile({ loss: model.loss, optimizer: model.optimizer });
  return copy;
}
Consider that you may need to replicate some other settings from the original model to the new one (the copy).
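For instance, a checkpoint list could then be kept during training like this (a hypothetical usage on my part; checkpoints is a made-up name, not part of the original workaround):

const checkpoints = [];
// ... at whatever point in training you want a snapshot:
checkpoints.push(copyModel(model));
// ... later, evaluate a particular checkpoint:
// checkpoints[0].evaluate(xs, ys);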

A tf.Model object contains weight values, which usually live on the GPU
(as WebGL textures) and are not easily clonable. So it's not a good idea to
clone a tf.Model object. You should serialize it and save it somewhere.
There are two options:
If you are in Node.js, you should have relatively ample storage space. Just use Model.save() to "snapshot" models onto the disk; they can be loaded back later.
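For example, a minimal sketch (assuming @tensorflow/tfjs-node is loaded so the file:// handler is available; the checkpoint path is a made-up example):

// Snapshot the current state of the model to disk.
await model.save('file://./checkpoint-epoch-5');
// ... later, load the snapshot back:
const restored = await tf.loadModel('file://./checkpoint-epoch-5/model.json');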
If you prefer to avoid going through the filesystem, you can do the serialization and deserialization in memory, using the methods tf.io.withSaveHandler() and tf.io.fromMemory(). See the example below:
const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node');
(async function main() {
  const model = tf.sequential();
  model.add(tf.layers.dense({units: 1, inputShape: [3], useBias: false}));
  model.compile({loss: 'meanSquaredError', optimizer: 'sgd'});

  const xs = tf.randomUniform([4, 3]);
  const ys = tf.randomUniform([4, 1]);

  const artifactsArray = [];

  // First save, before training.
  await model.save(tf.io.withSaveHandler(artifacts => {
    artifactsArray.push(artifacts);
  }));

  // First load.
  const model2 = await tf.loadModel(tf.io.fromMemory(
      artifactsArray[0].modelTopology, artifactsArray[0].weightSpecs,
      artifactsArray[0].weightData));

  // Do some training.
  await model.fit(xs, ys, {epochs: 5});

  // Second save, after training.
  await model.save(tf.io.withSaveHandler(artifacts => {
    artifactsArray.push(artifacts);
  }));

  // Second load.
  const model3 = await tf.loadModel(tf.io.fromMemory(
      artifactsArray[1].modelTopology, artifactsArray[1].weightSpecs,
      artifactsArray[1].weightData));

  // The two loaded models should have different weight values.
  model2.getWeights()[0].print();
  model3.getWeights()[0].print();
})();
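If the goal is one snapshot per epoch, the same in-memory save can be hooked into a fit() callback. A sketch (snapshots is a made-up name):

const snapshots = [];
await model.fit(xs, ys, {
  epochs: 5,
  callbacks: {
    // After each epoch, serialize the current weights into memory.
    onEpochEnd: async () => {
      await model.save(tf.io.withSaveHandler(async (artifacts) => {
        snapshots.push(artifacts);
      }));
    }
  }
});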

Related

Migrating lowpass filter from scriptProcessor (onaudioprocess) to AudioWorkletProcessor (process)

I'm facing an issue while migrating my library from the deprecated ScriptProcessorNode to AudioWorklet.
Current implementation with ScriptProcessor
It currently uses the AudioProcessingEvent's inputBuffer property, which is an AudioBuffer.
I apply a lowpass filter to this inputBuffer via an OfflineAudioContext, then analyze the peaks (of bass frequencies) to count and compute BPM candidates.
The issue is that the lowpass filtering can't be done within the AudioWorkletProcessor (OfflineAudioContext is not defined there).
How can I apply a lowpass filter to the samples provided by the process() method of an AudioWorkletProcessor, the same way it's doable with the onaudioprocess event data? Thanks
[SOLUTION] AudioWorklet implementation
For the end user it will look like this:
import { createRealTimeBpmProcessor } from 'realtime-bpm-analyzer';
const realtimeAnalyzerNode = await createRealTimeBpmProcessor(audioContext);
// Set the source with the HTML Audio Node
const track = document.getElementById('track');
const source = audioContext.createMediaElementSource(track);
// Lowpass filter
const filter = audioContext.createBiquadFilter();
filter.type = 'lowpass';
// Connect stuff together
source.connect(filter).connect(realtimeAnalyzerNode);
source.connect(audioContext.destination);
realtimeAnalyzerNode.port.onmessage = (event) => {
  if (event.data.message === 'BPM') {
    console.log('BPM', event);
  }
  if (event.data.message === 'BPM_STABLE') {
    console.log('BPM_STABLE', event);
  }
};
You can find the full code under version 3 (currently in pre-release).
You could make sure to apply the lowpass filter to the signal before it reaches the AudioWorkletNode. Something like this should work.
const biquadFilterNode = new BiquadFilterNode(audioContext);
const audioWorkletNode = new AudioWorkletNode(
  audioContext,
  'the-name-of-your-processor'
);
yourInput
  .connect(biquadFilterNode)
  .connect(audioWorkletNode);
As a result your process() function inside the AudioWorkletProcessor gets called with the filtered signal.
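On the processor side, process() then receives the already-filtered samples. A minimal sketch of such a processor (the class name is a placeholder; 'the-name-of-your-processor' matches the name used above):

// the-name-of-your-processor.js
class FilteredSignalProcessor extends AudioWorkletProcessor {
  process(inputs, outputs, parameters) {
    // inputs[0] is the output of the BiquadFilterNode;
    // inputs[0][0] is a Float32Array of samples for the first channel.
    const samples = inputs[0][0];
    if (samples) {
      // ... run your peak detection / BPM analysis on the filtered samples ...
    }
    return true; // keep the processor alive
  }
}
registerProcessor('the-name-of-your-processor', FilteredSignalProcessor);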
However, I think your current implementation doesn't really use a lowpass filter. I might be wrong, but it looks like startRendering() is never called, which means the OfflineAudioContext is not processing any data.
If that is true you may not need the lowpass filter at all for your algorithm to work.

Ionic 4: how to show data time-wise

I need help: I have a list of messages in which there are 3 fields: message, uid and time.
It's showing the data, but in mixed order: when we enter data, some goes in the middle of the array, some at the start, some at the end.
I need to show it time-wise, meaning the latest/newest time in the array comes first (or last). I've attached the data format as well. This is simply how I am getting messages from the service:
getAllMessages() {
  this.messageService.getAllMessages().subscribe((data) => {
    this.data = data;
    console.log(this.data);
  });
}
You need to use pipe and the map operator for this (map needs to be imported), and leverage the sort method inside it:
import { map } from 'rxjs/operators';
...
getAllMessages() {
  this.messageService.getAllMessages().pipe(
    map(messages => messages.sort((a: any, b: any) => b.time.seconds - a.time.seconds))
  ).subscribe(messages => this.data = messages);
}
I assumed you already have seconds, so you do not have to transform a date object of any sort in this example. If that is not the case, you can adjust the code accordingly.
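For example, if time were a date string rather than a seconds-based timestamp, the comparator could be adjusted like this (a sketch, assuming the values parse with the Date constructor):

map(messages => messages.sort((a: any, b: any) =>
  // newest first: compare the parsed timestamps
  new Date(b.time).getTime() - new Date(a.time).getTime()
))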

Understanding merge in rxjs6 (and redux-observable specifically)

Context: I'm trying to use redux-observable with rxjs v6 to trigger multiple actions after fetching a resource.
Problem: I can trigger a single action after fetching the resource no problem; but my attempt to trigger multiple actions has me stumped. Specifically, my attempt shown below to use rxjs merge to trigger multiple actions seems so standard and close to documented examples (see e.g. here) that I can't understand where I'm going wrong. Here is a breakdown of my code with typescript types indicated in comments; I've simplified things for the sake of problem clarity:
import { from } from "rxjs";
import { merge, mergeMap, map } from "rxjs/operators";
import { AnyAction } from "redux";
... // Setup related to redux-observable epics
function myEpic(
  action$: Observable<AnyAction>,
  state$: any,
  {fetchPromisedStrings}: any
) {
  return action$.pipe(
    ofType('MY_ACTION'),
    mergeMap(action => {
      const demoAction1: AnyAction = {type: 'FOO1', payload: 'BAR1'};
      const demoAction2: AnyAction = {type: 'FOO2', payload: 'BAR2'};

      const w = from(fetchPromisedStrings()); // const w: Observable<string[]>
      const x = w.pipe(map(() => demoAction1)); // const x: Observable<AnyAction>
      const y = w.pipe(map(() => demoAction2)); // const y: Observable<AnyAction>

      // My attempt to merge observables:
      const z = merge(x, y); // const z: OperatorFunction<{}, {} | AnyAction>

      // return x; // Works :)
      // return y; // Works :)
      return z; // Doesn't work :(
    })
  );
}
This code gives me the following error when I try to return z:
TypeError: You provided 'function (source) { return source.lift.call(_observable_merge__WEBPACK_IMPORTED_MODULE_0__["merge"].apply(void 0, [source].concat(observables))); }' where a stream was expected. You can provide an Observable, Promise, Array, or Iterable
So what's the problem here? I'd expect merge in the above code to take the two Observable<AnyAction> types and return a single Observable<AnyAction> type (or some more general type that's compatible with Observable<AnyAction>, which my epic needs in order to function correctly). Instead, I get some opaque type called OperatorFunction<{}, {} | AnyAction> that, evidently, isn't compatible with Observable<AnyAction>. Could someone please unmuddle my thinking here? Is merge not the operator I am supposed to use here, or is my entire rxjs-v6 pattern wrong for the (seemingly) simple goal of triggering multiple actions after fetching a promised resource?
Your code is OK, but you've imported the merge operator instead of the merge observable factory.
Just do:
import { merge } from 'rxjs';
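With the imports split accordingly, the snippet behaves the way you expected (same code as above; only the import lines change):

import { from, merge } from "rxjs";
import { mergeMap, map } from "rxjs/operators";

// merge(x, y) is now the static factory, so
// const z = merge(x, y); // const z: Observable<AnyAction>
// can be returned from mergeMap directly.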

Using JSZip to extract multiple KML files for Leaflet VectorGrid

The map uses KML files to generate a single geoJSON object to pass to VectorGrid's slicer function. To improve performance, the files are served as a single KMZ and extracted using the JSZip library. We then loop through each file (KML), parse it and convert to geoJSON. The features are concatenated to a separate array which is used to create a final geoJSON object (a cheap way of merging).
var vectorGrid;

JSZipUtils.getBinaryContent('/pathto/file.kmz', function (error, data) {
  JSZip.loadAsync(data).then(function (zip) {
    var featureArray = [];
    zip.forEach(function (path, file) {
      file.async('string').then(function (data) {
        // convert to geoJSON, concatenate features array
        featureArray = featureArray.concat(geoJSON.features);
      });
    });
    var consolidatedGeoJSON = {
      'type': 'FeatureCollection',
      'features': featureArray
    };
    vectorGrid = L.vectorGrid.slicer(consolidatedGeoJSON, options);
  });
});
The idea was that once that operation was complete, I could take the final geoJSON and simply pass it to the slicer. However, due to the nature of the promises, it was always constructing the slicer first and then parsing the files after.
To get around this, I was forced to put the slicer function inside the forEach, but inside an if statement checking if the current file is the last in the zip. This allows the vectors to be drawn on the map, but now I can't enable a hover effect on each layer separately (each KML contains a specific layer used as an area outline for interaction).
The vectorGrid slicer options allow you to specify a getFeatureId function, but I don't understand how to pass that id to the setFeatureStyle function in the event handlers.
The basic problem is that you try to assign a value to vectorGrid before featureArray has been filled. I think you need to use Promise.all(...). Something like this:
var zips = [];
zip.forEach(function (path, file) {
  zips.push(file.async('string'));
});

Promise.all(zips).then(function (data) {
  return data.map(function (value) {
    // value is the raw KML string here; convert it to GeoJSON
    // before taking its features
    return value.features;
  });
}).then(function (featureArray) {
  vectorGrid = L.vectorGrid.slicer(
      {type: 'FeatureCollection', features: featureArray}, options);
});
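For the "convert to geoJSON" step itself, here is a sketch of one way to fill that gap, assuming the togeojson library (toGeoJSON.kml()) and the browser's DOMParser; note the concat, since each KML file contributes an array of features:

Promise.all(zips).then(function (kmlStrings) {
  var featureArray = [];
  kmlStrings.forEach(function (kmlString) {
    // Parse the raw KML text, then convert it to GeoJSON.
    var dom = new DOMParser().parseFromString(kmlString, 'text/xml');
    featureArray = featureArray.concat(toGeoJSON.kml(dom).features);
  });
  vectorGrid = L.vectorGrid.slicer(
      {type: 'FeatureCollection', features: featureArray}, options);
});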

Export data to CSV in server pagination / sorting / filtering mode

I am trying to export the ag-grid data to CSV.
The issue is that it exports only the visible data, or the in-memory data received from the HTTP call (subject to paginationPageSize, maxBlocksInCache, cacheBlockSize, etc. in the grid), not the entire data set.
I went through the link below, but couldn't get much help from it:
[export] Export to CSV all pages in Client side Pagination
Is there any way we can achieve this, or is this altogether not possible?
This is how I solved it:
fetch all the rows you need from your data source
clone the gridApi object
grab the server-side cache from the cloned gridApi
process it so it's filled with your fetched data
run the export method on the cloned gridApi
...
PROFIT
const gapi = cloneDeep(this.gridApi); // clone gridApi
const blocks = gapi['serverSideRowModel'].rootNode.childrenCache.blocks; // object notation to suppress private warning/err

// swap the cached rows with the fetched data
for (let i = 0, j = 0; i < Math.ceil(results.length / this.paginationPageSize); i++) {
  // we alter the relevant block, or if it is not loaded yet we clone the 1st one and alter it
  const block = blocks[i] || cloneDeep(blocks[0]);
  block.rowNodes.forEach(n => n.data = results[j++]);
  blocks[i] = block;
}

gapi['serverSideRowModel'].rootNode.childrenCache.blocks = blocks;
gapi.exportDataAsExcel(params);
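Step 1 ("fetch all the rows") is whatever your backend exposes. A hypothetical sketch of the surrounding flow (the /api/all-rows endpoint is made up; exportDataAsCsv is the CSV counterpart of exportDataAsExcel if you need CSV specifically):

async exportAll(params) {
  // Fetch the complete, unpaginated data set from the server.
  const response = await fetch('/api/all-rows'); // hypothetical endpoint
  const results = await response.json();
  // ... run the cache-swapping code above with these results, then:
  // gapi.exportDataAsCsv(params); // or gapi.exportDataAsExcel(params)
}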