I have a .h5 file I want to import into MATLAB using the Keras/TensorFlow import tool, like this:
layers = importKerasLayers('myModel.h5');
But I get the following error:
Option to import Keras networks containing LSTM layers is not yet
supported.
layers =importKerasLayers('myModel.h5');
I've tried this in R2018a, and apparently all LSTM-related layers are available in this version once the support package is installed, but I keep getting the error. In this link, you can see the toolbox has support for LSTM layers, so I'm not sure what's causing the error.
Is there any workaround to solve this? What could be causing the error?
Your link is for the R2018b documentation. This is the R2018a documentation, and it shows no support for LSTM! So probably switch versions and try!
I have worked out an LSTM model and would like to incorporate it into a MATLAB framework. The Deep Learning Toolbox provides the functions importKerasLayers and importKerasNetwork for this.
Is there also a way to implement the model without the Deep Learning Toolbox?
With newer versions of MATLAB, it is possible to invoke Python from MATLAB. Check this URL: https://in.mathworks.com/help/matlab/matlab_external/create-object-from-python-class.html#mw_c224a09a-f56b-48e5-b9ec-145388506204
I'm not sure what you are trying to achieve by importing Keras inside MATLAB. I guess you may want to use MATLAB for your data preprocessing. If that is the case, you could complete the preprocessing separately and save the data in a numpy or pandas format to consume from Python.
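If you go that route, the hand-off between the two environments can be as simple as a file round trip. Below is a minimal sketch (assuming the MATLAB side exports to CSV, e.g. with writematrix; the file names are made up for illustration) of re-saving such an export in numpy format for the Python/Keras side:

```python
import numpy as np

# Hypothetical round trip: the MATLAB side exports preprocessed data as
# CSV (e.g. with writematrix); the Python side loads it and re-saves it
# in numpy format for the Keras pipeline. The CSV export is faked here
# with np.savetxt so the sketch is self-contained.
np.savetxt("preprocessed.csv", np.ones((4, 3)), delimiter=",")

# What you'd do with a real MATLAB export:
X = np.loadtxt("preprocessed.csv", delimiter=",")
np.save("preprocessed.npy", X)  # fast to reload on the Python side

print(np.load("preprocessed.npy").shape)  # (4, 3)
```

Binary .npy files load much faster than CSV, so this one-time conversion pays off if the Python side reads the data repeatedly.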
I'm currently using Keras to solve a regression problem.
And I want the weights of my layers (embedding layers).
I'm using layer.get_weights() from Keras, but it doesn't show me the full output.
I already tried save_weights(), but when I try to convert the HDF5 file it says it is empty.
Any solution?
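One common cause of an "incomplete" weight printout is simply numpy's summarized printing of large arrays, not missing data. A small sketch (the array here is a stand-in for what layer.get_weights()[0] would return):

```python
import numpy as np

# Stand-in for an embedding matrix from layer.get_weights()[0].
w = np.arange(2000).reshape(100, 20)

print(w)  # large arrays are summarized with "..." by default

# Raise the print threshold so every element is shown:
np.set_printoptions(threshold=np.inf)
print(w)

# Or persist the full array and inspect it elsewhere:
np.save("embedding_weights.npy", w)
```

If the printed output was the only problem, the weights were complete all along; saving them with np.save lets you verify every element independently of the console display.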
I'll start this post by saying that I acknowledge this may not be the appropriate venue for this question, but I wasn't sure where else to start. If there is a more appropriate SE channel, please feel free to suggest it.
I've been using Keras to learn how to apply neural networks to different prediction problems. I'm interested in learning TensorFlow as a way to gain a deeper understanding of the inner workings of these networks. Obviously, it's possible to switch the backend of Keras to TensorFlow and to use Keras as a high-level API to TensorFlow. However, is there a way to "recover" the TensorFlow code from a compiled Keras model? I'm thinking it would be extremely useful to be able to write a model that I'm familiar with in Keras, and automatically see its "translation" to TensorFlow as a way to learn this library more quickly.
Any thoughts or suggestions would be helpful. Thanks for reading.
All Keras does is abstract both Theano and TensorFlow into one unified backend module. It then uses the functions in that backend to implement the layers and methods you can use in Keras.
This in turn means that there is no compilation step involved in generating code for one particular backend. Both Theano and TensorFlow are Python libraries; there is no translation step, Keras just uses whichever library you specify.
The best way to find out how a Keras model would be written in TensorFlow is probably to search for a simple network trained on the same dataset and compare the TensorFlow and Keras examples. Another way would be to read the Keras code and look up each K.<function> in the TensorFlow backend module.
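To illustrate the dispatch pattern described above, here is a toy analogy in plain Python (this is NOT actual Keras code; the names are invented): layer logic is written only against a small backend interface, and the concrete implementation is chosen once, just as Keras picks Theano or TensorFlow at import time.

```python
# Two interchangeable "backends" implementing the same operation.
def fast_dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def loop_dot(a, b):
    total = 0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

BACKENDS = {"fast": fast_dot, "simple": loop_dot}

# Chosen once, like `from keras import backend as K`:
K_dot = BACKENDS["fast"]

# A "layer" written only against the backend function, never against
# a specific implementation:
def dense_forward(x, w):
    return K_dot(x, w)

print(dense_forward([1, 2, 3], [4, 5, 6]))  # 32 with either backend
```

Because the layer never touches a specific backend directly, swapping `"fast"` for `"simple"` changes nothing in the layer code, which is exactly why there is no per-backend "compiled code" to recover from a Keras model.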
If you are interested in the platform-specific code that the individual backends produce, e.g. the CUDA code, then the answer is: it depends. Both Theano and TensorFlow use temporary directories to store the generated code and sources. For Theano this is ~/.theano by default. But looking at this code will probably not make you any wiser about neural networks and their mechanics.
How can I run this example on my PC? I don't have an NVIDIA graphics card, so I cannot use CUDA in MATLAB.
I need to do it with MATLAB because half of my code is written in MATLAB and all the variables are in MATLAB format.
My PC has an ATI Radeon HD 4530 graphics card.
I read this page, but I still find it confusing to work out which option is suitable.
Update 1: I want to train a deep neural network for image classification, a task similar to this example.
Update 2: When I run the code mentioned in Update 1, it gives me the following error:
There is a problem with the CUDA driver or with this GPU device. Be sure that you have a supported GPU and that the
latest driver is installed.
Error in nnet.internal.cnn.SeriesNetwork/activations (line 48)
output = gpuArray(data);
Error in SeriesNetwork/activations (line 269)
YChannelFormat = predictNetwork.activations(X, layerID);
Error in DeepLearningImageClassificationExample (line 262)
trainingFeatures = activations(convnet, trainingSet, featureLayer, ...
Caused by:
The CUDA driver could not be loaded. The library name used was 'nvcuda.dll'. The error was:
The specified module could not be found.
Yes, you can. You will have to create DLLs and use OpenCL. Look into S-Functions and MEX.
Check the documentation
There are third-party tools that you may be able to use; I personally have never tried them.
Possible tool: MatConvNet. It works on both CPU and GPU.
MatConvNet is a MATLAB toolbox implementing Convolutional Neural Networks (CNNs) for computer vision applications. It is simple, efficient, and can run and learn state-of-the-art CNNs. Many pre-trained CNNs for image classification, segmentation, face recognition, and text detection are available.
Another option: Caffe in general, and the OpenMP variant of Caffe in particular, supports MATLAB and works on both CPU and GPU.
I was wondering if anyone has managed to use the OpenCV implementation of the Latent SVM Detector (http://docs.opencv.org/modules/objdetect/doc/latent_svm.html) successfully. There is sample code that shows how to use the library, but the problem is that it relies on a ready-made detector model that was generated using MATLAB. Can someone guide me through the steps of generating my own detector model?
The MATLAB implementation of LatentSVM by the authors of the paper has a training script called pascal. There is a README in the tarball explaining its usage:
Using the learning code
=======================
1. Download and install the 2006-2011 PASCAL VOC devkit and dataset.
(you should set VOCopts.testset='test' in VOCinit.m)
2. Modify 'voc_config.m' according to your configuration.
3. Start matlab.
4. Run the 'compile' function to compile the helper functions.
(you may need to edit compile.m to use a different convolution
routine depending on your system)
5. Use the 'pascal' script to train and evaluate a model.
example:
>> pascal('bicycle', 3); % train and evaluate a 6 component bicycle model
The learning code saves a number of intermediate models in a model cache
directory defined in 'voc_config.m'.
For more information, visit the authors' website. The page also contains the paper describing this method.