MATLAB mfcc gmdistribution fit for Speech Recognition Program

I'm new to MATLAB and doing a signal processing project (speech recognition). After doing some calculations, I get some values known as MFCCs (Mel-Frequency Cepstral Coefficients) in a matrix. I'm now supposed to fit a Gaussian Mixture Model (GMM) using the function gmdistribution.fit(X,k). But I keep getting the error
X must have more rows than columns.
I don't understand; how can I fix this? I tried transposing the matrix, but then I get other errors:
??? Error using ==> gmcluster at 181
Ill-conditioned covariance created at iteration 3.
Error in ==> gmdistribution.fit at 199
[S,NlogL,optimInfo] =...
My MFCC matrix generally has 13 rows and about 50-80 columns.
Any ideas on how to fix this? Should I be using only up to 12 columns at a time? Or what could be an alternative expectation-maximization (EM) algorithm to obtain a maximum likelihood (ML) estimate in speech recognition?
Here's a sample matrix that I get after extracting the mfcc feature vectors from the speech:
53.19162380493035 53.04536473593154 52.52404588266867 52.76558091790412 53.63907256262721 53.357790132994836 52.73205096524416 52.902995065027056 52.61096061282659 54.15474467851871 53.67444472478125 52.64177726437717 52.51697384592561 52.71137919365186 53.092851922453896 53.16427640450918 54.43019514688636 60.79640902129941 59.84919922646779 63.15389910551327 61.88723594060794 64.74826830389657 64.8349874832628 64.86278444375218 65.76126193531795 65.64589407152897 65.46920375829764 65.69178734432299 65.28831375816117 64.56074008418904 63.4966945660873 63.81859800557705 63.72800219675504 62.48994205815299 62.170438508902436 61.06563184036766 59.13583014975035 58.81335869501639 56.32130498897641 55.13711899166046 54.013505531107796 54.15759852717166 53.44176740036524 53.13219768600348 53.03407270007307 52.88271825256845 53.822163186509016 53.53892778841879 54.04538463287215 59.485371756367954 58.48009762761471 54.643413468895346 52.808848460884654 52.87392859698496 52.42111841679119 53.2365666558251 53.30622484832905 53.1799318016215 53.784807994410315 53.248067707554924 52.69122098296521 52.50131276155125 53.43030515391315 53.902384536061604 54.029570128176985 52.842675820980034 52.79731975873874 53.18695701339912
-10.209801833131205 -9.680631918902254 -9.62767876068187 -11.100788671331799 -12.214764051532008 -10.968305830999338 -9.860973825750351 -9.865056435511548 -10.658715794299441 -9.3596215435813 -11.6646716335442 -11.73183849207276 -12.378134406457027 -10.926012890327158 -11.620321504456165 -10.158285684702548 -9.264017760124812 -3.477686356268614 -3.34008367962826 -4.830538727398767 -2.000396004172366 -4.4851181728969225 -2.9033880784025152 -4.367902167404347 -4.497084603581041 -5.199683464056032 -5.906443970301479 -6.1194300184632855 -5.96250940992931 -6.359811770556116 -6.264817939973589 -4.895405335125048 -5.356838360441918 -6.327382452484718 -6.680325151391659 -6.17848037726304 -5.4759013940523245 -1.9841026636312946 -4.076294540940979 -7.824603409725002 -5.800269620602235 -8.01263214623702 -11.425250071230579 -10.277472714265365 -10.774573945280718 -11.322162485376891 -10.052477908307408 -10.004482396755566 -8.557096237262265 -7.319189335399103 -4.798868632345757 -10.203105092807693 -10.406716632774856 -11.067414745093817 -11.699111553041329 -10.749597806292954 -10.555273429092225 -8.854304279940754 -10.903698849240602 -10.234951031082241 -11.550994106255267 -11.295232804215324 -10.688554946454785 -9.208980407123816 -10.585845595336993 -10.757300448605834 -10.319608162526984 -10.551598424355781
-0.18311276580153307 -1.3000235617058096 -2.379404485976171 0.8537711039288245 0.7835891293988151 -0.786100291329253 1.0107138900981782 -0.12469382941718324 -2.2952791566222173 -0.8251663787748776 -0.050658777310996696 -4.6807361290865295 -3.3756455575107784 0.38895610612101605 -0.9962664893365839 -1.3680101462804826 -0.7328675082528926 14.930618844131613 11.172961105935304 16.974801313922335 13.375385369069916 14.024700863057664 14.594849346714536 17.610029847404075 16.601731375214815 15.581203919095396 15.429198596491359 15.842389728372694 16.162847697063377 17.262648834400064 18.2608582394078 19.38844125300681 16.858591012785013 16.93154670795065 12.906259456599424 13.056739996060314 11.258250889980491 8.834726263239137 6.184939770895715 4.068236554570518 2.184520358080839 3.6716311416454106 0.5890504959921528 -3.0455374126328874 -1.657407892408495 0.33660057466143056 -0.40801030148804557 0.04270808730635576 3.208411924734062 5.821481390407001 4.560967865706884 -0.9575473658761547 -1.9690622742411314 -1.4335363449433605 0.5073073427521086 1.8313651620152203 -2.1659200593772345 1.2769675752335854 -2.2873258303700696 -0.030049578085935582 -2.002440722711317 -2.3424337647822346 -4.259810095095228 -0.9747655920995262 0.09482704525635513 -0.2885341356828254 1.439149953470075 0.6807611595304401
2.087244713218005 -3.787403802296573 -4.665688240227797 0.46022874550890147 -0.16943798737784035 -2.7170563621342785 -1.7464303367036695 -3.27442943105816 -3.6318990907200597 -1.1574346481702122 -1.0207450052082863 -5.838249114276465 -4.864029691290982 -2.7443279494466704 -1.3475670289669839 -0.71926223394222 -1.7145131082739746 10.695036462762722 10.398176627688748 11.642258160333318 8.67660434911699 13.223576542483247 14.470121526018994 14.100543157086074 13.22291384069529 11.67823582796623 13.466476916853203 13.535357097626715 14.875339057135838 14.37083096189283 13.33673313953938 12.329553090328996 9.676373050790103 11.448653427990415 9.874926564656558 7.147530590070999 10.29584390330658 10.101141207939456 5.283325337013565 4.507665609590605 3.1555597807254223 1.176891149051998 -0.2017066100725112 -2.5074705794245427 3.7132131484813073 0.9607407688505634 -3.2742739297063865 -6.602070936837743 -2.2912280318564378 10.190482148210974 10.157945177713376 -0.09147003586407224 -5.244432802624313 -1.2872483780850776 -3.7378553488851147 2.853534940706138 -2.9599246290596257 -1.2759697907404983 -2.609173347676013 -0.027021884588768103 -2.3092682012995387 -1.4002697262020989 -4.192442987678205 -0.11708538059933485 -1.722764980370641 -0.8528543327485958 0.36818682029243044 -1.5833959315094956
-1.2340033668089612 -2.7554310519289933 1.4704457874837413 -0.4125243211298726 1.7297567688324673 3.721374587353874 -2.2232745236466402 -1.0295891117338212 1.021098021933131 -3.392544522126444 1.3301447592375433 -0.30182589581098784 -2.2645887723031413 0.5179073904608001 2.0537130718040917 -3.030349632233867 -2.107849434880047 -7.949976055283274 -5.172658838436902 -7.2904509401269575 -6.1323858833603815 -2.37546696444418 -2.6620539778383723 -3.5795807500300305 -4.687709564035536 -1.7454933814935076 -0.6827757483935794 0.23687223893178067 2.8267871613253077 3.5866135581831227 3.142665641927276 4.095262325494299 3.871285159350548 3.8703187080829764 3.8314236250858555 1.798983626211966 0.725468180389042 0.11919814479647405 2.7173707003940124 6.868690477210499 6.270964718280218 2.3176609494750564 2.0733820130334926 -0.8539453920978304 3.48931978155834 -2.6098957232427957 0.7925129692289851 -2.482250690121881 -1.9255950956807195 -3.3296568338000525 -2.5852039200206076 0.7513494304110043 1.6119079892129162 0.8581457406304087 1.4037071284373093 -3.163651849398714 5.052978402873416 2.4518824480379813 0.027602305580521395 0.7477958990121767 0.9232542431737198 -0.5545479544994354 -3.4480660326803503 1.0747263160741485 -4.078097840161742 4.485742151839941 0.1658605159666291 0.1722930547996016
-1.6428664752690114 3.7865726986742145 2.5318491820052564 -2.1947219298888676 -2.1237775233625986 2.598630953202959 -6.076201524281277 -5.315246911864284 -1.5747455209374586 -3.223379488606859 2.6008295264581776 1.3270506534986315 -2.5790744715346676 0.7756431623687378 3.0553271757777356 -0.20800002044634847 -1.530027153710214 -2.207970121996219 -1.8813636939941347 -2.685201388968379 -1.2497372042225408 2.5726591149003712 1.4779209530617206 0.18848939011950389 -0.8737068656038859 4.364271583896629 2.0338276700410187 4.017665258617117 2.929288856255161 10.031463178073729 7.807148474194119 8.930649791195147 9.356704480964387 4.682860624638529 3.9421955431659375 3.46979114616638 0.10907941624689588 1.013539556043216 1.380950812959332 1.077296756517698 4.643176114193134 0.276532579753215 1.3247848485761091 -1.6452351331258643 5.459080479943587 -2.623903958160855 -3.6495250981385525 0.30098983943901886 1.2192582165344557 3.9341748890807207 3.8902438441040768 2.3070835920696586 -2.692501110699399 1.6807838025217028 1.5259881694196216 0.3750392433389195 5.708674336592535 -1.1571072509634228 -1.9909829706185518 -2.911287549300028 -4.934348834333174 -2.258176779559039 0.17624511060134188 0.02295826196619305 -3.516972940169973 5.184345513656031 1.4594074325337887 -0.19794455729474633
2.362306464828889 1.8140886321872307 3.105122487428386 -2.452729932993756 -1.9482153346221507 0.23556664481369372 1.0605939999557794 9.466891504042334 4.485454438679325 2.6792667132201102 -0.7696085536288818 1.1799363148487811 -4.770207147524265 0.7773255533610134 -1.0253054017942649 5.364238239319841 3.1331011184169473 4.744685304867839 -0.052537238369118014 4.477806263589113 3.1539530991186067 6.4185233259645385 2.549990446321861 2.4829837421356564 4.089323590949597 7.9396405004582045 6.041498345508568 9.234608707932582 7.3843205505399885 10.495371462065135 15.043508733932194 8.70736248600434 13.199534350054295 9.807690741908354 9.182134815924455 12.06839623216329 7.974743468866006 12.349726591545481 5.750367027892127 -0.6482940009399485 5.4638120941442185 1.856389413910232 1.9530813300592067 -2.8701346921179733 1.558852931425583 -0.19366384484174437 -2.6386457918474457 1.4662219452543457 2.079641671534525 15.326629935694294 14.705559998054612 -0.06282946858494885 -1.827803410621235 3.114649202395378 0.3720781976421628 0.43011998686353536 -3.376799358785071 -1.5552531679484054 3.060902156478365 3.5360394473034553 -2.3908283396567356 0.6675611086499327 0.22711502816964574 -6.457828495248154 -0.6807474446526474 -0.6230980701736715 2.2692316872172476 -0.979235567032777
2.306823535295793 3.4952484194762055 5.910905884417197 -3.0627994884681873 -3.2217585242174294 -0.015187803494101149 -0.9514287527346498 3.114431724585367 0.42923281798814705 -3.189859804015462 -1.472673603923648 -3.036867739556342 -0.15973786580917693 -0.0905525722541792 2.330382174351248 2.7439958525955515 0.3730263667251821 -12.515523622378907 -13.343548342714616 -11.536760383050373 -8.307383651556634 -15.660481772806875 -14.155076207607415 -14.343032997039627 -11.791205489191787 -14.964231411185601 -13.183950294156357 -8.972526839374074 -5.366478645304655 -10.910217774510665 -1.5480767893424763 -8.888577773693916 -2.6255911360834023 -5.8588628908556695 -4.145564000313309 -2.984375697431632 0.8831077064431804 -5.243824833303439 5.196626588048474 6.352837095147023 1.2112116324076188 -2.9147691775934286 -2.6935780565318352 -2.810972986669758 4.9399646272914275 -1.1703117105056318 -2.402532372315127 -4.8461309660884675 -7.261524451953783 -2.5282219889051856 -1.0065282601086587 -2.5563997598612156 -4.351683980269447 -0.46252498899381495 -5.890633052969005 -0.3032076532083649 -0.6457938679695084 -0.455043482005029 3.359840875612215 -1.7228176367513395 -3.168976094613273 -2.5233843488620917 -6.495499983402964 -3.4972987525688515 0.7115283186290751 -2.581097605905542 0.6315410714331887 0.19502062594451325
1.2870172739850947 2.713157481924801 -0.5205380954882455 -4.658525381198428 -0.10827507866220412 2.4486415136057875 -0.2640204926534809 -0.09970608992954652 1.5082258768440102 -0.48148890836461583 6.911722876338505 -1.839425896561688 -3.841669694063511 -4.524554996776859 -0.9323811218879002 -6.12813923896959 -2.617633134059251 -6.309717724130619 -3.909047191185573 -6.705305972326263 -3.194505292603528 -7.893721876340621 -0.7610949447938617 -0.6090909340423546 1.4581855733113227 -2.41596099072141 -3.8541389118806912 -1.927700181895679 4.665459793274741 -2.132645903487048 4.157947245063189 0.11326683589817262 -1.162075689787945 1.055761599597126 3.298475882289032 0.9391848013866494 5.223274229835592 5.199193224601442 6.24812913948699 5.2190463423872515 1.5179114498579496 -0.6790185492512775 -0.31373376397636593 -3.5993965276962707 4.302535367682559 5.0068035330847005 -2.436072054028143 -0.8350201387276532 -2.018104375721472 0.5404586080558861 -2.428770201558009 -2.335732881592787 -0.052034561490399235 2.6353099398265676 -2.99995676341149 1.7399565653589897 -0.29483744276382473 4.957413374961816 5.6898464888615 -4.002464222625706 0.966133847419872 2.170532357744949 -2.4172124815273173 -5.913083394982123 -0.22652498917043715 0.138040634076645 -2.826152803587723 5.842509989192995
-6.149578124267104 -2.8288721761218962 -0.27284674336933024 -0.7388702321118317 6.111878602550777 3.359125556152289 1.2074835809541602 -2.229103203811113 -1.625118718284933 -0.2004222132512952 2.0932748099429754 0.712406626137792 0.43416711590137985 -5.55554193439384 -2.1786650973628827 -2.969057723871395 -6.618199451327406 2.299416281672153 5.007013248892597 -2.8033104103688027 -0.14925301159195922 -3.1533724522208697 -1.686316186073986 -0.08884837954280254 1.3265208802169017 1.3523930289041641 3.5524134648371395 -1.4254466520590146 -3.5611240333626477 1.0329276937146186 0.753052597154297 0.7975894394949765 -1.1854014340942607 1.1593797963914545 -0.8529267167794818 -5.171015036219429 4.116322136411159 -1.4483994704782983 -4.286164521201809 2.740046108799948 -3.5798763236060673 -3.018292657641495 -3.1806602684198966 -7.234273046469597 -9.434807181114692 -1.5847563989433828 -3.5635243742856346 4.782665786942992 3.5778211425622497 1.6853638633605281 1.7167799803768633 -1.6174055012561088 3.7435401900571574 -3.176593678259591 6.40495736593622 3.3331406463423483 -4.189245091250336 -1.1362166265192732 4.592859698246665 -2.8863334811724606 0.16041676401714375 4.737837256397985 -2.2744510630052366 1.4695485402180768 -4.897075450622638 1.0194864096015128 3.0757846367935398 1.489203230013674
-5.616870225243653 4.10940999519677 -0.3567822711722583 4.987855490462697 2.5632059692246143 -4.705396196410884 -0.1194996962733683 8.46869233605413 1.7788275688487483 -1.9527299063266377 -1.481085011956697 -1.0244613136295895 3.2992905241167114 -3.64385218716246 0.4426619512128128 -0.9239334997116153 -1.8620760850713798 -1.572039531941818 -10.036763755809012 -4.991528131941471 -7.136095340914314 -3.9318863449619683 -8.239368103131268 -8.443697887490892 -7.638579800501108 -8.460278636486919 2.042450826339361 -2.9885807367329646 -7.09364471308204 0.751496922690038 -0.7845673603407124 3.01935526513198 -1.39022538332522 -1.3101410638362037 -6.557786354682332 -10.172228179790066 -7.914321004354581 -5.649458806929109 2.0908760762554857 -1.4736963383710477 -1.1834278800206155 -0.6892124083994282 4.710875739605662 -3.269448539379895 -1.365967094144594 2.229881555767406 -0.9419137895352326 -0.48671864439322476 4.178896930726449 -6.953289505262448 -3.5225552311666406 -0.03841148260907753 0.14013269702442782 -6.512368259808616 1.8280649782849192 0.3454330974085145 -7.766620058704248 1.6650823954773208 9.615187994533223 3.360235349725343 0.22182808924480077 -0.30209172650913635 -1.6349262462057823 5.754809401078592 -1.6377375938940244 4.58705098784457 -2.404590707062002 0.45319882935997813
-3.730821551088958 1.1493694300690667 6.12342052964259 1.0160737493461047 4.543231805847945 -0.46099872305259204 -1.5594323941163388 10.090773095751917 5.028250117132579 1.5903687490782517 0.5749808655709501 -4.492674335179201 2.325703447395548 5.206408565021089 -4.9872461967223565 -6.549149325309605 2.90139977554803 -3.116490551862926 -8.703818668102071 -0.6313375630613844 -1.3155034176934333 1.1556044127857454 -0.9275062964334158 2.1324193244502876 3.430145051864411 4.086699745467884 5.480203425684989 1.3741912885959398 3.339835767680544 5.640295156144797 1.9610369474663063 1.785080274117643 1.8291947445479142 2.966205980470809 -0.12596430958161875 4.646073914100102 -0.7648039700071241 6.3484330647888605 4.459704396949977 1.5062484187054803 -1.6168718590653306 1.7558262745105164 1.2355091938620948 -9.312287204368275E-4 -0.5174901532050828 -3.0942917590395123 2.127834965233185 2.205667503405521 1.120114080459297 -1.7595270682165296 -9.083346980110788 -1.4981626322158839 0.7146008123272161 -0.6811098332417078 0.32703395934824275 -2.555380698176684 1.7740823756697832 4.5707670000209495 1.4842964294571344 4.818614788487457 3.1215801329358515 1.4479667080737233 1.1758507462380035 6.03230783411774 2.288914057777 4.82860171466599 -1.2457175363287405 -0.5058301430711261
-2.768473705667538 0.15564719507110275 -2.6550122323991947 -5.709488621527887 0.4785386384778287 0.6814858260993006 -5.52429514744985 -0.5602195429716864 3.9723119003523184 -5.62516538263036 -4.829570651115459 -1.2950948013109767 7.302412416568166 -3.043678812305364 -3.149850274277347 -6.476944546181209 -0.5807442791158823 -4.080078654055604 -3.1611933621382597 -0.11637063086775598 1.6049131611665592 5.044497534034215 0.3838925988521055 5.778293566481567 4.058620434329893 5.927479580737815 2.489198330275847 1.3107947997423626 1.5828295303331719 0.024839158566965516 -0.5476121359730696 0.87259267290178 0.9361180475548712 -1.5960762918622518 -5.611058251792273 -0.1594321010434905 -4.760816879788385 -0.07479939429503339 -1.7483043512234622 -2.8457787380793556 -1.7121754676101464 -3.787278050262899 3.7473965097918542 -1.659644247031472 -0.09111384850703107 2.4558095815874137 -0.06434581404575994 -3.7711115877495898 -0.2647997786903864 7.047915131872554 2.696723847584077 2.0890029827477234 -1.6825745638184928 -3.5887592066629557 -1.6594244317183802 -3.1951431164448874 3.27560938604933 2.334479543234365 2.9783519550285447 4.899933974871159 -2.2328606908007633 1.600105125583785 -2.1591853807024437 5.713548445622229 2.1891014794399264 -4.680943918675132 -2.5283217348396123 -2.6580555791689666

I don't understand; how can I fix this?
You need to transpose the matrix; you got that right. The feature vectors must be on the rows.
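That is, a minimal sketch (mfccs is an assumed name for your 13-by-N coefficient matrix):
% mfccs is 13-by-N (one column per frame); fit expects one observation per row
X = mfccs';                        % N-by-13: one frame per row
GMM = gmdistribution.fit(X, k);    % k mixture components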
Any ideas on how to fix this?
GMDISTRIBUTION implements the standard Expectation-Maximization (EM) algorithm. In some cases, it may converge to a solution which contains a singular or close-to-singular covariance matrix for one or more components. Such components usually contain a few data points almost lying in a lower-dimensional subspace. A solution with a singular covariance matrix is usually considered spurious. Sometimes this problem goes away if you try another set of initial values; sometimes it will always occur, for any of the following reasons:
The dimensionality of the data is relatively high, but there are not enough observations.
Some of the features (variables) of your data are highly correlated.
Some or all of the features are discrete.
You try to fit the data to too many components.
In your case, it seems that the number of components you used, 8, is too big. You can try to reduce the number of components. Generally, there are also other ways to avoid getting the "Ill-conditioned covariance matrix" error message, sketched in code after this list:
If you don't mind getting solutions with an ill-conditioned covariance matrix, you can use the 'Regularize' option of the GMDISTRIBUTION/FIT function to add a very small positive number to the diagonal of every covariance matrix.
You can specify the value of 'SharedCov' to be true to use an equal covariance matrix for every component.
You can specify the value of 'CovType' to be 'diagonal'.
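In code, the three workarounds look like this (a sketch; X and k as above, and the 'Regularize' value is an arbitrary small number):
GMM = gmdistribution.fit(X, k, 'Regularize', 1e-5);     % add 1e-5 to every covariance diagonal
GMM = gmdistribution.fit(X, k, 'SharedCov', true);      % one covariance shared by all components
GMM = gmdistribution.fit(X, k, 'CovType', 'diagonal');  % diagonal covariance matrices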
See also
http://www.mathworks.com/matlabcentral/newsreader/view_thread/168289
Should I be using only up to 12 columns at a time?
No.

@Shark
I had the same problem, trying to generate a Gaussian mixture model (GMM) object from a set of data.
I solved it by specifying the type of covariance:
GMM1 = gmdistribution.fit(X, k, 'CovType', 'diagonal')
GMM1 is the object name. You can find the meaning of X and k in
help gmdistribution.fit
If this doesn't work for you, try specifying the initial values of the EM algorithm
that gmdistribution already uses to generate the GMM; see the sketch after this answer.
Elios
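A minimal sketch of passing initial values, using the 'Start' structure fields documented for gmdistribution.fit (the particular initialization choices here are assumptions for illustration):
idx = randperm(size(X, 1));                 % pick k random observations as initial means
S.mu = X(idx(1:k), :);
S.Sigma = repmat(diag(var(X)), [1 1 k]);    % start each component at the data's diagonal covariance
S.PComponents = ones(1, k) / k;             % equal mixing proportions
GMM1 = gmdistribution.fit(X, k, 'Start', S);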

First of all, you should make the matrix linear, because it is too big for MATLAB to handle as-is; after that, it is better to take just 7-10 features (I think you get more than this).
After you have done your work, use the reshape function to put it back into the shape you want.

Related

Why is linear regression so poor in training using PYMC3

I am new to PyMC3. Maybe this is a naive question, but I searched a lot and didn't find any explanation for this issue.
Basically, I want to do a linear regression in PyMC3; however, the training is very slow and the model's performance on the training set is very poor as well. Below is my code:
import numpy as np
import pymc3 as pm
import seaborn as sns
import matplotlib.pyplot as plt
from theano import shared

X_tr = np.array([ 13.99802212, 13.8512075 , 13.9531636 , 13.97432944,
13.89211468, 13.91357953, 13.95987483, 13.86476587,
13.9501789 , 13.92698143, 13.9653932 , 14.06663115,
13.91697969, 13.99629862, 14.01392784, 13.96495713,
13.98697998, 13.97516973, 14.01048397, 14.05918188,
14.08342002, 13.89350606, 13.81768849, 13.94942447,
13.90465027, 13.93969029, 14.18771189, 14.08631113,
14.03718829, 14.01836206, 14.06758363, 14.05243539,
13.96287123, 13.93011351, 14.01616973, 14.01923812,
13.97424024, 13.9587175 , 13.85669845, 13.97778302,
14.04192138, 13.93775494, 13.86693585, 13.79985956,
13.82679677, 14.06474544, 13.90821822, 13.71648423,
13.78899668, 13.76857337, 13.87201756, 13.86152949,
13.80447525, 13.99609891, 14.0210165 , 13.986906 ,
13.97479211, 14.04562055, 14.03293095, 14.15178043,
14.32413197, 14.2330354 , 13.99247751, 13.92962912,
13.95394525, 13.87888254, 13.82743111, 14.10724699,
14.23638905, 14.15731881, 14.13239278, 14.13386722,
13.91442452, 14.01056255, 14.19378649, 14.22233852,
14.30405399, 14.25880108, 14.23985258, 14.21184303,
14.4443183 , 14.55710331, 14.42102092, 14.29047616,
14.43712609, 14.58666212])
y_tr = np.array([ 13.704, 13.763, 13.654, 13.677, 13.66 , 13.735, 13.845,
13.747, 13.747, 13.606, 13.819, 13.867, 13.817, 13.68 ,
13.823, 13.779, 13.814, 13.936, 13.956, 13.912, 13.982,
13.979, 13.919, 13.944, 14.094, 13.983, 13.887, 13.902,
13.899, 13.881, 13.784, 13.909, 13.99 , 14.06 , 13.834,
13.778, 13.703, 13.965, 14.02 , 13.992, 13.927, 14.009,
13.988, 14.022, 13.754, 13.837, 13.91 , 13.907, 13.867,
14.014, 13.952, 13.796, 13.92 , 14.051, 13.773, 13.837,
13.745, 14.034, 13.923, 14.041, 14.077, 14.125, 13.989,
14.174, 13.967, 13.952, 14.024, 14.171, 14.175, 14.091,
14.267, 14.22 , 14.071, 14.112, 14.174, 14.289, 14.146,
14.356, 14.5 , 14.265, 14.259, 14.406, 14.463, 14.473,
14.413, 14.507])
sns.regplot(x=X_tr, y=y_tr.flatten());
Here I use PyMC3 to train the model:
shA_X = shared(X_tr)
with pm.Model() as linear_model:
    alpha = pm.Normal("alpha", mu=14, sd=100)
    betas = pm.Normal("betas", mu=0, sd=100, shape=1)
    sigma = pm.HalfCauchy('sigma', beta=10, testval=1.)
    mu = alpha + betas * shA_X
    forecast = pm.Normal("forecast", mu=mu, sd=sigma, observed=y_tr)
    step = pm.NUTS()
    trace = pm.sample(3000, tune=1000)
and then check the performance:
ppc_w = pm.sample_ppc(trace, 1000, linear_model,
                      progressbar=False)
plt.plot(ppc_w['forecast'].mean(axis=0), 'r')
plt.plot(y_tr, color='k')
Why is the prediction on the training set so poor?
Any suggestions and ideas are appreciated.
This model is doing a fine job - I think the confusion is over how to handle the PyMC3 objects (thank you for the easy-to-work-with example, though!). In general, PyMC3 will be used to quantify the uncertainty in your model.
For example, trace['betas'].mean() is around 0.83 (this will depend on your random seed), while the least squares estimate that, for example, sklearn gives will be 0.826. Similarly, trace['alpha'].mean() gives 2.34, while the "true" value is 2.38.
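To reproduce those least squares numbers yourself, a quick sketch (assuming scikit-learn is installed; this is not part of the original code):
from sklearn.linear_model import LinearRegression

lr = LinearRegression().fit(X_tr.reshape(-1, 1), y_tr)
print(lr.coef_[0], lr.intercept_)  # least squares slope and intercept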
You can also use the trace to plot many different plausible draws for your line of best fit:
for draw in trace[::100]:
    pred = draw['betas'] * X_tr + draw['alpha']
    plt.plot(X_tr, pred, '--', alpha=0.2, color='grey')
plt.plot(X_tr, y_tr, 'o');
Note that these are drawn from the distribution of "best fits" for your data. You also used sigma to model the noise, and you can plot this value as well:
for draw in trace[::100]:
    pred = draw['betas'] * X_tr + draw['alpha']
    plt.plot(X_tr, pred, '-', alpha=0.2, color='grey')
    plt.plot(X_tr, pred + draw['sigma'], '-', alpha=0.05, color='red')
    plt.plot(X_tr, pred - draw['sigma'], '-', alpha=0.05, color='red')
plt.plot(X_tr, y_tr, 'o');
Using sample_ppc draws observed values from your posterior distribution, so each row of ppc_w['forecast'] is a reasonable way for the data to be generated "next time". You might use that object this way:
ppc_w = pm.sample_ppc(trace, 1000, linear_model,
                      progressbar=False)
for draw in ppc_w['forecast'][::5]:
    sns.regplot(X_tr, draw, scatter_kws={'alpha': 0.005, 'color': 'r'}, fit_reg=False)
sns.regplot(X_tr, y_tr, color='k', fit_reg=False);

3D transformation of matrix in Matlab

I want a MATLAB solution for the 3D transformation/rotation of a matrix which rotates the given vector in such a way that the initial points are changed by some angle but the final points stay the same. I have a P vector for this scenario.
What would be the suggestions to transform this matrix into further versions (more projectiles from one parent projectile), such that each changes the starting direction of its projectile while keeping the same final position to hit the target? I'm not very accurate in the drawing, as it is in 2D, but I want the same concept in 3D: here the target is the final position of the projectile, which is supposed to be the same in the whole scenario, while the starting points are transformed by some angle/direction derived from the parent projectile. I hope I'm making my case clear.
P vector:
P =
-21.8318 19.2251 -16.0000
-21.7386 19.1620 -15.9640
-21.6455 19.0988 -15.9279
-21.5527 19.0357 -15.8918
-21.4600 18.9727 -15.8556
-21.3675 18.9096 -15.8194
-21.2752 18.8466 -15.7831
-21.1831 18.7836 -15.7468
-21.0911 18.7206 -15.7105
-20.9993 18.6577 -15.6741
-20.9078 18.5947 -15.6377
-20.8163 18.5318 -15.6012
-20.7251 18.4689 -15.5647
-20.6340 18.4061 -15.5281
-20.5432 18.3432 -15.4915
-20.4524 18.2804 -15.4548
-20.3619 18.2176 -15.4181
-20.2715 18.1548 -15.3814
-20.1813 18.0921 -15.3446
-20.0913 18.0293 -15.3078
-20.0015 17.9666 -15.2709
-19.9118 17.9039 -15.2340
-19.8223 17.8412 -15.1970
-19.7329 17.7786 -15.1601
-19.6438 17.7160 -15.1230
-19.5547 17.6534 -15.0860
-19.4659 17.5908 -15.0489
-19.3772 17.5282 -15.0117
-19.2887 17.4656 -14.9745
-19.2004 17.4031 -14.9373
-19.1122 17.3406 -14.9001
-19.0241 17.2781 -14.8628
-18.9363 17.2156 -14.8254
-18.8486 17.1532 -14.7881
-18.7610 17.0907 -14.7507
-18.6736 17.0283 -14.7132
-18.5864 16.9659 -14.6758
-18.4994 16.9035 -14.6383
-18.4124 16.8412 -14.6007
-18.3257 16.7788 -14.5632
-18.2391 16.7165 -14.5255
-18.1526 16.6542 -14.4879
-18.0663 16.5919 -14.4502
-17.9802 16.5296 -14.4125
-17.8942 16.4673 -14.3748
-17.8084 16.4051 -14.3370
-17.7227 16.3429 -14.2992
-17.6372 16.2807 -14.2614
-17.5518 16.2185 -14.2235
-17.4665 16.1563 -14.1856
-17.3815 16.0941 -14.1477
-17.2965 16.0320 -14.1097
-17.2117 15.9698 -14.0718
-17.1271 15.9077 -14.0338
-17.0426 15.8456 -13.9957
-16.9582 15.7835 -13.9576
-16.8740 15.7214 -13.9196
-16.7899 15.6594 -13.8814
-16.7060 15.5973 -13.8433
-16.6222 15.5353 -13.8051
-16.5385 15.4733 -13.7669
-16.4550 15.4113 -13.7287
-16.3716 15.3493 -13.6904
-16.2884 15.2873 -13.6521
-16.2053 15.2253 -13.6138
-16.1223 15.1634 -13.5755
-16.0395 15.1014 -13.5372
-15.9568 15.0395 -13.4988
-15.8742 14.9776 -13.4604
-15.7918 14.9157 -13.4220
-15.7095 14.8538 -13.3835
-15.6273 14.7919 -13.3451
-15.5453 14.7301 -13.3066
-15.4634 14.6682 -13.2681
-15.3816 14.6063 -13.2295
For the definition of a proper rotation matrix R, use the functions rotx etc., or look up the formula on Wikipedia.
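For instance, a minimal sketch of R for a rotation by an angle theta about the z-axis (both the angle and the axis are assumptions for illustration):
theta = deg2rad(10);              % example rotation angle
R = [cos(theta) -sin(theta) 0;
     sin(theta)  cos(theta) 0;
     0           0          1];   % proper rotation about the z-axis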
% typically a point is a column vector; to match common definitions, transpose P
P = P';
% get center of rotation
C = P(:,end);
% translate
P = bsxfun(@minus, P, C);
% rotate
P = R*P;
% invert translation
P = bsxfun(@plus, P, C);

find corresponding peaks in matlab with 95% confidence interval

Suppose that we have following array:
0.196238259763928
0.0886250228175519
0.417543614272817
0.182403230538167
0.136500793051860
0.389922187581014
0.0344012946153299
0.381603315802419
0.0997542838649466
0.274807632628596
0.601652859233616
0.209431489000677
0.396925294300794
0.0351587496999554
0.177321874549738
0.369200511917405
0.287108838007101
0.477076452316346
0.127558716868438
0.792431584110476
0.0459982776925879
0.612598437936600
0.228340227044324
0.190267907472804
0.564751537228850
0.00269368929400299
0.940538666131177
0.101588565140294
0.426175626669060
0.600215481734847
0.127859067121782
0.985881201195063
0.0945679498528667
0.950077461673118
0.415212985598547
0.467423473845033
1.24336273213410
0.0848695928658021
1.84522775800633
0.289288949281834
1.38792131632743
1.73186592736729
0.554254947026916
3.46075557122590
0.0872957577705428
4.93259798197976
2.03544238985229
3.71059303259615
8.47095716918618
0.422940369071662
25.2287636895831
4.14535369056670
63.7312173032838
152.080907190007
1422.19492782494
832.134744027851
0.0220089962114756
60.8238733887811
7.71053463387430
10.4151913932115
11.3141744831953
0.988978595613829
8.65598040591953
0.219820300144944
3.92785491164888
2.28370963778411
1.60232807621444
2.51086405960291
0.0181622519984990
2.27469230188760
0.487809730727909
0.961063613990814
1.90435488292485
0.515640996120482
1.25933693517960
0.0953200831348589
1.52851575480462
0.582109930768162
0.933543409438383
0.717947488528521
0.0445235241119612
1.21157308704582
0.0942421028083462
0.536069075206508
0.821400666720535
0.308956823975938
1.28706199713640
0.0339217632187507
1.19575886464231
0.0853733920496230
0.736744959694641
0.635218502184121
0.262305581223588
0.986899895695809
0.0398800891449550
0.758792061180657
0.134279188964854
0.442531129290843
0.542782326712391
0.377221037448628
0.704787750202814
0.224180325609783
0.998785634315287
0.408055416702400
0.329684702125840
0.522384453408780
0.154542718256493
0.602294251721841
0.240357912028348
0.359040779285709
0.525224294805813
0.427539247203335
0.624034405807298
0.298184846094056
0.498659616687732
0.0962076792277457
0.430092706132805
0.656212420735658
0.278310520474744
0.866037361133916
0.184971060800812
0.481149730712771
0.624405636807668
0.382388147099945
0.435350646037440
0.216499523971397
1.22960953802959
0.330841706900755
0.891793067878849
0.628241046456751
0.278687691121678
1.06358076764171
0.365652714373067
1.34921178081181
0.652888708375276
0.861138633227739
1.02878577330537
0.591174450919664
1.93594290806582
0.497631035062465
1.14486512201656
0.978067581547298
0.948931658572253
2.01004088022982
0.917415940349743
2.24124811810385
1.42691656876436
2.15636037453584
1.92812357585099
1.12786835077183
4.81721425534142
1.70055431306602
4.87939454466131
3.90293284926105
5.16542230018432
10.5783535493504
1.74023535081791
27.0572221453758
7.78813114379733
69.2528169436690
167.769806437531
1490.03057130613
869.247150795648
3.27543244752518
62.3527480644562
9.74192115073051
13.6074209231800
10.5686495478844
7.70239986387120
9.62850426896699
9.85304975304259
7.09026325332085
12.8782040428502
16.3163128995995
7.00070066635845
74.1532966917877
4.80506505312457
1042.52337489620
1510.37374385290
118.514435606795
80.7915675273571
2.96352221859211
27.7825124315786
1.55102367292252
8.66382951478539
5.02910503820560
1.25219344189599
7.72195587189507
0.356973215117373
6.06702456628919
1.01953617014621
2.76489896186652
3.35353608882459
0.793376336025486
4.90341095941571
0.00742857354167949
5.07665716731356
1.16863474789604
4.47635486149688
4.33050121578669
2.42974020115261
9.79494608790444
0.0568839453395247
22.9153086380666
4.48791386399205
59.6962194708933
97.8636220152072
1119.97978883924
806.144299041605
7.33252581243942
57.0699524267842
0.900104994068117
15.2791339483160
3.31266162202546
3.20809490583211
5.36617545130941
0.648122925703121
3.90480316969632
0.0338850542128927
2.58828964019220
0.543604662856673
1.16385064506181
1.01835324272839
0.172915006573539
1.55998411282069
0.00221570175453666
1.14803074836796
0.0769335878967426
0.421762398811163
0.468260146832541
0.203765185125597
0.467641715366303
0.00142988680149041
0.698088976126660
0.0413316717103625
0.190548157914037
0.504713663418641
0.325697764871308
0.375910057283262
0.123307135682793
0.331115262928959
0.00263961045860704
0.204555648718379
0.139008751575803
0.182936666944843
0.154943314848474
0.0840483576044629
0.293075999812128
0.00306911699543199
0.272993318570981
0.0864711337990886
0.280495615619829
0.0910123210559269
0.148399626645134
0.141945002415500
0.0512001531781583
0.0295283557338525
In MATLAB it is very easy to find peaks using findpeaks, like so:
[pxx_peaks,location] = findpeaks(Pxx);
If we plot pxx_peaks, we get
plot(pxx_peaks)
Of course, besides these peaks, there are smaller peaks which are not shown on the picture, but my goal is to find all peaks which are 95-96% above all other peaks.
I have tried like this:
>> average = mean(pxx_peaks);
>> stand = std(pxx_peaks);
>> final_peaks = pxx_peaks( pxx_peaks > average + 3*stand );
The result of this is
>> final_peaks
final_peaks =
1.0e+03 *
1.4222
1.4900
1.5104
1.1200
but how do I return their corresponding locations? I want to write it as one m-file, so please help me.
EDIT
Also please help me with this question: can I parameterize the confidence interval? For instance, instead of 95%, I want to find peaks that are 60% above the other peaks; is that possible?
Note that 3σ ≈ 99.73%
As for your first question, it's easy, you just have to keep track of the locations in the same way as you do for the peaks:
inds = pxx_peaks > mean(pxx_peaks) + 3*std(pxx_peaks);
final_peaks = pxx_peaks(inds);
final_locations = location(inds);
plot(Pxx), hold on
plot(final_locations, final_peaks, 'r.')
As for your second question, that's a little more complicated. If you want to formulate it like you say, you'll have to convert a desired percentage to the correct number of σ. That involves an integration of the standard normal, and a root finding:
%// Convert confidence interval percentage to number-of-sigmas
F = @(P) fzero(@(sig) quadgk(@(x) exp(-x.^2/2), -sig, +sig)/sqrt(2*pi) - P/100, 1);
%// Repeat with the desired percentage
inds = pxx_peaks > mean(pxx_peaks) + F(63)*std(pxx_peaks); %// 63%
final_peaks = pxx_peaks(inds);
final_locations = location(inds);
plot(final_locations, final_peaks, 'r.')
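Equivalently, as a closed-form alternative to the fzero approach above, MATLAB's erfinv gives the same conversion without root finding, since P/100 = erf(sig/sqrt(2)):
F = @(P) sqrt(2) * erfinv(P/100);   %// e.g. F(95) is about 1.96, F(99.73) about 3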

Issues with using neural network

I am having an issue with using neural networks. I started with something simple. I just used nntool with one hidden layer (with one neuron) with a linear activation function. For the output, I also used the linear activation function. I just fed my Xs and Ys to the neural network tool and got the results on the testing set.
I compared that with the normal ridge function in MATLAB.
I could see that the neural network performed much worse than the ridge function. The first reason is that there are lots of negative values in the predictions, when my target is only positive. Ridge regression gave about 800 negative values while the NN gave around 5000 negative values, which totally ruined the accuracy of nntool.
Since nntool can be used to perform linear regression, why is nntool not performing as well as ridge regression?
What about the negative values: what is the reason behind them, and how can I enforce positive values?
Here is what I use to train the network
targets = dayofyear_targets(:, i+1);
net = newfit(train_data', targets', 5);   % feed-forward net with 5 hidden neurons
net.performFcn = 'mae';                   % mean absolute error performance function
net.layers{2}.transferFcn = 'purelin';    % linear output layer
net.trainParam.max_fail = 10;             % allowed validation failures before stopping
net.layers{1}.transferFcn = 'tansig';     % sigmoid hidden layer
net = train(net, train_data', targets');
results{n}(:, i) = sim(net, train_data');
Here is the link to my data https://www.dropbox.com/s/0wcj2y6x6jd2vzm/data.mat
Max value of target = 31347900
Min value of target = 12000
Std of target = 7.8696e+06
Mean of target = 1.6877e+07
Mean of input data(all features) = 0
Max of input features = 318.547660595906 170.087177689426 223.932169893425 168.036356568791 123.552142071032 119.203127702922 104.835054133360 103.991193950830 114.185533613098 89.9463148033190 146.239919217723 87.4695246220901 54.0670595471470 138.770752686700 206.797850609643 66.1464335873203 74.2115064643667 57.5743248336263 34.9080123850414 51.0189601377110 28.2306033402457 59.0128127003956 109.067637217394 307.093253638216 103.049923948310 62.8146642809675 200.015259541953 116.661885835164 62.5567327185901 53.8264756204627 58.8389745246703 176.143066044763 109.758983758653 60.7299481351038 58.6442946860097 46.1757085114781 336.346653669636 188.317461118279 224.964813627679 131.036150096149 137.154788108331 101.660743039860 79.4118778807977 71.4376953724718 90.5561535067498 93.4577679861134 336.454999007931 188.478832826684 225.143399783080 131.129689699137 137.344882971079 101.735403131103 79.4552027696783 71.4625401815804 90.6415702940799 93.4391513416449 143.529912145748 139.846472255779 69.3652595100658 141.229186078884 142.169055101267 61.7542261599789 152.193483162673 142.600096412522 100.923921522930 117.430577166104 95.7956956529542 97.2020336095432 53.3982366051064 67.5119662506151 51.6323341924432 45.0561119607012 42.9378617679366 129.976361335597 142.673696349981 80.3147691198763 71.3756376053104 63.4368122986219 44.5956741629581 53.4495610863871 58.7095984295653 45.6460094353149 39.1823704174863
Min of input features = -20.4980450089652 -216.734809594199 -208.377002401333 -166.153721182481 -164.971591950319 -125.794572986012 -120.613947913087 -90.5034237473168 -134.579349373396 -83.3049591539539 -207.619242338228 -61.4759254872546 -53.9649370954913 -160.211101485606 -19.9518644140863 -71.7889519995308 -53.8121825067231 -63.5789507316766 -51.1490159556167 -45.3464904959582 -31.6240685853237 -44.3432050298007 -34.2568293807697 -266.636505655523 -146.890814672460 -74.1775783694521 -132.270950595716 -98.6307112885543 -74.9183852672982 -62.4830008457438 -50.9507510122653 -140.067268423566 -93.0276674484945 -46.0819928136273 -59.2773430879897 -42.5451478861616 -31.2745435717060 -167.227723082743 -165.559585876166 -111.610031207207 -115.227936838215 -114.221934636009 -100.253661816324 -92.8856877745228 -86.1818201082433 -70.8388921500665 -31.4414388158249 -167.300019804654 -165.623030944544 -111.652804647492 -115.385214399271 -114.284846572143 -100.330328846390 -93.0745562342156 -86.1595126080268 -70.9022836842639 -255.769604133190 -258.123896542916 -55.1273177937196 -254.950820371016 -237.808870530211 -48.7785774080310 -213.713286177228 -246.086347088813 -125.941623423708 -116.383806139418 -79.2526295146070 -73.5322630343671 -59.5627573635424 -59.8471670606059 -64.6956071579830 -44.2151862981818 -37.8399444185350 -165.171915536922 -61.7557905578095 -97.6861764054228 -48.1218110960853 -57.4061842741057 -55.2734701363017 -45.7001129953926 -46.0498982933589 -40.8981619566775 -38.8963700558353
std of input features = 32.6229352625809 23.9923892231470 20.2491752921310 17.7607289226108 16.0041198617605 14.0220141286592 12.5650595823472 11.8017618129464 11.3556667194196 10.5382275790401 79.9955119586915 23.4033030770963 13.6077112635514 70.2453437964039 20.5151528145556 16.0996176741868 14.1570158221881 12.9623353379168 10.9374477375002 8.96886512490408 8.14900837031189 7.08031665751228 6.91909266176659 78.5294157406654 29.1855289103841 17.8430919295327 76.8762213278391 26.2042738515736 17.0642403174281 14.5812208141282 11.0486273595910 76.5345079046264 27.0522533813606 15.3463708931398 14.6265102381665 11.1878734989856 39.6000366966711 26.1651093368473 23.0548487219797 17.7418206149244 16.6414818214387 13.3202865460648 12.3418432467697 11.6967799788894 10.8462000495929 10.4594143594862 39.5881760483459 26.1554037787439 23.0480628017814 17.7384413542873 16.6400399748141 13.3209601910848 12.3455390551215 11.6986154850079 10.8471011424912 10.4616180751664 84.5166510619818 84.2685711235292 17.6461724536770 84.5782246722891 84.1536835974735 15.4898443616888 84.4295575869672 59.7308251367612 27.7396138514949 24.5736295499757 18.5604346514449 15.5172516938784 12.5038199620381 11.8900580903921 10.7970958504272 9.68255544149509 8.96604859535919 61.8751159200641 22.9395284949373 20.3023241153997 18.6165218063180 13.5503823185794 12.1726984705006 11.1423398921756 9.54944172482809 8.81223325514952 7.92656384557323
Number of samples = 5113
Dimension of input = 5113x83
Dimension of output = 5113x1
[Figure: actual target]
[Figure: predicted values]

How to skip NaN points when loading ASCII file?

I am trying to load an ASCII file that contains points which are not defined in the 2nd column on certain lines. When I do so, I get this error:
S = load('bond_order_correlation4A.dat')
??? Error using ==> load
Unknown text on line number 7 of ASCII file
C:\Users\VAIO\Desktop\MATLAB\R2010b\bin\bond_order_correlation4A.dat
"-nan+-nani".
How can I avoid this error? In other words, how can I skip reading the lines that have a NaN?
Any ideas?
Here are the lines of the data:
1.751500e+01 0.900636+0.000000i
1.854500e+01 0.910675+0.000000i
1.957500e+01 0.901020+0.000000i
2.060500e+01 0.866812+0.000000i
2.163500e+01 0.826753+0.000000i
2.266500e+01 0.736222+0.000000i
2.369500e+01 -nan+-nani
2.472500e+01 -nan+-nani
2.575500e+01 -nan+-nani
2.678500e+01 -nan+-nani
2.781500e+01 -nan+-nani
2.884500e+01 0.804500+0.000000i
2.987500e+01 0.863660+0.000000i
3.090500e+01 0.899600+0.000000i
3.193500e+01 0.912361+0.000000i
3.296500e+01 0.906553+0.000000i
3.399500e+01 0.883229+0.000000i
3.502500e+01 0.873248+0.000000i
3.605500e+01 0.903132+0.000000i
3.708500e+01 0.909807+0.000000i
3.811500e+01 0.904406+0.000000i
3.914500e+01 0.886968+0.000000i
4.017500e+01 0.860080+0.000000i
4.120500e+01 0.810715+0.000000i
4.223500e+01 -nan+-nani
4.326500e+01 -nan+-nani
4.429500e+01 -nan+-nani
4.532500e+01 0.812973+0.000000i
4.635500e+01 0.863783+0.000000i
4.738500e+01 0.895398+0.000000i
4.841500e+01 0.908204+0.000000i
4.944500e+01 0.908985+0.000000i
5.047500e+01 0.900171+0.000000i
5.150500e+01 0.882722+0.000000i
5.253500e+01 0.851140+0.000000i
5.356500e+01 0.890132+0.000000i
5.459500e+01 0.904564+0.000000i
5.562500e+01 0.908607+0.000000i
5.665500e+01 0.904241+0.000000i
5.768500e+01 0.891706+0.000000i
5.871500e+01 0.875118+0.000000i
5.974500e+01 0.844325+0.000000i
6.077500e+01 0.848961+0.000000i
6.180500e+01 0.883005+0.000000i
6.283500e+01 0.900617+0.000000i
6.386500e+01 0.907607+0.000000i
6.489500e+01 0.903102+0.000000i
6.592500e+01 0.903907+0.000000i
6.695500e+01 0.905971+0.000000i
6.798500e+01 0.901497+0.000000i
6.901500e+01 0.891710+0.000000i
7.004500e+01 0.873431+0.000000i
7.107500e+01 0.857750+0.000000i
7.210500e+01 0.892680+0.000000i
7.313500e+01 0.905379+0.000000i
7.416500e+01 0.907424+0.000000i
7.519500e+01 0.904534+0.000000i
7.622500e+01 0.891604+0.000000i
7.725500e+01 0.874679+0.000000i
7.828500e+01 0.880488+0.000000i
7.931500e+01 0.899794+0.000000i
8.034500e+01 0.908564+0.000000i
8.137500e+01 0.906300+0.000000i
8.240500e+01 0.898721+0.000000i
8.343500e+01 0.895449+0.000000i
8.446500e+01 0.900390+0.000000i
8.549500e+01 0.901614+0.000000i
8.652500e+01 0.896143+0.000000i
8.755500e+01 0.884075+0.000000i
8.858500e+01 0.860837+0.000000i
8.961500e+01 0.845785+0.000000i
9.064500e+01 0.883891+0.000000i
9.167500e+01 0.902221+0.000000i
9.270500e+01 0.905519+0.000000i
9.373500e+01 0.901589+0.000000i
9.476500e+01 0.892879+0.000000i
9.579500e+01 0.896607+0.000000i
9.682500e+01 0.900943+0.000000i
9.785500e+01 0.904287+0.000000i
9.888500e+01 0.901320+0.000000i
9.991500e+01 0.892640+0.000000i
1.009450e+02 0.884730+0.000000i
1.019750e+02 0.888384+0.000000i
1.030050e+02 0.895942+0.000000i
1.040350e+02 0.894981+0.000000i
1.050650e+02 0.887988+0.000000i
1.060950e+02 0.874380+0.000000i
1.071250e+02 0.853539+0.000000i
1.081550e+02 0.839207+0.000000i
1.091850e+02 0.867059+0.000000i
1.102150e+02 0.890265+0.000000i
1.112450e+02 0.899481+0.000000i
1.122750e+02 0.903685+0.000000i
1.133050e+02 0.899493+0.000000i
1.143350e+02 0.895112+0.000000i
1.153650e+02 0.896775+0.000000i
1.163950e+02 0.893841+0.000000i
1.174250e+02 0.885450+0.000000i
1.184550e+02 0.874641+0.000000i
1.194850e+02 0.871421+0.000000i
You could start by substituting "-nan+-nani" in your data file with "NaN" so that MATLAB can read it correctly, and then, as @juampa said, filter out the NaN values however you want in MATLAB.
To perform the substitution quickly and easily, I highly recommend vim. Just open the data file in vim and type:
:%s/-nan+-nani/NaN/g
This will do a global substitution, replacing "-nan+-nani" with "NaN" everywhere in the file without asking to confirm each one. If you want to confirm then change the above line to:
:%s/-nan+-nani/NaN/gc
MATLAB should be able to read the file and handle the "NaN" values as NaN.
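If you prefer to stay inside MATLAB rather than use vim, here is a minimal sketch of the same substitution (the cleaned output file name is an assumption):
txt = fileread('bond_order_correlation4A.dat');         % read the whole file as text
txt = strrep(txt, '-nan+-nani', 'NaN');                 % replace the unparsable tokens
fid = fopen('bond_order_correlation4A_clean.dat', 'w');
fwrite(fid, txt);
fclose(fid);
S = load('bond_order_correlation4A_clean.dat');         % load now parses the NaN rows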
For MATLAB, substitute those strings with nan or NaN (case insensitive), and it will import them correctly as NaN. Later you can filter them easily within MATLAB, e.g. deleting every row that contains a NaN (note that comparing against NaN with == never matches; use isnan instead):
M(any(isnan(M), 2), :) = [];