How to obtain the max % drawdown in the List of Trades?

In a strategy's List of Trades tab, I want to obtain the maximum value of the Drawdown % column.
This is not the Max Drawdown shown in the Overview tab.
Sadly, the Performance Summary does not offer this number either.
Any ideas?

Try this out:
// Max drawdown of a single closed trade, as a percent of the trade's entry value.
strategy_closedtrades_max_drawdown_percent(trade_num) =>
    endP = strategy.closedtrades.entry_price(trade_num) * math.abs(strategy.closedtrades.size(trade_num))
    prof = strategy.closedtrades.max_drawdown(trade_num)
    prof / endP * 100

// Drawdown % of the most recent closed trade.
lastDrawDown = strategy_closedtrades_max_drawdown_percent(strategy.closedtrades - 1)

// The max of the whole Drawdown % column: scan every closed trade.
maxDrawDown = 0.0
for i = 0 to strategy.closedtrades - 1
    maxDrawDown := math.max(maxDrawDown, strategy_closedtrades_max_drawdown_percent(i))
Cheers and best of luck with your coding and trading

Related

Ruby version of gamma.fit from scipy.stats

As the title suggests, I am trying to find a function that can take an array of floats and fit a distribution to my data.
From there I'll use it to find the CDF of new data I pass in.
I have installed and looked through the SciRuby Distribution and NArray docs, but nothing appears to match the 'fit' method.
The Python code looks like this:
from scipy import stats

# Approach 2: Model-based percentiles.
# Step 1: Find a Gamma distribution that fits your data.
alpha, _, beta = stats.gamma.fit(data, floc=0.)
# Step 2: Use that distribution's CDF to get percentiles.
scores = 100 - 100*stats.gamma.cdf(new_data, a=alpha, scale=beta)
print(scores)
Thank you in advance
After a deep dive into other packages and a lot of help from someone on the Cross Validated forum, I have the answer I needed.
To obtain the 'alpha' (shape) and 'beta' (rate) values of the gamma distribution, you first need to work out the variance of the data.
There are a few approaches to achieving this. See here for more information:
https://www.statisticshowto.com/probability-and-statistics/descriptive-statistics/sample-variance/
Code example:
data = [<insert your numbers>]

# Sample variance, computed from the sum of squares.
sum = data.sum
sum_square_mean = (sum**2) / data.size
all_square = data.map { |n| n**2 }.sum
net_square = all_square - sum_square_mean
minus_one = data.size - 1
variance = net_square / minus_one

# Method-of-moments estimates for the gamma parameters.
mean = data.sum(0.0) / data.size
mean_squared = mean**2
alpha = mean_squared / variance # shape
beta = mean / variance          # rate
theta = variance / mean         # scale (1 / rate)
The 'minus_one' line isn't strictly necessary, but dividing by n - 1 rather than n is standard in statistics to reduce bias; look up Bessel's correction. If you don't care about that, you can just take variance = net_square / data.size.
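For reference, this is just the textbook sample variance combined with method-of-moments matching for the gamma distribution; in symbols (my summary, not part of the original answer):

s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2, \qquad \alpha = \frac{\bar{x}^2}{s^2}, \qquad \beta = \frac{\bar{x}}{s^2}, \qquad \theta = \frac{s^2}{\bar{x}} = \frac{1}{\beta}

since a gamma distribution with shape \alpha and rate \beta has mean \alpha/\beta and variance \alpha/\beta^2.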
A second option uses the 'descriptive_statistics' gem:
require('descriptive_statistics')
# Note: the gem's variance does not apply Bessel's correction.
alpha = (data.mean**2) / data.variance
beta = data.mean / data.variance
theta = data.variance / data.mean
Once you have these values, you can use the cdf function from the Distribution gem (see its docs).
The next stage is to pass the values into this function, which will return a percentile.
Make sure to pass in '1 over beta' (the scale parameter) rather than beta itself (the rate), or it won't work:
percentile = 100 - (100 * Distribution::Gamma::Ruby_.cdf(x, alpha, 1 / beta))
You may have noticed I also calculated theta.
That was for a separate function that lets me go the other way: recover a value from my gamma distribution by passing in a probability. Used like so:
value = Distribution::Gamma.quantile(0.5, alpha, theta)
This function is also known as the 'inverse CDF', 'inverse cumulative distribution function', 'percent point function' or 'quantile function'. Here it is simply named 'quantile'.
For more information on gamma distributions, please see the Wikipedia article:
https://en.wikipedia.org/wiki/Gamma_distribution

Getting the correct output units from the PLOMB (Lomb-scargle periodogram) function

I am trying to analyze time series of wheel turns that were sampled at 1-minute intervals for 10 days. t is a 1 x 14000 array that goes from 0.1666 hours to 240 hours. analysis.timeseries.(grp).(chs) is a 1 x 14000 array for each of my groups of interest and their specific channels, giving the activity at each sampled minute. I'm interested in collecting the maximum power and the frequency it occurs at. My problem is that I'm not sure what units f comes out in. I would like it returned in cycles per hour, and to span up to a maximum period of 30 hours. I tried to use the Galileo example in the documentation as a guide, but it didn't seem to work.
Below is my code:
groups = {'GFF' 'GMF' 'SFF' 'SMF'};
chgroups = {chnamesGF chnamesGM chnamesSF chnamesSM};
t1 = (t * 3600); % t is in hours; MATLAB treats plain numbers as seconds, so convert
onehour = seconds(hours(1));
for i = 1:4
    grp = groups{1,i};
    chn = chgroups{1,i};
    for channel = 1:length(chn)
        chs = chn{channel,1};
        [pxx,f] = plomb(analysis.timeseries.(grp).(chs), t, 30/onehour, 'normalized');
        analysis.pxx.(grp).(chs) = pxx;
        analysis.f.(grp).(chs) = f;
        analysis.lsp.power.(grp).(chs) = max(pxx);
        [row,col,v] = find(analysis.pxx.(grp).(chs) == analysis.lsp.power.(grp).(chs));
        analysis.lsp.tau.(grp).(chs) = analysis.f.(grp).(chs)(row);
    end
end
Not really an answer, but it is hard to put an image in a comment.
Judging by the plomb page in the MATLAB manual, I think that pxx is dimensionless (you requested the 'normalized' spectrum), while f is a frequency, so its unit is 1/(unit of t). If your t is in hours, I would say h^-1, i.e. cycles per hour.
So I'd rather say try:
[pxx,f] = plomb(analysis.timeseries.(grp).(chs), t*30.0/onehour, 'normalized');
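If it helps, here is a minimal, untested sketch of how I would get cycles per hour directly; it assumes t is already expressed in hours (as the question states) and reuses the question's variable names:

x = analysis.timeseries.(grp).(chs);
% With t in hours, plomb returns f in cycles per hour.
fmax = 1;                          % search periods down to 1 hour
[pxx, f] = plomb(x, t, fmax, 'normalized');
keep = f >= 1/30;                  % discard periods longer than 30 hours
fk = f(keep);
[peakPower, loc] = max(pxx(keep));
peakFreq = fk(loc);                % frequency of the peak, cycles per hour
peakPeriod = 1/peakFreq;           % corresponding period in hours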

How to implement cumulative score in Matlab

Can someone explain to me what a "cumulative score" is and how to implement it in MATLAB?
I searched the net and found that the cumulative score is defined as the percentage of test images whose absolute error is no higher than a threshold t (in years, in this study).
I read in an article that the cumulative score is calculated as shown in the image.
I also used the "one category error" in my study, calculated as follows:
correct = abs(predict_label - test_label) <= 1; % error within one category
num_correct = length(find(correct));
accuracy2Svmk2 = (num_correct ./ length(test_label)) * 100;
Are these two metrics more or less the same?
Thank you.
Maybe you forgot the "1-":
accuracy2Svmk2 = (1 - (num_correct ./ length(test_label))) * 100;
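To make the definition concrete, here is a small sketch of a full cumulative score curve; it is my own illustration, reusing the question's predict_label and test_label variables, with CS(t) = percentage of samples whose absolute error is <= t:

thresholds = 0:10;                 % candidate error thresholds, in years
cs = arrayfun(@(t) mean(abs(predict_label - test_label) <= t) * 100, thresholds);
plot(thresholds, cs, '-o');
xlabel('error threshold t (years)');
ylabel('cumulative score CS(t) (%)');

At t = 1 this reduces to the asker's "one category error" formula, so the two metrics coincide at that threshold.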

average wind direction using histc matlab

Hello, this question might be easy, but I am struggling to get average wind directions for one year. I need hourly averages to compare with concentration measurements. My wind measurements are taken every minute, in degrees. My idea was to use the histc function in MATLAB to get the most common wind direction within each hour. This works for one hour, but how do I create a loop that gives me hourly values for a whole year?
Here is the code:
wdd = wind directions in degrees (vector of size 525600 for a year)
binranges = [0:10:360];
[bincounts,ind] = histc(wdd(1:60),binranges);
[num, idx] = max(bincounts(:));
wd_out = binranges(idx);
Kind regards, Matthias
How about this one -
binranges = [0:10:360];
% Bin each hour's 60 samples at once: reshape to 60 minutes x hours.
[bincounts,ind] = histc(reshape(wdd,60,[]),binranges);
[nums, idxs] = max(bincounts);
wd_out = binranges(idxs);
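As a side note (my addition): histc is discouraged in newer MATLAB releases, so a looped per-hour version with its replacement, histcounts, could look like this untested sketch:

binedges = 0:10:360;
wdd_phour = reshape(wdd, 60, []);        % assumes a whole number of hours
nhours = size(wdd_phour, 2);
wd_out = zeros(1, nhours);
for h = 1:nhours
    counts = histcounts(wdd_phour(:,h), binedges);
    [~, idx] = max(counts);
    wd_out(h) = binedges(idx);           % most common 10-degree bin that hour
end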
What I would do is:
wdd_phour = reshape(wdd,60,525600/60); % get a matrix of size 60 (minutes) x hours per year
mean_phour = mean(wdd_phour,1); % compute the average of each 60 minutes, for every hour in the year
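One caveat to add here (my addition, not the answerer's): a plain arithmetic mean misbehaves when directions wrap around 0/360 degrees; the mean of 350 and 10 should be 0, not 180. A vector (circular) mean avoids this:

wdd_phour = reshape(wdd, 60, []);        % 60 minutes x hours
u = mean(cosd(wdd_phour), 1);            % mean x-component per hour
v = mean(sind(wdd_phour), 1);            % mean y-component per hour
mean_dir = mod(atan2d(v, u), 360);       % hourly circular mean, in degrees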

choosing a percentage of a dataset

I am new to MATLAB and I can't find anything in the documentation for this. I have a method of sampling a dataset, but I was wondering how I can use a percentage rather than a fixed number of samples:
normIdx = strmatch('normal.', TestDataLabels);
normalSubset = Testdata(normIdx, :);
normal = randperm(size(normalSubset , 1));
p = (normal(1:10000))'; % here I choose 10000 samples but I would like to use a percentage
You mean like this?
pcnt = 75; % The percent of original data set size you wish your sample size to be
sampleN = ceil( (pcnt/100) * length(normal) ); % figure out what pcnt percent of original N is, and round upward
p = normal(1:sampleN)';
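An alternative sketch (my addition; it assumes MATLAB R2011b or later, where randperm accepts a second argument) draws exactly the rows you need in one call:

pcnt = 75;                               % desired sample size, in percent
n = size(normalSubset, 1);
k = ceil(pcnt/100 * n);                  % how many rows that percentage is
p = randperm(n, k)';                     % k unique row indices
sample = normalSubset(p, :);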