Machine Learning: Lorentzian Classification
█ OVERVIEW
A Lorentzian Distance Classifier (LDC) is a Machine Learning classification algorithm capable of categorizing historical data from a multi-dimensional feature space. This indicator demonstrates how Lorentzian Classification can also be used to predict the direction of future price movements when used as the distance metric for a novel implementation of an Approximate Nearest Neighbors (ANN) algorithm.
█ BACKGROUND
In physics, Lorentzian space is perhaps best known for its role in describing the curvature of space-time in Einstein's theory of General Relativity (2). Interestingly, however, this abstract concept from theoretical physics also has tangible real-world applications in trading.
Recently, it was hypothesized that Lorentzian space was also well-suited for analyzing time-series data (4), (5). This hypothesis has been supported by several empirical studies that demonstrate that Lorentzian distance is more robust to outliers and noise than the more commonly used Euclidean distance (1), (3), (6). Furthermore, Lorentzian distance was also shown to outperform dozens of other highly regarded distance metrics, including Manhattan distance, Bhattacharyya similarity, and Cosine similarity (1), (3). Outside of Dynamic Time Warping based approaches, which are unfortunately too computationally intensive for PineScript at this time, the Lorentzian Distance metric consistently scores the highest mean accuracy over a wide variety of time series data sets (1).
Euclidean distance is commonly used as the default distance metric for NN-based search algorithms, but it may not always be the best choice when dealing with financial market data. This is because financial market data can be significantly impacted by proximity to major world events such as FOMC Meetings and Black Swan events. This event-based distortion of market data can be framed as similar to the gravitational warping caused by a massive object on the space-time continuum. For financial markets, the analogous continuum that experiences warping can be referred to as "price-time".
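To make the contrast concrete, here is a minimal sketch in Python (the indicator itself is written in Pine Script) of the two distance functions over a feature vector; the feature values below are hypothetical:
import math

def euclidean_distance(a, b):
    # Standard Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def lorentzian_distance(a, b):
    # Lorentzian distance: sum of log(1 + |x - y|) over each feature.
    # The logarithm dampens the influence of large, event-driven differences.
    return sum(math.log(1 + abs(x - y)) for x, y in zip(a, b))

current = [55.0, 60.0]   # hypothetical feature values (e.g., RSI and CCI readings)
spiky   = [54.0, 160.0]  # similar, except one feature distorted by an event-driven spike
print(euclidean_distance(current, spiky))   # ~100.0: the spike dominates the distance
print(lorentzian_distance(current, spiky))  # ~5.3: the spike's contribution is log-compressed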
Below is a side-by-side comparison of how neighborhoods of similar historical points appear in three-dimensional Euclidean Space and Lorentzian Space:
This figure demonstrates how Lorentzian space can better accommodate the warping of price-time since the Lorentzian distance function compresses the Euclidean neighborhood in such a way that the new neighborhood distribution in Lorentzian space tends to cluster around each of the major feature axes in addition to the origin itself. This means that, even though some nearest neighbors will be the same regardless of the distance metric used, Lorentzian space will also allow for the consideration of historical points that would otherwise never be considered with a Euclidean distance metric.
Intuitively, the advantage inherent in the Lorentzian distance metric makes sense. For example, it is logical that the price action that occurs in the hours after Chairman Powell finishes delivering a speech would resemble at least some of the previous times when he finished delivering a speech. This may be true regardless of other factors, such as whether or not the market was overbought or oversold at the time or if the macro conditions were more bullish or bearish overall. These historical reference points are extremely valuable for predictive models, yet the Euclidean distance metric would miss these neighbors entirely, often in favor of irrelevant data points from the day before the event. By using Lorentzian distance as a metric, the ML model is instead able to consider the warping of price-time caused by the event and, ultimately, transcend the temporal bias imposed on it by the time series.
For more information on the implementation details of the Approximate Nearest Neighbors (ANN) algorithm used in this indicator, please refer to the detailed comments in the source code.
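As a rough illustration only (not the indicator's actual ANN implementation, which is documented in the source code), a nearest-neighbors vote using Lorentzian distance might look like the following Python sketch, with made-up feature vectors and labels:
import math

def lorentzian_distance(a, b):
    return sum(math.log(1 + abs(x - y)) for x, y in zip(a, b))

def knn_predict(history, current_features, k=8):
    # history: list of (feature_vector, label) pairs, where label is +1 if price
    # subsequently rose and -1 if it fell. The prediction is the aggregate vote
    # of the k nearest neighbors under the Lorentzian distance.
    ranked = sorted(history, key=lambda item: lorentzian_distance(item[0], current_features))
    vote = sum(label for _, label in ranked[:k])
    return 1 if vote > 0 else (-1 if vote < 0 else 0)

# Hypothetical usage with made-up normalized feature vectors
history = [([0.60, 0.40], 1), ([0.20, 0.80], -1), ([0.55, 0.45], 1), ([0.10, 0.90], -1)]
print(knn_predict(history, [0.58, 0.42], k=3))  # 1, i.e. a bullish vote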
█ HOW TO USE
Below is an explanatory breakdown of the different parts of this indicator as it appears in the interface:
Below is an explanation of the different settings for this indicator:
General Settings:
Source - This has a default value of "hlc3" and is used to control the input data source.
Neighbors Count - This has a default value of 8, a minimum value of 1, a maximum value of 100, and a step of 1. It is used to control the number of neighbors to consider.
Max Bars Back - This has a default value of 2000. It controls the maximum number of historical bars the algorithm will consider.
Feature Count - This has a default value of 5, a minimum value of 2, and a maximum value of 5. It controls the number of features to use for ML predictions.
Color Compression - This has a default value of 1, a minimum value of 1, and a maximum value of 10. It is used to control the compression factor for adjusting the intensity of the color scale.
Show Exits - This has a default value of false. It controls whether to show the exit threshold on the chart.
Use Dynamic Exits - This has a default value of false. It is used to control whether to attempt to let profits ride by dynamically adjusting the exit threshold based on kernel regression.
Feature Engineering Settings:
Note: The Feature Engineering section is for fine-tuning the features used for ML predictions. The default values are optimized for the 4H to 12H timeframes for most charts, but they should also work reasonably well for other timeframes. By default, the model can support features that accept two parameters (Parameter A and Parameter B, respectively). Even though there are only 4 features provided by default, the same feature with different settings counts as two separate features. If the feature only accepts one parameter, then the second parameter will default to EMA-based smoothing with a default value of 1. These features represent the most effective combination I have encountered in my testing, but additional features may be added as additional options in the future.
Feature 1 - This has a default value of "RSI" and options are: "RSI", "WT", "CCI", "ADX".
Feature 2 - This has a default value of "WT" and options are: "RSI", "WT", "CCI", "ADX".
Feature 3 - This has a default value of "CCI" and options are: "RSI", "WT", "CCI", "ADX".
Feature 4 - This has a default value of "ADX" and options are: "RSI", "WT", "CCI", "ADX".
Feature 5 - This has a default value of "RSI" and options are: "RSI", "WT", "CCI", "ADX".
Filters Settings:
Use Volatility Filter - This has a default value of true. It is used to control whether to use the volatility filter.
Use Regime Filter - This has a default value of true. It is used to control whether to use the trend detection filter.
Use ADX Filter - This has a default value of false. It is used to control whether to use the ADX filter.
Regime Threshold - This has a default value of -0.1, a minimum value of -10, a maximum value of 10, and a step of 0.1. It is used to control the Regime Detection filter for detecting Trending/Ranging markets.
ADX Threshold - This has a default value of 20, a minimum value of 0, a maximum value of 100, and a step of 1. It is used to control the threshold for detecting Trending/Ranging markets.
Kernel Regression Settings:
Trade with Kernel - This has a default value of true. It is used to control whether to trade with the kernel.
Show Kernel Estimate - This has a default value of true. It is used to control whether to show the kernel estimate.
Lookback Window - This has a default value of 8 and a minimum value of 3. It is used to control the number of bars used for the estimation. Recommended range: 3-50
Relative Weighting - This has a default value of 8 and a step size of 0.25. It is used to control the relative weighting of time frames. Recommended range: 0.25-25
Start Regression at Bar - This has a default value of 25. It is used to control the bar index on which to start regression. Recommended range: 0-25
Display Settings:
Show Bar Colors - This has a default value of true. It is used to control whether to show the bar colors.
Show Bar Prediction Values - This has a default value of true. It controls whether to show the ML model's evaluation of each bar as an integer.
Use ATR Offset - This has a default value of false. It controls whether to use the ATR offset instead of the bar prediction offset.
Bar Prediction Offset - This has a default value of 0 and a minimum value of 0. It is used to control the offset of the bar predictions as a percentage from the bar high or close.
Backtesting Settings:
Show Backtest Results - This has a default value of true. It is used to control whether to display the win rate of the given configuration.
█ WORKS CITED
(1) R. Giusti and G. E. A. P. A. Batista, "An Empirical Comparison of Dissimilarity Measures for Time Series Classification," 2013 Brazilian Conference on Intelligent Systems, Oct. 2013, DOI: 10.1109/bracis.2013.22.
(2) Y. Kerimbekov, H. Ş. Bilge, and H. H. Uğurlu, "The use of Lorentzian distance metric in classification problems," Pattern Recognition Letters, vol. 84, 170–176, Dec. 2016, DOI: 10.1016/j.patrec.2016.09.006.
(3) A. Bagnall, A. Bostrom, J. Large, and J. Lines, "The Great Time Series Classification Bake Off: An Experimental Evaluation of Recently Proposed Algorithms." ResearchGate, Feb. 04, 2016.
(4) H. Ş. Bilge, Yerzhan Kerimbekov, and Hasan Hüseyin Uğurlu, "A new classification method by using Lorentzian distance metric," ResearchGate, Sep. 02, 2015.
(5) Y. Kerimbekov and H. Şakir Bilge, "Lorentzian Distance Classifier for Multiple Features," Proceedings of the 6th International Conference on Pattern Recognition Applications and Methods, 2017, DOI: 10.5220/0006197004930501.
(6) V. Surya Prasath et al., "Effects of Distance Measure Choice on KNN Classifier Performance - A Review."
█ ACKNOWLEDGEMENTS
@veryfid - For many invaluable insights, discussions, and advice that helped to shape this project.
@capissimo - For open sourcing his interesting ideas regarding various KNN implementations in PineScript, several of which helped inspire my original undertaking of this project.
@RikkiTavi - For many invaluable physics-related conversations and for helping me develop a mechanism for visualizing various distance algorithms in 3D using JavaScript.
@jlaurel - For invaluable literature recommendations that helped me to understand the underlying subject matter of this project.
@annutara - For help in beta-testing this indicator and for sharing many helpful ideas and insights early on in its development.
@jasontaylor7 - For helping to beta-test this indicator and for many helpful conversations that helped to shape my backtesting workflow.
@meddymarkusvanhala - For helping to beta-test this indicator
@dlbnext - For incredibly detailed backtesting testing of this indicator and for sharing numerous ideas on how the user experience could be improved.
MLExtensions
Library "MLExtensions"
normalizeDeriv(src, quadraticMeanLength)
Returns the normalized derivative of the input series.
Parameters:
src : The input series (i.e., the first-order derivative for price).
quadraticMeanLength : The length of the quadratic mean (RMS).
Returns: nDeriv The normalized derivative of the input series.
normalize(src, min, max)
Rescales a source value with an unbounded range to a target range.
Parameters:
src : The input series
min : The minimum value of the unbounded range
max : The maximum value of the unbounded range
Returns: The normalized series
rescale(src, oldMin, oldMax, newMin, newMax)
Rescales a source value with a bounded range to another bounded range.
Parameters:
src : The input series
oldMin : The minimum value of the range to rescale from
oldMax : The maximum value of the range to rescale from
newMin : The minimum value of the range to rescale to
newMax : The maximum value of the range to rescale to
Returns: The rescaled series
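As a simplified Python sketch of the rescaling arithmetic described by normalize() and rescale() above (the Pine Script library works on running series rather than lists; the zero-division guard is an assumption):
def rescale(src, old_min, old_max, new_min, new_max):
    # Map a value from the range [old_min, old_max] to [new_min, new_max].
    old_range = max(old_max - old_min, 1e-10)  # guard against division by zero (assumed)
    return new_min + (new_max - new_min) * (src - old_min) / old_range

def normalize(values, new_min, new_max):
    # Rescale an unbounded series using its own observed min/max as the old range.
    lo, hi = min(values), max(values)
    return [rescale(v, lo, hi, new_min, new_max) for v in values]

print(rescale(75, 0, 100, 0, 1))          # 0.75
print(normalize([2.0, 5.0, 11.0], 0, 1))  # [0.0, 0.333..., 1.0]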
color_green(prediction)
Assigns varying shades of the color green based on the KNN classification
Parameters:
prediction : Value (int|float) of the prediction
Returns: color
color_red(prediction)
Assigns varying shades of the color red based on the KNN classification
Parameters:
prediction : Value of the prediction
Returns: color
tanh(src)
Returns the hyperbolic tangent of the input series. The sigmoid-like hyperbolic tangent function is used to compress the input to a value between -1 and 1.
Parameters:
src : The input series (i.e., the normalized derivative).
Returns: tanh The hyperbolic tangent of the input series.
dualPoleFilter(src, lookback)
Returns the smoothed hyperbolic tangent of the input series.
Parameters:
src : The input series (i.e., the hyperbolic tangent).
lookback : The lookback window for the smoothing.
Returns: filter The smoothed hyperbolic tangent of the input series.
tanhTransform(src, smoothingFrequency, quadraticMeanLength)
Returns the tanh transform of the input series.
Parameters:
src : The input series (i.e., the result of the tanh calculation).
smoothingFrequency : The frequency used for smoothing.
quadraticMeanLength : The length of the quadratic mean (RMS).
Returns: signal The smoothed hyperbolic tangent transform of the input series.
n_rsi(src, n1, n2)
Returns the normalized RSI ideal for use in ML algorithms.
Parameters:
src : The input series (i.e., the result of the RSI calculation).
n1 : The length of the RSI.
n2 : The smoothing length of the RSI.
Returns: signal The normalized RSI.
n_cci(src, n1, n2)
Returns the normalized CCI ideal for use in ML algorithms.
Parameters:
src : The input series (i.e., the result of the CCI calculation).
n1 : The length of the CCI.
n2 : The smoothing length of the CCI.
Returns: signal The normalized CCI.
n_wt(src, n1, n2)
Returns the normalized WaveTrend Classic series ideal for use in ML algorithms.
Parameters:
src : The input series (i.e., the result of the WaveTrend Classic calculation).
n1 : The channel length of the WaveTrend calculation.
n2 : The average length of the WaveTrend calculation.
Returns: signal The normalized WaveTrend Classic series.
n_adx(highSrc, lowSrc, closeSrc, n1)
Returns the normalized ADX ideal for use in ML algorithms.
Parameters:
highSrc : The input series for the high price.
lowSrc : The input series for the low price.
closeSrc : The input series for the close price.
n1 : The length of the ADX.
regime_filter(src, threshold, useRegimeFilter)
Parameters:
src : The source series.
threshold : The threshold for the regime filter.
useRegimeFilter : Whether to use the regime filter.
filter_adx(src, length, adxThreshold, useAdxFilter)
filter_adx
Parameters:
src : The source series.
length : The length of the ADX.
adxThreshold : The ADX threshold.
useAdxFilter : Whether to use the ADX filter.
Returns: Boolean indicating whether or not to let the signal pass through the filter.
filter_volatility(minLength, maxLength, useVolatilityFilter)
filter_volatility
Parameters:
minLength : The minimum length of the ATR.
maxLength : The maximum length of the ATR.
useVolatilityFilter : Whether to use the volatility filter.
Returns: Boolean indicating whether or not to let the signal pass through the filter.
backtest(high, low, open, startLongTrade, endLongTrade, startShortTrade, endShortTrade, isStopLossHit, maxBarsBackIndex, thisBarIndex)
Performs a basic backtest using the specified parameters and conditions.
Parameters:
high : The input series for the high price.
low : The input series for the low price.
open : The input series for the open price.
startLongTrade : The series of conditions that indicate the start of a long trade.
endLongTrade : The series of conditions that indicate the end of a long trade.
startShortTrade : The series of conditions that indicate the start of a short trade.
endShortTrade : The series of conditions that indicate the end of a short trade.
isStopLossHit : The stop loss hit indicator.
maxBarsBackIndex : The maximum number of bars to go back in the backtest.
thisBarIndex : The current bar index.
Returns: A tuple containing backtest values
init_table()
init_table()
Returns: tbl The backtest results.
update_table(tbl, tradeStatsHeader, totalTrades, totalWins, totalLosses, winLossRatio, winrate, stopLosses)
update_table(tbl, tradeStats)
Parameters:
tbl : The backtest results table.
tradeStatsHeader : The trade stats header.
totalTrades : The total number of trades.
totalWins : The total number of wins.
totalLosses : The total number of losses.
winLossRatio : The win loss ratio.
winrate : The winrate.
stopLosses : The total number of stop losses.
Returns: Updated backtest results table.
kNN
Library "kNN"
Collection of experimental kNN functions. This is a work in progress, an improvement upon my original kNN script.
The script can be recreated with this library. Unlike the original script, which used multiple arrays, this version has been reworked with the new Pine Script matrix features.
To make a kNN prediction, the following data should be supplied to the wrapper:
kNN : filter type. Right now either Binary or Percent. Binary works like in the original script: the system stores whether the price has increased (+1) or decreased (-1) since the previous knnStore event (called when either a long or short condition is supplied). Percent works the same, but the values stored are the price differences in percent. That way, larger differences in prices give higher scores.
k : number k. This is how many nearest neighbors are to be selected (and summed up to get the result).
skew : kNN minimum difference. Normally, the prediction is done with a simple majority of the neighbor votes. If skew is given, then more than a simple majority is needed for a prediction. This also means that there are inputs for which no prediction would be given (if the majority votes are between -skew and +skew). Note that in Percent mode more profitable trades will have higher voting power.
depth : kNN matrix size limit. Originally, the whole available history of trades was used to make a prediction. This not only requires more computational power, but also neglects the fact that the market conditions are changing. This setting restricts the memory matrix to a finite number of past trades.
price : price series
long : long condition. True if the long conditions are met, but filters are not yet applied. For example, in my original script, trades are only made on crossings of fast and slow MAs. So, whenever it is possible to go long, this value is set true. False otherwise.
short : short condition. Same as long, but for the short condition.
store : whether the inputs should be stored. Additional filters may be applied to prevent bad trades (for example, trend-based filters), so if you only need to consult kNN without storing the trade, this should be set to false.
feature1 : current value of feature 1. A feature in this case is some kind of data derived from the price. Different features may be used to analyse the price series. For example, oscillator values. Not all of them may be used for kNN prediction. As the current kNN implementation is 2-dimensional, only two features can be used.
feature2 : current value of feature 2.
The wrapper returns a tuple: [ longOK, shortOK ]. This is a pair of filters. When longOK is true, then kNN predicts a long trade may be taken. When shortOK is true, then kNN predicts a short trade may be taken. The kNN filters are returned whenever long or short conditions are met. The trade is supposed to happen when long or short conditions are met and when the kNN filter for the desired direction is true.
Exported functions :
knnStore(knn, p1, p2, src, maxrows)
Store the previous trade; buffer the current one until results are in. Results are binary: up/down
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
src : current price
maxrows : limit the matrix size to this number of rows (0 for no limit)
Returns: modified knn matrix
knnStorePercent(knn, p1, p2, src, maxrows)
Store the previous trade; buffer the current one until results are in. Results are in percents
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
src : current price
maxrows : limit the matrix size to this number of rows (0 for no limit)
Returns: modified knn matrix
knnGet(distance, result)
Get neighbours by getting k results with the smallest distances
Parameters:
distance : distance array
result : result array
Returns: array slice of k results
knnDistance(knn, p1, p2)
Create a distance array from the two given parameters
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
Returns: distance array
knnSum(knn, p1, p2, k)
Make a prediction, finding k nearest neighbours and summing them up
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
k : sum k nearest neighbors
Returns: sum of k nearest neighbors
doKNN(kNN, k, skew, depth, price, long, short, store, feature1, feature2)
execute kNN filter
Parameters:
kNN : filter type
k : number k
skew : kNN minimum difference
depth : kNN matrix size limit
price : series
long : long condition
short : short condition
store : store the supplied features (if false, only checks the results without storage)
feature1 : feature 1 value
feature2 : feature 2 value
Returns: filter output
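Putting the pieces above together, a conceptual Python sketch of the prediction flow (distance array, k smallest, summed vote, skew check) could look like the following; the Euclidean distance in feature space and the sample rows are assumptions, since the library itself stores rows in a Pine Script matrix:
def knn_distances(stored, p1, p2):
    # stored: list of (feature1, feature2, result) rows; Euclidean distance in 2-D feature space.
    return [((f1 - p1) ** 2 + (f2 - p2) ** 2) ** 0.5 for f1, f2, _ in stored]

def knn_sum(stored, p1, p2, k):
    # Sum the results of the k rows with the smallest distances (the neighbor vote).
    dists = knn_distances(stored, p1, p2)
    nearest = sorted(range(len(stored)), key=lambda i: dists[i])[:k]
    return sum(stored[i][2] for i in nearest)

def knn_filter(stored, p1, p2, k, skew=0.0):
    # Returns (longOK, shortOK): more than `skew` net votes are required either way.
    vote = knn_sum(stored, p1, p2, k)
    return vote > skew, vote < -skew

# Hypothetical stored trades in Binary mode: (feature1, feature2, result)
stored = [(30.0, 20.0, 1), (70.0, 25.0, -1), (32.0, 22.0, 1), (68.0, 24.0, -1)]
print(knn_filter(stored, 33.0, 21.0, k=3, skew=0))  # (True, False)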
Golden Slope
Golden Slope is an ATR-based trend tool that mixes in KNN machine learning to let you confirm your entries and exits, which can produce significantly more accurate signals.
Flag and rectangle signals are machine learning signals; they confirm an entry or exit position. You can use the entry and exit signals alone, but it is more accurate to confirm them with the machine learning signals. The idea is to either see a machine learning signal first and confirm it with a Golden Slope entry, or the other way around.
P.S. Watch out if a candle starts hitting the golden belly (the yellow area) after an entry signal is given, because it can indicate a reversal before the machine learning signals or the Golden Slope itself catch it, but these events happen rarely.
Machine Learning: kNN (New Approach)
Description:
kNN is a very robust and simple method for data classification and prediction. It is very effective if the training data is large. However, it is hampered by the difficulty of determining its main parameter, K (the number of nearest neighbors), beforehand. The computation cost is also quite high because we need to compute the distance of each instance to all training samples. Nevertheless, in algorithmic trading kNN is reported to perform on a par with such techniques as SVM and Random Forest. It is also widely used in the area of data science.
The input data is just a long series of prices over time without any particular features. The value to be predicted is just the next bar's price. The way that this problem is solved for both nearest neighbor techniques and for some other types of prediction algorithms is to create training records by taking, for instance, 10 consecutive prices and using the first 9 as predictor values and the 10th as the prediction value. Done this way, given 100 data points in your time series, you could create 10 different training records. It's possible to create even more training records than 10 by creating a new record starting at every data point. For instance, you could take the first 10 data points and create a record. Then you could take the 10 consecutive data points starting at the second data point, the 10 consecutive data points starting at the third data point, etc.
By default, only the 10 initial data points are shown as predictor values, with the 6th used as the prediction value.
Here is a step-by-step workthrough on how to compute K nearest neighbors (KNN) algorithm for quantitative data:
1. Determine parameter K = number of nearest neighbors.
2. Calculate the distance between the instance and all the training samples. As we are dealing with one-dimensional distance, we simply take the absolute value of the difference between the instance and each training value (|x − v|).
3. Rank the distances and determine the nearest neighbors based on the K-th minimum distance.
4. Gather the values of the nearest neighbors.
5. Use average of nearest neighbors as the prediction value of the instance.
The original logic of the algorithm was slightly modified, and as a result, at approximately N=17 the resulting curve nicely approximates that of the sma(20). See the description below. Besides the sma-like MA, this algorithm also gives you a hint on the direction of the next bar's move.
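A plain-Python walk-through of the five steps above for a one-dimensional instance, using made-up training records:
def knn_predict_1d(instance, training, k):
    # training: list of (predictor_value, outcome) pairs.
    ranked = sorted(training, key=lambda pair: abs(pair[0] - instance))  # steps 2-3: rank by |x - v|
    neighbors = ranked[:k]                                               # step 4: gather the K nearest
    return sum(outcome for _, outcome in neighbors) / k                  # step 5: average their outcomes

# Hypothetical training records: (last predictor price, next bar's price)
training = [(100.2, 100.5), (100.4, 100.1), (99.8, 100.0), (101.5, 101.9)]
print(knn_predict_1d(100.3, training, k=3))  # average of the 3 nearest outcomes: 100.2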
Machine Learning & Optimization Moving Average (Expo)
█ An indicator that finds the best moving average
We all know that market characteristics change over time; volatility, volume, momentum, etc., keep changing. Therefore, traders fine-tune their indicators and strategies to fit the constantly changing market. Unfortunately, that means there is no "best" MA period that suits all these conditions. That is why we have developed this algorithm that self-adapts and finds the best MA period based on Machine Learning and Optimization calculations.
This indicator helps traders and investors use the best possible moving average period on the selected timeframe and asset and ensures that the period is updated even as the market characteristics change over time.
█ Self-optimizing moving average
There is no doubt that different markets and timeframes need different MA periods. Therefore, our algorithm optimizes the moving average period within the given parameter range and optimizes its value based on either performance, win rate, or the combined results. The moving average period updates automatically on the chart for you.
Traders can choose to use our Machine Learning Algorithm to optimize the MA values or can optimize only using the optimization algorithm.
Performance
If you select to optimize based on performance, the calculation returns the period with the highest gains.
Winrate
If you select to optimize based on win rate, the calculation returns the period that gives the best win rate.
Combined
If you select to optimize based on combined results, the calculations score the performance and win rate separately and choose the best period with the highest ranking in both aspects.
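As an illustration of the optimization idea only, and not the Expo indicator's actual code, a brute-force search over MA periods that scores each period by performance, win rate, or a combined rank could be sketched in Python like this; the naive close-crosses-MA trading rule and all names are assumptions:
def sma(prices, period, i):
    return sum(prices[i - period + 1:i + 1]) / period

def score_period(prices, period):
    # Score one period with a naive rule: go long when the close is above its MA, exit below.
    trades, wins, total_return = 0, 0, 0.0
    entry = None
    for i in range(period, len(prices)):
        above = prices[i] > sma(prices, period, i)
        if above and entry is None:
            entry = prices[i]
        elif not above and entry is not None:
            trade_return = prices[i] / entry - 1
            total_return += trade_return
            trades += 1
            wins += trade_return > 0
            entry = None
    win_rate = wins / trades if trades else 0.0
    return total_return, win_rate

def best_period(prices, periods, mode="combined"):
    results = {p: score_period(prices, p) for p in periods}
    if mode == "performance":  # highest gains
        return max(periods, key=lambda p: results[p][0])
    if mode == "winrate":      # best win rate
        return max(periods, key=lambda p: results[p][1])
    # combined: rank each criterion separately and pick the best overall rank
    by_perf = sorted(periods, key=lambda p: results[p][0], reverse=True)
    by_wr = sorted(periods, key=lambda p: results[p][1], reverse=True)
    return min(periods, key=lambda p: by_perf.index(p) + by_wr.index(p))

# Usage (hypothetical): best_period(list_of_closes, list(range(10, 101, 5)), mode="combined")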
█ Finding the best moving average for any asset and timeframe
Traders can choose to find the best moving average based on price crossings.
█ Finding the best combination of moving averages for any asset and timeframe
Traders can choose to find the best crossing strategy, where the algorithm compares the 2 averages and returns the best fast and slow period.
█ Alerts
Traders can choose to be alerted when a new best moving average is found or when a moving average cross occurs.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Esqvair's Neural Reversal Probability Indicator
Introduction
Esqvair's Neural Reversal Probability Indicator is an indicator that shows the probability of a reversal.
Warning: This script should only be used on the 1-minute chart.
How to use
When a signal appears (by default it is a green bar), a reversal should be expected.
The signal appears when the indicator value >= Threshold.
If you want more signals, lower the threshold; if you want fewer, increase the threshold.
For some assets, like Forex pairs, you have to optimize the threshold yourself, but for most stocks, the default threshold works well.
How well a threshold fits an asset depends on the volatility of the asset.
For most assets, the indicator ranges from 35 to 75.
Settings
Smoothing - The default is 1, which means no smoothing. The indicator is smoothed by an SMA.
Threshold - The default of 71.0 is responsible for the occurrence of signals; read the "How to use" part to learn more.
The Indicator
This indicator is a pre-trained neural network that was trained outside of TradingView, and then its structure and weight values were converted to PineScript.
Warning: A neural network is a black box in the sense that although it can approximate any function, studying its structure will not give you any idea about the structure of the function being approximated.
Possible questions
Why does the indicator value range from 35 to 75 most of the time when the probability should range from 0 to 100?
-Due to some randomness in the markets, a neural network can never be 100% sure.
What data was used to train the neural network?
-This was BTCUSD 1 minute chart data from 02/05/2020 to 02/05/2022.
Where did you train the neural network and convert it to PineScript?
-I used a programming language that I know.
Tesla Coil ML
This is a re-implementation of @veryfid's wonderful Tesla Coil indicator to leverage basic Machine Learning Algorithms to help classify coil crossovers. The original Tesla Coil indicator requires extensive training and practice for the user to develop adequate intuition to interpret coil crossovers. The goal for this version is to help the user understand the underlying logic of the Tesla Coil indicator and provide a more intuitive way to interpret the indicator. The signals should be interpreted as suggestions rather than as a hard-coded set of rules.
NOTE: Please do NOT trade off the signals blindly. Always try to use your own intuition for understanding the coils and check for confluence with other indicators before initiating a trade.
Mean Shift Pivot Clustering
Core Concepts
According to Jeff Greenblatt in his book "Breakthrough Strategies for Predicting Any Market", Fibonacci and Lucas sequences are observed repeatedly in the bar counts from local pivot highs/lows. They occur from high to high, low to low, high to low, or low to high. Essentially, this phenomenon is observed repeatedly from any pivot points on any time frame. Greenblatt combines this observation with Elliott Waves to predict price and time reversals. However, I am no Elliottician, so it was not easy for me to use this in a practical manner. I decided to only use the bar count projections and ignore the price. I projected a subset of Fibonacci and Lucas sequences along with the Fibonacci ratios from each pivot point. As expected, a projection from each pivot point resulted in a large set of plotted data and looks like a huge gong show of lines. Surprisingly, I did notice clusters and have observed those clusters to be fairly accurate.
Fibonacci Sequence: 1, 2, 3, 5, 8, 13, 21, 34...
Lucas Sequence: 2, 1, 3, 4, 7, 11, 18, 29, 47...
Fibonacci Ratios (converted to whole numbers): 23, 38, 50, 61, 78, 127, 161...
Light Bulb Moment
My eyes may suck at grouping the lines together, but what about clustering algorithms? I chose to use a gimped version of Mean Shift because it doesn't require me to know in advance how many lines to expect, like K-Means does. Mean Shift is computationally expensive and, with Pinescript's 500ms timeout, I had to make do without the KDE. In other words, I skipped the weighting part, but I may try to incorporate it in the future. The code is from Harrison Kinsley. He's a fantastic teacher!
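For reference, a minimal one-dimensional mean shift without the KDE weighting, matching the "gimped" version described above, can be sketched in Python as follows; the flat window, convergence tolerance, and sample bar counts are assumptions:
def mean_shift_1d(points, radius, max_iter=100, tol=1e-6):
    # Shift each point to the mean of the original points within `radius` of it,
    # repeat until nothing moves, then collapse points that converged together.
    shifted = [float(p) for p in points]
    for _ in range(max_iter):
        moved = False
        for i, x in enumerate(shifted):
            neighbors = [p for p in points if abs(p - x) <= radius]
            new_x = sum(neighbors) / len(neighbors)
            if abs(new_x - x) > tol:
                moved = True
            shifted[i] = new_x
        if not moved:
            break
    centers = []
    for x in sorted(shifted):
        if not centers or abs(x - centers[-1]) > radius / 2:
            centers.append(x)
    return centers

# Hypothetical projected bar counts from several pivots
bars = [21, 23, 22, 34, 35, 55, 56, 57]
print(mean_shift_1d(bars, radius=3))  # cluster centers near 22, 34.5 and 56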
Usage
Search Radius: how far apart should the bars be before they are excluded from the cluster? Try to stick with a figure between 1-5. Too large a figure will give meaningless results.
Pivot Offset: looks left and right X number of bars for a pivot. Same setting as the default TradingView pivot high/low script.
Show Lines Back: show historical predicted lines. (These can change)
Use this script in conjunction with Fibonacci price retracement/extension levels and/or other support/resistance levels. If it's nowhere near a support/resistance level and there's a projected time pivot coming up, it's probably a fake out.
Notes
Re-painting is intended. When a new pivot is found, it will project out the Fib/Lucas sequences so the algorithm will run again with additional information.
The script is for informational and educational purposes only.
Do not use this indicator by itself to trade!
Unreal Algo [UPRIGHT] (cc)
Hello Traders,
It's finally that time, I'm releasing my baby out into the world.
Unreal Algo is the answer to the question you didn't know you were asking.
It's for beginners and advanced traders alike. I've made the settings very customizable, but also easy to just jump right in.
How it works:
It uses tons of calculations, confirmations, and filters to bring you the most accurate predictive algorithm possible. The algo will automatically adjust to different volatility in the market to still provide accurate signals and confirmation. It will automatically show support and resistance in real-time. A Moving Average cloud with speeds varying from extra fast to slow helps traders confirm whether they should stay in the trade. Also, I added 2 stoplosses, because the importance of risk management should always be emphasized even with strong accuracy.
Features:
---The Most Accurate Signals on the planet.
--------Buy/Sell, Up/Down direction change, and Red/Green arrows.
--- MA cloud with beautiful color blend that can act as a confirmation of direction.
-------- 17 different types/versions of moving Averages to choose from.
--------Easy line transparency and toggle adjustments.
--------Easy cloud transparency adjustments.
--- Support and Resistance .
--- Advanced PSAR that will show red when bearish while in a bullish trend, and vice versa.
---Potential Orderblocks that can be extended to show a grid (adding additional support/resistance information).
--- Fibonacci Lines.
--- Pivot bar that changes colors based on pivot direction.
---Resistance Breakout and Support Breakdown Signals .
--- Relative volume & momentum bar coloring.
---Two Separate Stoplosses .
--------Circles change color and flip to top and red for Short, bottom and green for long.
--------Horizontal stoploss that tracks the price and flags to take profit. White for Long and Yellow for short.
---As always... Fully customizable .
Different customization options:
Without stoplosses and Support/Resistance.
Without Support/Resistance, arrows and psar removed.
Added back Support/Resistance, lightened MA cloud
Fully loaded (minus trailing stoploss)
[UPRIGHT Trading] MoneyFlowTrend Oscillator (cc) Premium
Hey Traders,
Tonight I'm updating my beloved original MoneyFlowTrend Oscillator with a Premium version.
A little background:
This is an indicator that I've been working to bring to life for years; learning pinescript code has allowed me to do just that.
Built on the idea of Supply & Demand Zones, this utilizes money flow and numerous calculations to create a picture of what is happening underneath the surface of the price action.
Richard Wyckoff was one of the first market analysts to explain how the economic cycle can be applied to explain market price action; thus, technical analysis. He described two zones among the total of 4 phases; the two zones are Distribution and Accumulation zones, also known as Supply & Demand zones.
______________________________
Since most of you already know the economic cycle, I will try to be concise.
The basic ideas:
When supply > demand, the price goes down.
When demand > supply, price goes up.
When demand = supply, the price stays about the same (going sideways).
Price action has uptrends, downtrends, and price ranges (consolidation).
Wyckoff's 4 phases to explain this price action:
1) Accumulation (Demand zone)
2) Markup (Uptrend)
3) Distribution (Supply zone)
4) Markdown (Downtrend)
______________________________
With all that said, usually you will either see a sharp jump from a supply or demand zone, or the price will consolidate within it until a new zone is formed on the chart.
This indicator attempts to put all of that into a lower indicator. I tried to separate the retailers and the banks and then put them back together to get a full picture.
Premium:
-Even MORE (quality & quantity) Accurate signals.
-Reversal Signal added (Circle- shown on chart)
-Cleaner Scaling and Organization.
The chart shown above should look like this:
Good luck traders.
Cheers,
Mike
(UPRIGHT Trading)
WIPNNetwork
Library "WIPNNetwork"
This is a work in progress (WIP) and prone to have some errors, so use at your own risk...
Let me know if you find any issues.
Method for a generalized Neural Network.
network(x) Generalized Neural Network Method.
Parameters:
x : TODO: add parameter x description here
Returns: TODO: add what function returns
FunctionNNLayer
Library "FunctionNNLayer"
Generalized Neural Network Layer method.
function(inputs, weights, n_nodes, activation_function, bias, alpha, scale) Generalized Layer.
Parameters:
inputs : float array, input values.
weights : float array, weight values.
n_nodes : int, number of nodes in layer.
activation_function : string, default='sigmoid', name of the activation function used.
bias : float, default=1.0, bias to pass into activation function.
alpha : float, default=na, if required to pass into activation function.
scale : float, default=na, if required to pass into activation function.
Returns: float
FunctionNNPerceptron
Library "FunctionNNPerceptron"
Perceptron Function for Neural networks.
function(inputs, weights, bias, activation_function, alpha, scale) generalized perceptron node for Neural Networks.
Parameters:
inputs : float array, the inputs of the perceptron.
weights : float array, the weights for inputs.
bias : float, default=1.0, the default bias of the perceptron.
activation_function : string, default='sigmoid', activation function applied to the output.
alpha : float, default=na, if required for activation.
scale : float, default=na, if required for activation.
@outputs float
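A simplified Python sketch of the layer and perceptron idea described by these two libraries (the real functions are Pine Script and also take an activation name, alpha and scale); the weights and inputs below are hypothetical:
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def perceptron(inputs, weights, bias=1.0, activation=sigmoid):
    # Weighted sum of the inputs plus the bias, passed through the activation function.
    weighted_sum = sum(i * w for i, w in zip(inputs, weights)) + bias
    return activation(weighted_sum)

def layer(inputs, weights_per_node, bias=1.0, activation=sigmoid):
    # One output per node; each node holds its own weight vector.
    return [perceptron(inputs, w, bias, activation) for w in weights_per_node]

# Hypothetical 2-input, 3-node layer
print(layer([0.5, -0.2], [[0.1, 0.4], [-0.3, 0.8], [0.7, 0.2]]))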
MLActivationFunctions
Library "MLActivationFunctions"
Activation functions for Neural networks.
binary_step(value) Basic threshold output classifier to activate/deactivate neuron.
Parameters:
value : float, value to process.
Returns: float
linear(value) Input is the same as output.
Parameters:
value : float, value to process.
Returns: float
sigmoid(value) Sigmoid or logistic function.
Parameters:
value : float, value to process.
Returns: float
sigmoid_derivative(value) Derivative of sigmoid function.
Parameters:
value : float, value to process.
Returns: float
tanh(value) Hyperbolic tangent function.
Parameters:
value : float, value to process.
Returns: float
tanh_derivative(value) Hyperbolic tangent function derivative.
Parameters:
value : float, value to process.
Returns: float
relu(value) Rectified linear unit (RELU) function.
Parameters:
value : float, value to process.
Returns: float
relu_derivative(value) RELU function derivative.
Parameters:
value : float, value to process.
Returns: float
leaky_relu(value) Leaky RELU function.
Parameters:
value : float, value to process.
Returns: float
leaky_relu_derivative(value) Leaky RELU function derivative.
Parameters:
value : float, value to process.
Returns: float
relu6(value) RELU-6 function.
Parameters:
value : float, value to process.
Returns: float
softmax(value) Softmax function.
Parameters:
value : float array, values to process.
Returns: float
softplus(value) Softplus function.
Parameters:
value : float, value to process.
Returns: float
softsign(value) Softsign function.
Parameters:
value : float, value to process.
Returns: float
elu(value, alpha) Exponential Linear Unit (ELU) function.
Parameters:
value : float, value to process.
alpha : float, default=1.0, predefined constant, controls the value to which an ELU saturates for negative net inputs.
Returns: float
selu(value, alpha, scale) Scaled Exponential Linear Unit (SELU) function.
Parameters:
value : float, value to process.
alpha : float, default=1.67326324, predefined constant, controls the value to which an SELU saturates for negative net inputs.
scale : float, default=1.05070098, predefined constant.
Returns: float
exponential(value) Pointer to math.exp() function.
Parameters:
value : float, value to process.
Returns: float
function(name, value, alpha, scale) Activation function.
Parameters:
name : string, name of activation function.
value : float, value to process.
alpha : float, default=na, if required.
scale : float, default=na, if required.
Returns: float
derivative(name, value, alpha, scale) Derivative Activation function.
Parameters:
name : string, name of activation function.
value : float, value to process.
alpha : float, default=na, if required.
scale : float, default=na, if required.
Returns: float
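For reference, here are Python equivalents of a few of the listed activations plus a name-based dispatcher in the spirit of function(name, value, alpha, scale); the leaky-ReLU slope is an assumption, while the ELU/SELU defaults mirror the values listed above:
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    return max(0.0, x)

def leaky_relu(x, slope=0.01):  # the 0.01 negative slope is an assumption
    return x if x > 0 else slope * x

def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x, alpha=1.67326324, scale=1.05070098):
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def activation(name, value, alpha=None, scale=None):
    # Name-based dispatcher, mirroring the idea of function(name, value, alpha, scale).
    if name == "sigmoid":
        return sigmoid(value)
    if name == "tanh":
        return math.tanh(value)
    if name == "relu":
        return relu(value)
    if name == "leaky relu":
        return leaky_relu(value)
    if name == "elu":
        return elu(value, 1.0 if alpha is None else alpha)
    if name == "selu":
        return selu(value,
                    1.67326324 if alpha is None else alpha,
                    1.05070098 if scale is None else scale)
    raise ValueError("unknown activation: " + name)

print(activation("selu", -1.0))  # about -1.111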
MLLossFunctions
Library "MLLossFunctions"
Methods for Loss functions.
mse(expects, predicts) Mean Squared Error (MSE) " MSE = 1/N * sum ((y - y')^2) ".
Parameters:
expects : float array, expected values.
predicts : float array, prediction values.
Returns: float
binary_cross_entropy(expects, predicts) Binary Cross-Entropy Loss (log).
Parameters:
expects : float array, expected values.
predicts : float array, prediction values.
Returns: float
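A short Python sketch of the two losses as described above (the epsilon clamp in the cross-entropy is an assumption added to avoid log(0)):
import math

def mse(expects, predicts):
    # Mean Squared Error: MSE = 1/N * sum((y - y')^2)
    return sum((y - p) ** 2 for y, p in zip(expects, predicts)) / len(expects)

def binary_cross_entropy(expects, predicts, eps=1e-12):
    # Log loss for binary targets; eps keeps predictions away from exactly 0 or 1 (assumed).
    total = 0.0
    for y, p in zip(expects, predicts):
        p = min(max(p, eps), 1.0 - eps)
        total += y * math.log(p) + (1.0 - y) * math.log(1.0 - p)
    return -total / len(expects)

print(mse([1.0, 0.0, 1.0], [0.9, 0.2, 0.8]))                   # 0.03
print(binary_cross_entropy([1.0, 0.0, 1.0], [0.9, 0.2, 0.8]))  # about 0.184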
M.Right Awesome RSI+ (cc)
Hey Traders,
Tonight I figured I'd release a special indicator that I've had in the works for years and was finally able to piece together using Pine. It's an extremely accurate take on the RSI. I plan to continue to refine the indicator and add more features, but as it is, this is still one you can make a lot of money with.
(((((Please note: all circles and arrows in the chart above are drawn for illustration. Below is a chart showing regular session)))))
This indicator will act similarly to a regular RSI (Relative Strength Index) in that there are Oversold and Overbought levels, but it also adds volatility bands around the RSI to allow for more accurate signals whilst moving the Oversold (OS) and Overbought (OB) levels further apart (fewer false OB/OS signals). As shown in the chart above, it's able to detect some pretty big moves with both speed and accuracy.
Most of you are familiar with and use an RSI indicator, so I will keep this description as brief as possible: The Relative Strength Index (RSI), developed by the legendary J. Welles Wilder, is a momentum oscillator that measures the speed and change of price movements; it oscillates between 0 and 100, with levels set as Overbought and Oversold. These levels are where a trader may look for a reversal; however, they must keep in mind that in an uptrend or bull market, the RSI tends to remain in the 40-90 range, with the 40-50 zone often acting as support. More advanced traders will also look for divergences between the price and the oscillator (i.e., price trending upward while the oscillator trends downward). As far as oscillators go, the RSI is one of the most frequently used, by both advanced and beginner traders alike.
Works great on multiple timeframes. It may not catch every rally, but it will catch most --even on smaller timeframes (i.e. 5 minutes in image below).
As with all of my scripts I like to make them customizable:
You can change the up and down colors on the RSI ribbons and the color and style (dotted shown) of Overbought / Oversold lines. In future versions, I will add more color customizations and additions.
Can toggle 1 or both of the 2 highlight signals off to make it a little more plain.
Lots of ways to make it look the way you'd like it to.
--The alerts include both the super accurate Bullish and Bearish signals shown with the background highlights. They are pre-filled so it will automatically display the price and time that the alert went off for you.
If I missed anything or you have a question, please let me know!
Cheers,
Mike
Please note: I have made this indicator invite only, send me a DM if you're interested in trying it out.
CryptoNite - Machine Learning Strategy (15Min Timeframe)
Greetings Traders! I am back with another ML strategy. :D I kept my word about combining my machine learning algorithms from Python and integrating them into Tradingview. Thanks to Tradingview's new release of Pinescript v5, it is now possible. This strategy respects the Sortino Ratio and was created using 2 years of data for 50 different cryptocurrencies. That is a total of 100 years of data and 44,849 trades used to create this strategy. Now let me tell you, my computer and I are exhausted. We have both been at it non-stop for about two months, every day. I refine the strategy, and the computer runs 24/7 for a few days to spit out the best results into the terminal. It's been a good run, so my computer will finally get some sleep tonight.
So let's talk a little about the features of the strategy. In the settings window, you'll see the Stoploss, Take Profit Parameters, and Date Range. You can change the Date Range, but I recommend leaving the SL/TP parameters as they are because the machine learning algo chose those inputs. If you wish to change them, you are always welcome to do so, but backtest results will change. For the Take Profit parameters, on the left side you'll see something labeled time duration (displayed in minutes) and on the right side you'll see take profit values. Let's talk a little bit about how they work.
TP_values = {
"0": 0.102,
"133": 0.051,
"431": 0.039,
"963": 0
}
In Python, the table looks like this, but it is quite easy to understand in TradingView.
From 0-133 minutes, the strategy is looking to reach target point 1 at 10.2% profit.
From 133-431 minutes, the strategy is looking to reach target point 2 at 5.1% profit.
From 431-963 minutes, the strategy is looking to reach target point 3 at 3.9% profit.
From 963+ minutes, the strategy is looking to break even at 0% profit on target point 4.
Through each target point a sell trigger is active. It will look for the best time to sell even if TP has not been reached.
This helps the trade not stay open too long.
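A small Python helper showing how a time-tiered table like the one above would be read; the function names are hypothetical and this is only a sketch of the described tier logic, not the strategy's code:
TP_VALUES = {0: 0.102, 133: 0.051, 431: 0.039, 963: 0.0}

def take_profit_pct(minutes_open):
    # Pick the tier whose start time has already been passed.
    active_tier = max(t for t in TP_VALUES if minutes_open >= t)
    return TP_VALUES[active_tier]

def take_profit_price(entry_price, minutes_open):
    return entry_price * (1.0 + take_profit_pct(minutes_open))

print(take_profit_pct(200))           # 0.051 -> looking for 5.1% profit
print(take_profit_price(100.0, 500))  # 103.9 -> 3.9% above entry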
The last thing I need to mention is the textbox displayed on the right side of your chart. This textbox displays the current Take Profit value in dollar amount, so when you're in a trade you'll know what TP target has to be reached while the trade is open. Over time, the target price changes depending on how long the trade has been open. If you have any questions, feel free to comment down below, and enjoy this strategy!
Financial Astrology Indexes ML Daily Trend
Daily trend indicator based on financial astrology cycles detected with advanced machine learning techniques for some of the most important market indexes: DJI, UK100, SPX, IBC, IXIC, NI225, BANKNIFTY, NIFTY and the GLD fund (not an index) for Gold predictions. The daily price trend is forecasted through planetary cycles (angular aspects, speed phases, declination zone); fast cycles are based on the Moon, Mercury, Venus and the Sun, and mid-term cycles are based on Mars, Vesta and Ceres. The combination of all these cycles produces a daily price trend prediction that is encoded into a PineScript array using binary format ("0" or "1"), representing sell and buy signals respectively. The indicator provides signals from 2021-01-01 to 2022-12-31; the purpose of the past months' signals is to support backtesting of the indicator combined with other technical indicator entries like MAs, RSI or Stochastic. For future predictions beyond 2022, a machine learning model re-train phase will be required.
When the signal moving average is increasing from 0 to 1, it indicates an increase in buying force; when it is decreasing from 1 to 0, it indicates an increase in selling force; finally, when it is moving sideways around the 0.4-0.6 area, it predicts a period of buy/sell force equilibrium and trader indecision, which results in price congestion within a narrow price range.
We have also published the same indicator for the crypto-currencies research portfolio:
DISCLAIMER: This indicator is experimental and doesn't provide financial or investment advice; its main purpose is to demonstrate the predictive power of financial astrology. Any allocation of funds following the documented machine learning model prediction is a high-risk endeavour, and it's the user's responsibility to practice healthy risk management according to their situation.
Financial Astrology Crypto ML Daily Trend
This daily trend indicator is based on financial astrology cycles detected with advanced machine learning techniques for the crypto-currencies research portfolio: ADA, BAT, BNB, BTC, DASH, EOS, ETC, ETH, LINK, LTC, XLM, XMR, XRP, ZEC and ZRX. The daily price trend is forecasted through these planetary cycles (angular aspects, speed, declination); fast ones are based on the Moon, Mercury, Venus and the Sun, and mid-term cycles are based on Mars, Vesta and Ceres. The combination of all these cycles produces a daily price trend prediction that is encoded into a PineScript array using binary format ("0" or "1"), representing sell and buy signals respectively. The indicator provides signals from 2021-01-01 to 2022-12-31; the purpose of the past months' signals is to support backtesting of the indicator combined with other technical indicator entries like MAs, RSI or Stochastic. For future predictions beyond 2022, a machine learning model re-train phase will be required.
The resolution of this indicator is 1D. You can tune a parameter that determines how many future bars of the daily trend are plotted and adjust an hours shift to bring future signals into the current bar, producing a leading-indicator effect that anticipates trend changes by some hours. Combined with technical analysis indicators, this daily trend is very powerful because it can help produce approximately 60% profitable signals based on the backtesting results. You can look at our open source GitHub repositories to validate accuracy using the backtesting strategies we have implemented in the Jesse Crypto Trading Framework as proof of concept of the predictive potential of this indicator. Alternatively, we have implemented a PineScript strategy that uses this indicator; just note that we still need to update the signals for the period July 2021 to December 2022. This strategy has accumulated more than 110 likes, and many traders have validated the predictive power of Financial Astrology.
DISCLAIMER: This indicator is experimental and doesn't provide financial or investment advice; its main purpose is to demonstrate the predictive power of financial astrology. Any allocation of funds following the documented machine learning model prediction is a high-risk endeavour, and it's the user's responsibility to practice healthy risk management according to their situation.
VWMA with kNN Machine Learning: MFI/ADX
This is an experimental strategy that uses a Volume-Weighted MA (VWMA) crossing together with a Machine Learning kNN filter that uses ADX and MFI to predict whether the signal is useful. k-nearest neighbours (kNN) is one of the simplest Machine Learning classification algorithms: it puts input parameters in a multidimensional space, and then, when a new set of parameters is given, it makes a prediction based on a plurality vote of its k neighbours.
Money Flow Index (MFI) is an oscillator similar to RSI, but with volume taken into account. Average Directional Index (ADX) is an indicator of trend strength. By putting them together on two-dimensional space and checking, whether nearby values have indicated a strong uptrend or downtrend, we hope to filter out bad signals from the MA crossing strategy.
This is an experiment, so any feedback would be appreciated. It was tested on BTC/USDT pair on 5 minute timeframe. I am planning to expand this strategy in the future to include more moving averages and filters.
Morun Astro Trend MAs cross Strategy
Astrology machine learning cycles indicator signals combined with a technical MAs indicators strategy, based on the signals index of the GitHub project github.com
Machine Learning: kNN-based Strategy (update)
kNN-based Strategy (FX and Crypto)
Description:
This update to the popular kNN-based strategy features:
improvements in the business logic,
an adjustable k value for the kNN model,
one more feature (MOM),
a streamlined signal filter and
some other minor fixes.
Now this script works in all timeframes!
I intentionally decided to publish this script separately
in order for the users to see the differences.