Double AI Super Trend Trading - Strategy [PresentTrading]
█ Introduction and How It is Different
The Double AI Super Trend Trading Strategy is a cutting-edge approach that leverages the power of not one, but two AI algorithms, in tandem with the SuperTrend technical indicator. The strategy aims to provide traders with enhanced precision in market entry and exit points. It is designed to adapt to market conditions dynamically, offering the flexibility to trade in both bullish and bearish markets.
*The KNN component is adapted mainly from @Zeiierman.
BTCUSD 8hr performance
ETHUSD 8hr performance
█ Strategy, How It Works: Detailed Explanation
1. SuperTrend Calculation
The SuperTrend is a popular indicator that captures market trends through a combination of the Volume-Weighted Moving Average (VWMA) and the Average True Range (ATR). This strategy utilizes two sets of SuperTrend calculations with varying lengths and factors to capture both short-term and long-term market trends.
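For illustration, here is a minimal Python sketch of a VWMA/ATR-based SuperTrend in the spirit described above. It is a simplified version (the bands are not ratcheted as in the full SuperTrend), and the column names and parameter values are assumptions for the example, not the strategy's actual code.

```python
import pandas as pd

def supertrend(df: pd.DataFrame, length: int = 10, factor: float = 3.0) -> pd.Series:
    """df needs 'high', 'low', 'close', 'volume' columns (assumed names)."""
    vwma = (df["close"] * df["volume"]).rolling(length).sum() / df["volume"].rolling(length).sum()
    tr = pd.concat([
        df["high"] - df["low"],
        (df["high"] - df["close"].shift()).abs(),
        (df["low"] - df["close"].shift()).abs(),
    ], axis=1).max(axis=1)
    atr = tr.rolling(length).mean()            # simple-average ATR for the sketch
    upper, lower = vwma + factor * atr, vwma - factor * atr
    st = pd.Series(index=df.index, dtype=float)
    direction = 1                              # 1 = bullish, -1 = bearish
    for i in range(length, len(df)):
        close = df["close"].iloc[i]
        if direction == 1 and close < lower.iloc[i]:
            direction = -1                     # trend flips bearish
        elif direction == -1 and close > upper.iloc[i]:
            direction = 1                      # trend flips bullish
        st.iloc[i] = lower.iloc[i] if direction == 1 else upper.iloc[i]
    return st                                  # the trailing SuperTrend line
```

Running this twice with different `length`/`factor` pairs yields the short-term and long-term SuperTrend lines the strategy describes.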
2. KNN Algorithm
The strategy employs k-Nearest Neighbors (KNN) algorithms, which are supervised machine learning models. Two sets of KNN algorithms are used, each focused on different lengths of historical data and number of neighbors. The KNN algorithms classify the current SuperTrend data point as bullish or bearish based on the weighted sum of the labels of the k closest historical data points.
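The classification step can be sketched as follows — a minimal, hedged Python example of distance-weighted KNN voting on SuperTrend values. The labels (+1 bullish, -1 bearish) and the absolute-distance metric are assumptions for illustration, not the strategy's exact code.

```python
import numpy as np

def knn_label(current: float, history: np.ndarray, labels: np.ndarray, k: int = 5) -> int:
    dist = np.abs(history - current)            # 1-D distance to each past point
    nearest = np.argsort(dist)[:k]              # indices of the k closest points
    weights = 1.0 / (dist[nearest] + 1e-9)      # closer neighbors count more
    score = np.sum(weights * labels[nearest])   # weighted vote of neighbor labels
    return 1 if score > 0 else -1               # >0 -> bullish, else bearish
```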
3. Signal Generation
Based on the KNN classifications and the SuperTrend indicator, the strategy generates signals for the start of a new trend and the continuation of an existing trend.
4. Trading Logic
The strategy uses these signals to enter long or short positions. It also incorporates dynamic trailing stops for exit conditions.
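As a rough illustration of the exit side, the following hedged sketch shows one common way to implement a dynamic ATR trailing stop for a long position; the multiplier is an assumed value, not the strategy's setting.

```python
def trail_stop_long(closes, atrs, mult=2.0):
    stop = closes[0] - mult * atrs[0]
    for close, atr in zip(closes[1:], atrs[1:]):
        stop = max(stop, close - mult * atr)  # ratchet the stop up only, never down
        if close <= stop:
            return stop                       # exit triggered at this level
    return stop                               # still in the trade; current stop level
```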
█ Trade Direction
The strategy allows traders to specify their trading direction: long, short, or both. This enables the strategy to be versatile and adapt to various market conditions.
█ Usage
ToolTips: Comprehensive tooltips are provided for each parameter to guide the user through the customization process.
Inputs: Traders can customize numerous parameters including the number of neighbors in KNN, ATR multiplier, and types of moving averages.
Plotting: The strategy also provides visual cues on the chart to indicate bullish or bearish trends.
Order Execution: Based on the generated signals, the strategy will execute buy or sell orders automatically.
█ Default Settings
The default settings are configured to offer a balanced approach suitable for most scenarios:
Initial Capital: $10,000
Default Quantity Type: 10% of equity
Commission: 0.1%
Slippage: 1
Currency: USD
These settings can be modified to suit various trading styles and asset classes.
Fibonacci Structure & Trend Channel (Expo)
█ Overview
The Fibonacci Structure & Trend Channel (Expo) is designed to identify trend direction and potential reversal levels and offer insights into price structure based on Fibonacci ratios. The algorithm plots a Fibonacci channel, making it easier for traders to identify potential retracement points. Additionally, the Fibonacci market structure is plotted to enhance traders' understanding of the underlying order flow.
The Fibonacci Structure & Trend Channel (Expo) is designed to identify trend direction and potential reversal levels and offer insights into price structure based on Fibonacci ratios. The algorithm plots a Fibonacci channel, making it easier for traders to identify potential retracement points. Additionally, the Fibonacci market structure is plotted to enhance traders' understanding of the underlying order flow.
█ How to Use
Identify Trends
Use the plotted Fibonacci Trend Line to identify the direction of the market trend. A green line typically signifies a bullish trend, while a red line signifies a bearish trend.
Retracement Levels
The plotted Fibonacci levels can act as potential support or resistance levels. Look for price action signs at these levels for entry or exit points.
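For reference, the sketch below shows how standard Fibonacci retracement levels are derived from a swing low/high; the ratios shown are the conventional defaults, and the actual levels used by the indicator are configurable in the settings.

```python
def fib_levels(swing_low: float, swing_high: float):
    ratios = [0.236, 0.382, 0.5, 0.618, 0.786]
    span = swing_high - swing_low
    # each retracement level is measured down from the swing high
    return {r: swing_high - r * span for r in ratios}

print(fib_levels(100.0, 150.0))  # e.g. the 0.618 level comes out at 119.1
```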
Channel Trading
If you enable the Fibonacci channel, the upper and lower bounds can act as overbought or oversold levels.
Market Structure
The plotted Fibonacci market structure serves as a valuable tool for dissecting the underlying order flow and gauging the strength or weakness of a trend. By analyzing these structures, traders can identify key levels where supply and demand intersect, which often act as pivotal points for trend reversals or accelerations. This visual representation simplifies complex market dynamics. Whether you're looking to catch a new trend early or seeking confirmation for a potential reversal, understanding the market structure plotted by the Fibonacci ratios can provide actionable insights for various trading strategies.
Use the Table
The information table can provide quick insights into the current trend and when it started.
█ Settings
The Fibonacci settings allow traders to specify the Fibonacci retracement levels that will be used to calculate the trend and its channel.
The Fibonacci Structure Trend Channel structure settings enable traders to fine-tune how the indicator identifies and plots the underlying price structure.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Filtered Volume Profile [ChartPrime]
The "Filtered Volume Profile" is a powerful technical analysis tool that offers insight into market activity. It uses a fixed-range volume profile to provide a histogram representing how much volume occurred at each distinct price level.
Profile in action with various significant levels displayed
How to Use
The script is designed to analyze cumulative trading volumes in different price bins over a certain period, also known as `'lookback'`. This lookback period can be defined by the user and it represents the number of bars to look back for calculating levels of support and resistance.
The `'Smoothing'` input determines the degree to which the output is smoothed. Higher values lead to smoother results but may impede the responsiveness of the indicator to rapid changes in volatility.
The `'Peak Sensitivity'` input is used to adjust the sensitivity of the script's peak detection algorithm. Setting this to a lower value makes the algorithm more sensitive to local changes in trading volume and may result in "noisier" outputs.
The `'Peak Threshold'` input specifies the number of bins that the peak detection mechanism should account for. Larger numbers imply that more volume bins are taken into account, and the resultant peaks are based on wider intervals.
The `'Mean Score Length'` input is used for scaling the mean score range. This is particularly important in defining the length of lookback bars that will be used to calculate the average close price.
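Putting the inputs together, the core binning step might look like the following hedged Python sketch: each bar's volume is accumulated into the price bin its close falls in, over the lookback window. The bin count and function names are illustrative assumptions, not the script's code.

```python
import numpy as np

def volume_profile(closes: np.ndarray, volumes: np.ndarray, lookback: int, bins: int = 50):
    c, v = closes[-lookback:], volumes[-lookback:]
    edges = np.linspace(c.min(), c.max(), bins + 1)
    idx = np.clip(np.digitize(c, edges) - 1, 0, bins - 1)  # bin index per bar
    hist = np.zeros(bins)
    np.add.at(hist, idx, v)                                # sum volume per bin
    return edges, hist
```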
Sinc Filter
The application of the sinc filter to the Filtered Volume Profile reduces the risk of visual artefacts that could misrepresent the underlying market behavior. The sinc filter is a sharp, high-quality low-pass filter that, as applied here, does not manifest ringing effects, making it an optimal choice for this kind of volume profiling.
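As a rough illustration, a generic windowed-sinc smoother can be sketched as below. This is a textbook low-pass sinc convolution, not ChartPrime's exact implementation; the cutoff and kernel width are assumptions, and the Blackman window is one common choice for suppressing ringing.

```python
import numpy as np

def sinc_smooth(x: np.ndarray, cutoff: float = 0.1, width: int = 21) -> np.ndarray:
    n = np.arange(width) - (width - 1) / 2
    kernel = np.sinc(2 * cutoff * n) * np.blackman(width)  # windowed-sinc low-pass
    kernel /= kernel.sum()                                 # unity gain at DC
    return np.convolve(x, kernel, mode="same")             # smoothed series
```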
Histogram
On the histogram, the volume profile is colored based on the balance of bullish to bearish volume. If a particular bar is more intense in color, it represents a larger than usual volume during a single price bar. This is a clear signal of a strong buying or selling pressure at a particular price level.
Threshold for Peaks
The `peak_thresh` input determines the number of bins the algorithm takes in account for the peak detection feature. The 'peak' represents the level where a significant amount of volume trading has occurred, and usually is of interest as an indicative of support or resistance level.
By increasing the `peak_thresh`, you're raising the bar for what the algorithm perceives as a peak. This could result in fewer, but more significant peaks being identified.
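A hedged sketch of such threshold-based peak detection: a bin counts as a peak if it is the local maximum within ±`peak_thresh` bins and clears a sensitivity floor. The exact parameter semantics in the script may differ.

```python
import numpy as np

def find_peaks(hist: np.ndarray, peak_thresh: int = 3, sensitivity: float = 0.2):
    floor = sensitivity * hist.max()           # ignore bins below this volume level
    peaks = []
    for i in range(len(hist)):
        lo, hi = max(0, i - peak_thresh), min(len(hist), i + peak_thresh + 1)
        if hist[i] >= floor and hist[i] == hist[lo:hi].max():
            peaks.append(i)                    # local max over the window -> peak
    return peaks                               # indices of significant volume peaks
```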
History of Volume Profiles and Evolution into Sinc Filtering
Volume profiling has a rich history in market analysis, dating back to the early 20th century, when Richard D. Wyckoff, a legendary trader, pioneered the study of volume. He understood the critical significance of volume and its relationship with market price movement. The core of Wyckoff's technical analysis suite was the relationship between price and volume, often termed "Effort vs. Result".
In the early 20th century, the mathematician J. R. Carson made key contributions to signal theory that later underpinned the application of sinc filtering to time series data. Following these contributions, market researchers continued to create and integrate more advanced statistical measures into market analysis.
This culminated in the 1980s with J. Peter Steidlmayer's introduction of Market Profile. He suggested that markets are a function of a continuous two-way auction process, introducing the idea of viewing markets as a price/time continuum and as price distributions. Steidlmayer's Market Profile was the first wide-scale framework for organizing volume and price data.
However, despite the introduction of such features, challenges in the analysis persisted, especially due to noise that could misinform trading decisions. This gap has given rise to the need for smoothing functions to help eliminate the noise and better interpret the data. Among such techniques, the sinc filter has become widely recognized within the trading community.
The sinc filter, because it constructs a smooth curve that passes precisely through the data points while eliminating high-frequency noise, has been considered a natural step in the evolution of volume profile strategies. Its superior ability to reduce noise and guard against over-fitting makes it an ideal choice for smoothing in trading scripts, particularly where volume profiling forms the crux of the market analysis strategy, as in Filtered Volume Profile.
Moving ahead, the use of volume-based studies seems likely to remain a core part of technical analysis. As long as markets operate based on supply and demand principles, understanding volume will remain key to discerning the intent behind price movements. And with the incorporation of advanced methods like sinc filtering, the accuracy and insight provided by these methodologies will only improve.
Mean Score
The mean score in the Filtered Volume Profile script plays an important role in probabilistic inferences regarding future price direction. This score essentially characterizes the statistical likelihood of price trends based on historical data.
The mean score is calculated over a configurable `'Mean Score Length'`. This variable sets the window or the timeframe for calculation of the mean score of the closing prices.
Statistically, this score takes advantage of the concept of z-scores and probabilities associated with the t-distribution (a type of probability distribution that is symmetric and bell-shaped, just like the standard normal distribution, but has heavier tails).
The z-score represents how many standard deviations an element is from the mean. In this case, the "element" is the price level (Point of Control).
The mean score section of the script calculates standard errors for the root mean squared error (RMSE) and addresses the uncertainty in the prediction of the future value of a random variable.
The RMSE of a model prediction concerning observed values is used to measure the differences between values predicted by a model and the values observed.
The lower the RMSE, the better the model is able to predict. A zero RMSE means a perfect fit to the data. In essence, it's a measure of how concentrated the data is around the line of best fit.
Through the mean score, the script effectively predicts the likelihood of the future close price being above or below our identified price level.
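A minimal, hedged sketch of this idea: z-score the POC against recent closes and map the score to a probability with the t-distribution CDF. The window and the exact statistic the script uses are assumptions here, not the script's actual code.

```python
import numpy as np
from scipy import stats

def mean_score(closes: np.ndarray, poc: float, length: int = 100) -> float:
    window = closes[-length:]
    se = window.std(ddof=1) / np.sqrt(length)  # standard error of the mean
    z = (window.mean() - poc) / se             # how far the recent mean sits from POC
    return stats.t.cdf(z, df=length - 1)       # rough P(future close above the POC)
```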
Summary
Filtered Volume Profile is a comprehensive trading view indicator which utilizes volume profiling, peak detection, mean score computations, and sinc-filter smoothing, altogether providing the finer details of market behavior.
It offers a customizable look back period, smoothing options, and peak sensitivity setting along with a uniquely set peak threshold. The application of the Sinc Filter ensures a high level of accuracy and noise reduction in volume profiling, making this script a reliable tool for gaining market insights.
Furthermore, the use of mean score calculations provides probabilistic insights into price movements, thus providing traders with a statistically sound foundation for their trading decisions. As trading markets advance, the use of such methodologies plays a pivotal role in formulating effective trading strategies and the Filtered Volume Profile is a successful embodiment of such advancements in the field of market analysis.
RibboNN Machine Learning [ChartPrime]
The RibboNN ML indicator is a powerful tool designed to predict the direction of the market and display it through a ribbon-like visual representation, with colors changing based on the prediction outcome from a conditional class. The primary focus of this indicator is to assist traders in trend following trading strategies.
The RibboNN ML in action
Prediction Process:
Conditional Class: The indicator's predictive model relies on a conditional class, which combines information from both the long condition (longcon) and the short condition. These conditions are determined using specific rules and criteria, taking into account various market factors and indicators.
Direction Prediction: The conditional class provides the basis for predicting the direction of the market move. When the prediction value is greater than 0, it indicates an upward trend, while a value less than 0 suggests a downward trend.
Nearest Neighbor (NN): To attempt to enhance the accuracy of predictions, the RibboNN ML indicator incorporates a Nearest Neighbor algorithm. This algorithm analyzes historical data from the Ribbon ML's predictive model (RMF) and identifies patterns that closely resemble the current conditional prediction class, thereby offering more robust trend forecasts.
Ribbon Visualization:
The Ribbon ML indicator visually represents its predictions through a ribbon-like display. The ribbon changes colors based on the direction predicted by the conditional class. An upward trend is represented by a green color, while a downward trend is depicted by a red color, allowing traders to quickly identify potential market directions.
The introduction of the Nearest Neighbor algorithm provides the Ribbon ML indicator with unique and adaptive behaviors. By dynamically analyzing historical patterns and incorporating them into predictions, the indicator can adapt to changing market conditions and offer more reliable signals for trend following trading strategies.
Manipulation of the NN Settings:
Smaller Value of "Neighbours Count":
When the value of "Neighbours Count" is small, the algorithm considers only a few nearest neighbors for making predictions.
A smaller value of "Neighbours Count" leads to more flexible decision boundaries, which can result in a more granular and sensitive model.
However, using a very small value might lead to overfitting, especially if the training data contains noise or outliers.
Larger Value of "Neighbours Count":
When the value of "Neighbours Count" is large, the algorithm considers a larger number of nearest neighbors for making predictions.
A larger value of "Neighbours Count" leads to smoother decision boundaries and helps capture the global patterns in the data.
However, setting a very large value might result in a loss of local patterns and make the model less sensitive to changes in the data.
Price & Volume Profile (Expo)
█ Overview
The Price & Volume Profile provides a holistic perspective on market dynamics by simultaneously tracking price action and trading volume across a range of price levels. So it is not only a volume-based indicator but also a price-based one. In addition to illustrating volume distribution, it quantifies how frequently the price has fallen within a particular range, thus offering a holistic perspective on market dynamics.
This indicator takes a unique and comprehensive approach to market analysis by considering both price action and trading volume, two crucial dimensions of market activity. Its distinctive methodology offers several advantages:
Holistic Market View: By simultaneously tracking the frequency of specific price ranges (Price Profile) and the volume traded at those ranges (Volume Profile), this indicator provides a more complete picture of market behavior. It shows not only where the market is trading but also how much it's trading, reflecting both price acceptance levels and market participation intensity.
Point of Control (POC): The POC, as highlighted by this indicator, serves as a significant reference point for traders. It identifies the price level with the highest trading activity, thus indicating a strong consensus among market participants about the asset's fair value. Observing how price interacts with the POC can offer valuable insights into market sentiment and potential trend reversals.
Support and Resistance Levels: Price levels with high trading activity often act as support or resistance in future price movements. The indicator visually represents these levels, enabling traders to anticipate potential price reactions.
Price Profile
Price and Volume Profile
█ Calculations
The algorithm analyzes both trade frequency and volume across different price levels. It identifies these levels within the visible chart range, then examines each bar to determine if the selected price falls within these levels. If so, it increases a counter and adds the trading volume. This process repeats across the visible range and is visualized as a horizontal histogram, each bar representing a price level and the bar length reflecting trade frequency and volume. Additionally, it calculates the Point of Control (POC), signifying the price level with the highest activity.
In summary: The histogram presents a dual perspective - not only the traded volume at each price level but also the frequency of the price hitting each range. The longer the bar, the more times the price has frequented that specific range, revealing key insights into price behavior and acceptance levels. These frequently visited areas often emerge as strong support or resistance zones, helping traders navigate market movements.
Please note that the indicator adjusts to the visible price range, making it adaptable to changing market conditions. This dynamic analysis can provide more relevant and timely information than static indicators.
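As a hedged illustration of this dual calculation, the sketch below counts bar frequency and sums volume per price row, then takes the busiest row as the POC. The row count is an assumed parameter, and the function is a generic reconstruction of the described logic, not the script's code.

```python
import numpy as np

def price_volume_profile(src: np.ndarray, volume: np.ndarray, rows: int = 25):
    edges = np.linspace(src.min(), src.max(), rows + 1)
    idx = np.clip(np.digitize(src, edges) - 1, 0, rows - 1)
    counts = np.bincount(idx, minlength=rows)                  # frequency per row (Price Profile)
    vols = np.bincount(idx, weights=volume, minlength=rows)    # volume per row (Volume Profile)
    poc_row = int(np.argmax(vols))                             # highest-activity row
    return edges, counts, vols, poc_row
```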
█ How to use
This indicator is beneficial for traders as it offers insights into the distribution of trading activity across different price levels. It helps identify key areas of support and resistance and gives a visual representation of market sentiment and liquidity.
The point of control (POC), which is the price level with the highest traded volume or frequency count, becomes even more crucial in this context. It marks the price at which the most trading activity occurred, signaling a strong consensus among market participants about the asset's fair value. If the market price deviates significantly from the POC, it could suggest an overbought or oversold condition, potentially leading to a price reversion.
Fair Price Areas/gaps are specific price levels or zones where an asset has spent limited time in the past. These areas are considered interesting or significant because they may have an impact on future price action.
Similar to the concept of fair value gaps, which refers to discrepancies between an asset's market price and its estimated intrinsic value, Fair Price Areas/gaps focus on price levels that have been relatively underutilized in terms of trading activity. When an asset's price reaches a Fair Price Area/gap, traders and investors pay attention because they expect the price to react in some way. The rationale behind this concept is that price tends to gravitate towards areas where it has spent less time in the past, as the market perceives them as significant levels.
█ Settings
The indicator is customizable, allowing users to define the number of price levels (rows), the offset, the data source, and whether to display volume or frequency count. It also adjusts dynamically to the visible price range on the chart, ensuring that the analysis remains relevant and timely with changing market conditions.
Source: The price to use for the calculation. Typically, this is the closing price. By considering the user-selected Source (typically the closing price), the indicator determines the frequency with which the price lands within each designated price level (row) over the selected period. In essence, the indicator provides a count of bars where the Source price falls within each range, essentially creating a "Price Profile."
Row Size: The number of price levels (rows) to divide the visible price range into.
Display: Choose whether to display the number of bars ("Counter") or the total volume ("Volume") for each price level.
Offset: The distance of the histogram from the price chart.
Point of Control (POC): If enabled, the indicator will highlight the price level with the most activity.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Volume Orderbook (Expo)
█ Overview
The Volume Orderbook indicator is a volume analysis tool that visually resembles an order book. It's used for displaying trading volume data in a way that may be easier to interpret or more intuitive for certain traders, especially those familiar with order book analysis.
This indicator aggregates and displays the total trading volume at different price levels over the entire range of data available on the chart, similar to how an order book displays current buy and sell orders at different price levels. However, unlike a real-time order book, it only considers historical trading data, not current bid and ask orders. This provides a 'historical order book' of sorts, indicating where most trading activities have taken place.
Summary
This is a volume-based indicator that shows the volume traded at specific price levels, highlighting areas of high and low activity.
█ Calculations
The algorithm operates by calculating the cumulative volume traded in each specific price zone within the range of data displayed on the chart. The length of each horizontal bar corresponds to the total volume of trades that occurred within that particular price zone.
In essence, when the price falls within a specific zone, that bar's volume is added to the zone's total. A thicker bar corresponds to a larger price zone, meaning more volume accumulates within it. The size of each bar therefore visually indicates the amount of trading activity that took place within the associated price zone.
█ How to use
The Volume Orderbook indicator serves as a beneficial tool for traders by identifying key price levels with a significant amount of trading activity. These high-volume areas could represent potential support or resistance levels due to the large number of orders situated there. The indicator's ability to spotlight these zones might be particularly advantageous in pinpointing breakouts or breakdowns when prices move beyond these high-volume regions. Moreover, the indicator could also assist traders in recognizing anomalies, such as when an unusually large volume of trades occurs at unconventional price levels.
Identify Key Price Levels: The indicator highlights high-volume areas where a significant number of trades have occurred, which could act as potential support or resistance levels. This is based on the notion that many traders have established positions at these prices, so these levels may serve as significant areas for market activity in the future.
Volume Nodes: These are the peaks (high-volume areas) and troughs (low-volume areas) seen on the indicator. High-volume nodes represent price levels at which a large amount of volume has been traded, typically areas of strong support or resistance. Conversely, low-volume nodes, where very little volume has been traded, indicate price levels that traders have shown little interest in the past and could potentially act as barriers to price. It's important to note that while high trading volume can imply significant market interest, it doesn't always mean the price will stop or reverse at these levels. Sometimes, prices can quickly move through high-volume areas if there are no current orders (demand) to match with the new orders (supply).
Analyze Market Psychology: The distribution of volume across different price levels can provide insights into the market's psychology, revealing the balance of power between buyers and sellers.
Highlight Potential Reversal Points: The indicator can help identify price levels with high traded volume where the market might be more likely to reverse since these levels have previously attracted significant interest from traders.
Validate Breakouts or Breakdowns: If the price moves convincingly past a high-volume node, it could indicate a strong trend, suggesting a potential breakout or breakdown. Conversely, if the price struggles to move past a high-volume node, it could suggest that the trend is weak and might potentially reverse.
Trade Reversals: High-volume areas could also indicate potential turning points in the market. If the price reaches these levels and then starts to move away, it might suggest a possible price reversal.
Confirm Other Signals: As with all technical indicators, the "Volume Orderbook" should ideally be used in conjunction with other forms of technical and fundamental analysis to confirm signals and increase the odds of successful trades.
Summary
The Volume Orderbook indicator allows traders to identify key price levels, analyze market psychology, highlight potential reversal points, validate breakouts or breakdowns, confirm other trading signals, and anticipate possible trade reversals, thereby serving as a robust tool for trading analysis.
█ Settings
Source: The user can select the source, the default of which is "close." This implies that volume is added to the volume order book when the closing price falls within a specific zone. Users can modify this to any indicator present on their chart. For example, if it's set to an SMA (Simple Moving Average) of 20, the volume will be added to the volume order book when the SMA 20 falls within the specific zone.
Rows and width: These settings allow users to adjust the representation of volume order book zones. "ROWS" pertains to the number of volume order book zones displayed, while "WIDTH" refers to the breadth of each zone.
Table and Grid: These settings allow traders to customize the Volume order-book's position and appearance. By adjusting the "left" parameter, users can shift the position of the Volume order book on the chart; a higher value pushes the order book further to the right. Additionally, users can enable "Table Border" and "Table Grid" options to add gridlines or borders to the Volume order book for easier viewing and interpretation.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Pattern Forecast (Expo)
█ Overview
The Pattern Forecast indicator is a technical analysis tool that scans historical price data to identify common chart patterns and then analyzes the price movements that followed these patterns. It takes this information and projects it into the future to provide traders with potential price actions that may occur if the same pattern is identified in real-time market data. This projection helps traders to understand the possible outcomes based on the previous occurrences of the pattern, thereby offering a clearer perspective of the market scenario. By analyzing the historical data and understanding the subsequent price movements following the appearance of a specific pattern, the indicator can provide valuable insights into potential future market behavior.
█ Calculations
The indicator works by scanning historical price data for various candlestick patterns. It includes all of TradingView's built-in patterns (credit to TradingView, which coded them).
Essentially, the indicator takes the historical price moves that followed the pattern to forecast what might happen next.
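A hedged sketch of that projection logic: collect the price paths that followed each historical occurrence of the pattern, normalize them, and average them into one forecast path. The pattern detector itself is assumed to be supplied externally (e.g., TradingView's built-in rules re-implemented separately); everything here is illustrative, not the script's code.

```python
import numpy as np

def forecast_from_pattern(closes: np.ndarray, pattern_hits: np.ndarray, horizon: int = 10):
    paths = []
    for i in np.flatnonzero(pattern_hits):
        if i + horizon < len(closes):
            # normalize each historical outcome to the bar where the pattern fired
            paths.append(closes[i + 1 : i + 1 + horizon] / closes[i])
    if not paths:
        return None
    return closes[-1] * np.mean(paths, axis=0)  # projected path from the last close
```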
█ Example
In this example, the algorithm is set to search for the Inverted Hammer Bullish candlestick pattern. If the pattern is found, the historical outcome is then projected into the future. This helps traders to understand how the past pattern evolved over time.
█ How to use
Providing traders with a comprehensive understanding of historical patterns and their implications for future price action allows them to assess the likelihood of specific market scenarios objectively. For example, suppose the pattern forecast indicator suggests that a particular pattern is likely to lead to a bullish move in the market. A trader might consider going long if the same pattern is identified in the real-time market. Similarly, a trader might consider shorting the asset if the indicator suggests a bearish move is likely, if the same pattern is identified in the real-time market.
█ Settings
Pattern
Select the pattern that the indicator should scan for. All inbuilt TradingView patterns can be selected.
Forecast Candles
Number of candles to project into the future.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Quinn-Fernandes Fourier Transform of Filtered Price [Loxx]
Down the Rabbit Hole We Go: A Deep Dive into the Mysteries of Quinn-Fernandes Fast Fourier Transform and Hodrick-Prescott Filtering
In the ever-evolving landscape of financial markets, the ability to accurately identify and exploit underlying market patterns is of paramount importance. As market participants continuously search for innovative tools to gain an edge in their trading and investment strategies, advanced mathematical techniques, such as the Quinn-Fernandes Fourier Transform and the Hodrick-Prescott Filter, have emerged as powerful analytical tools. This comprehensive analysis aims to delve into the rich history and theoretical foundations of these techniques, exploring their applications in financial time series analysis, particularly in the context of a sophisticated trading indicator. Furthermore, we will critically assess the limitations and challenges associated with these transformative tools, while offering practical insights and recommendations for overcoming these hurdles to maximize their potential in the financial domain.
Our investigation will begin with a comprehensive examination of the origins and development of both the Quinn-Fernandes Fourier Transform and the Hodrick-Prescott Filter. We will trace their roots from classical Fourier analysis and time series smoothing to their modern-day adaptive iterations. We will elucidate the key concepts and mathematical underpinnings of these techniques and demonstrate how they are synergistically used in the context of the trading indicator under study.
As we progress, we will carefully consider the potential drawbacks and challenges associated with using the Quinn-Fernandes Fourier Transform and the Hodrick-Prescott Filter as integral components of a trading indicator. By providing a critical evaluation of their computational complexity, sensitivity to input parameters, assumptions about data stationarity, performance in noisy environments, and their nature as lagging indicators, we aim to offer a balanced and comprehensive understanding of these powerful analytical tools.
In conclusion, this in-depth analysis of the Quinn-Fernandes Fourier Transform and the Hodrick-Prescott Filter aims to provide a solid foundation for financial market participants seeking to harness the potential of these advanced techniques in their trading and investment strategies. By shedding light on their history, applications, and limitations, we hope to equip traders and investors with the knowledge and insights necessary to make informed decisions and, ultimately, achieve greater success in the highly competitive world of finance.
█ Fourier Transform and Hodrick-Prescott Filter in Financial Time Series Analysis
Financial time series analysis plays a crucial role in making informed decisions about investments and trading strategies. Among the various methods used in this domain, the Fourier Transform and the Hodrick-Prescott (HP) Filter have emerged as powerful techniques for processing and analyzing financial data. This section aims to provide a comprehensive understanding of these two methodologies, their significance in financial time series analysis, and their combined application to enhance trading strategies.
█ The Quinn-Fernandes Fourier Transform: History, Applications, and Use in Financial Time Series Analysis
The Quinn-Fernandes Fourier Transform is an advanced spectral estimation technique developed by B. G. Quinn and J. M. Fernandes in the early 1990s. It builds upon the classical Fourier Transform by introducing an adaptive approach that improves the identification of dominant frequencies in noisy signals. This section will explore the history of the Quinn-Fernandes Fourier Transform, its applications in various domains, and its specific use in financial time series analysis.
History of the Quinn-Fernandes Fourier Transform
The Quinn-Fernandes approach grew out of Quinn and Fernandes's 1991 Biometrika paper, "A fast efficient technique for the estimation of frequency," in which they developed an adaptive spectral estimation algorithm to address the limitations of the classical Fourier Transform when analyzing noisy signals.
The classical Fourier Transform is a powerful mathematical tool that decomposes a function or a time series into a sum of sinusoids, making it easier to identify underlying patterns and trends. However, its performance can be negatively impacted by noise and missing data points, leading to inaccurate frequency identification.
Quinn and Fernandes sought to address these issues by developing an adaptive algorithm that could more accurately identify the dominant frequencies in a noisy signal, even when data points were missing. This adaptive algorithm, now known as the Quinn-Fernandes Fourier Transform, employs an iterative approach to refine the frequency estimates, ultimately resulting in improved spectral estimation.
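To make the flavor of such an approach concrete, here is a loudly hypothetical Python sketch of iterative harmonic extraction: find the dominant periodogram frequency, fit that sinusoid by least squares, subtract it, and repeat on the residual. This illustrates the general idea of iteratively refined spectral estimation only — it is NOT the exact Quinn-Fernandes update rule.

```python
import numpy as np

def extract_harmonics(x: np.ndarray, n_harmonics: int = 3) -> np.ndarray:
    residual = x - x.mean()
    t = np.arange(len(x))
    model = np.full(len(x), x.mean())
    for _ in range(n_harmonics):
        spec = np.abs(np.fft.rfft(residual))
        freq = np.fft.rfftfreq(len(x))[1:][np.argmax(spec[1:])]  # skip the DC bin
        basis = np.column_stack([np.cos(2 * np.pi * freq * t),
                                 np.sin(2 * np.pi * freq * t)])
        coef, *_ = np.linalg.lstsq(basis, residual, rcond=None)  # amplitude/phase fit
        fitted = basis @ coef
        model += fitted                                          # add this harmonic
        residual -= fitted                                       # and remove it from residual
    return model
```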
Applications of the Quinn-Fernandes Fourier Transform
The Quinn-Fernandes Fourier Transform has found applications in various fields, including signal processing, telecommunications, geophysics, and biomedical engineering. Its ability to accurately identify dominant frequencies in noisy signals makes it a valuable tool for analyzing and interpreting data in these domains.
For example, in telecommunications, the Quinn-Fernandes Fourier Transform can be used to analyze the performance of communication systems and identify interference patterns. In geophysics, it can help detect and analyze seismic signals and vibrations, leading to improved understanding of geological processes. In biomedical engineering, the technique can be employed to analyze physiological signals, such as electrocardiograms, leading to more accurate diagnoses and better patient care.
Use of the Quinn-Fernandes Fourier Transform in Financial Time Series Analysis
In financial time series analysis, the Quinn-Fernandes Fourier Transform can be a powerful tool for isolating the dominant cycles and frequencies in asset price data. By more accurately identifying these critical cycles, traders can better understand the underlying dynamics of financial markets and develop more effective trading strategies.
The Quinn-Fernandes Fourier Transform is used in conjunction with the Hodrick-Prescott Filter, a technique that separates the underlying trend from the cyclical component in a time series. By first applying the Hodrick-Prescott Filter to the financial data, short-term fluctuations and noise are removed, resulting in a smoothed representation of the underlying trend. This smoothed data is then subjected to the Quinn-Fernandes Fourier Transform, allowing for more accurate identification of the dominant cycles and frequencies in the asset price data.
By employing the Quinn-Fernandes Fourier Transform in this manner, traders can gain a deeper understanding of the underlying dynamics of financial time series and develop more effective trading strategies. The enhanced knowledge of market cycles and frequencies can lead to improved risk management and ultimately, better investment performance.
The Quinn-Fernandes Fourier Transform is an advanced spectral estimation technique that has proven valuable in various domains, including financial time series analysis. Its adaptive approach to frequency identification addresses the limitations of the classical Fourier Transform when analyzing noisy signals, leading to more accurate and reliable analysis. By employing the Quinn-Fernandes Fourier Transform in financial time series analysis, traders can gain a deeper understanding of the underlying financial instrument.
Drawbacks to the Quinn-Fernandes algorithm
While the Quinn-Fernandes Fourier Transform is an effective tool for identifying dominant cycles and frequencies in financial time series, it is not without its drawbacks. Some of the limitations and challenges associated with this indicator include:
1. Computational complexity: The adaptive nature of the Quinn-Fernandes Fourier Transform requires iterative calculations, which can lead to increased computational complexity. This can be particularly challenging when analyzing large datasets or when the indicator is used in real-time trading environments.
2. Sensitivity to input parameters: The performance of the Quinn-Fernandes Fourier Transform is dependent on the choice of input parameters, such as the number of harmonic periods, frequency tolerance, and Hodrick-Prescott filter settings. Choosing inappropriate parameter values can lead to inaccurate frequency identification or reduced performance. Finding the optimal parameter settings can be challenging, and may require trial and error or a more sophisticated optimization process.
3. Assumption of stationary data: The Quinn-Fernandes Fourier Transform assumes that the underlying data is stationary, meaning that its statistical properties do not change over time. However, financial time series data is often non-stationary, with changing trends and volatility. This can limit the effectiveness of the indicator and may require additional preprocessing steps, such as detrending or differencing, to ensure the data meets the assumptions of the algorithm.
4. Limitations in noisy environments: Although the Quinn-Fernandes Fourier Transform is designed to handle noisy signals, its performance may still be negatively impacted by significant noise levels. In such cases, the identification of dominant frequencies may become less reliable, leading to suboptimal trading signals or strategies.
5. Lagging indicator: As with many technical analysis tools, the Quinn-Fernandes Fourier Transform is a lagging indicator, meaning that it is based on past data. While it can provide valuable insights into historical market dynamics, its ability to predict future price movements may be limited. This can result in false signals or late entries and exits, potentially reducing the effectiveness of trading strategies based on this indicator.
Despite these drawbacks, the Quinn-Fernandes Fourier Transform remains a valuable tool for financial time series analysis when used appropriately. By being aware of its limitations and adjusting input parameters or preprocessing steps as needed, traders can still benefit from its ability to identify dominant cycles and frequencies in financial data, and use this information to inform their trading strategies.
█ Deep-dive into the Hodrick-Prescott Fitler
The Hodrick-Prescott (HP) filter is a statistical tool used in economics and finance to separate a time series into two components: a trend component and a cyclical component. It is a powerful tool for identifying long-term trends in economic and financial data and is widely used by economists, central banks, and financial institutions around the world.
The HP filter was developed by economists Robert Hodrick and Edward Prescott; it was first circulated as a working paper in the early 1980s and formally published in 1997. It is a simple, one-parameter filter that separates a time series into a trend component and a cyclical component. The trend component represents the long-term behavior of the data, while the cyclical component captures the shorter-term fluctuations around the trend.
The HP filter chooses the trend τ to minimize the following objective function:
Minimize: Σ (y_t − τ_t)² + λ Σ [(τ_{t+1} − τ_t) − (τ_t − τ_{t−1})]²
Where:
1. The first term penalizes deviations of the data y_t from the trend τ_t.
2. The second term penalizes the roughness of the trend through its second differences.
3. λ is a smoothing parameter that determines the degree of smoothness of the trend.
The smoothing parameter λ is chosen according to the frequency of the data — common choices are 100 for annual data and 1600 for quarterly data. Higher values of λ produce a smoother trend, while lower values allow a more volatile trend.
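For concreteness, a minimal sketch of the HP filter's closed-form solution: the trend τ solves (I + λD'D)τ = y, where D is the second-difference matrix. This is a generic textbook implementation, not the indicator's Pine code.

```python
import numpy as np

def hp_filter(y: np.ndarray, lam: float = 1600.0):
    n = len(y)
    eye = np.eye(n)
    D = np.diff(eye, n=2, axis=0)                    # (n-2) x n second-difference matrix
    trend = np.linalg.solve(eye + lam * D.T @ D, y)  # first-order condition of the objective
    return trend, y - trend                          # (trend, cyclical component)
```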
The HP filter has several advantages over other smoothing techniques. It is a non-parametric method, meaning that it does not make any assumptions about the underlying distribution of the data. It also allows for easy comparison of trends across different time series and can be used with data of any frequency.
Another significant advantage of the HP Filter is its ability to adapt to changes in the underlying trend. This feature makes it particularly well-suited for analyzing financial time series, which often exhibit non-stationary behavior. By employing the HP Filter to smooth financial data, traders can more accurately identify and analyze the long-term trends that drive asset prices, ultimately leading to better-informed investment decisions.
However, the HP filter also has some limitations. It assumes that the trend is a smooth function, which may not be the case in some situations. It can also be sensitive to changes in the smoothing parameter λ, which may result in different trends for the same data. Additionally, the filter may produce unrealistic trends for very short time series.
Despite these limitations, the HP filter remains a valuable tool for analyzing economic and financial data. It is widely used by central banks and financial institutions to monitor long-term trends in the economy, and it can be used to identify turning points in the business cycle. The filter can also be used to analyze asset prices, exchange rates, and other financial variables.
The Hodrick-Prescott filter is a powerful tool for analyzing economic and financial data. It separates a time series into a trend component and a cyclical component, allowing for easy identification of long-term trends and turning points in the business cycle. While it has some limitations, it remains a valuable tool for economists, central banks, and financial institutions around the world.
█ Combined Application of Fourier Transform and Hodrick-Prescott Filter
The integration of the Fourier Transform and the Hodrick-Prescott Filter in financial time series analysis can offer several benefits. By first applying the HP Filter to the financial data, traders can remove short-term fluctuations and noise, effectively isolating the underlying trend. This smoothed data can then be subjected to the Fourier Transform, allowing for the identification of dominant cycles and frequencies with greater precision.
By combining these two powerful techniques, traders can gain a more comprehensive understanding of the underlying dynamics of financial time series. This enhanced knowledge can lead to the development of more effective trading strategies, better risk management, and ultimately, improved investment performance.
The Fourier Transform and the Hodrick-Prescott Filter are powerful tools for financial time series analysis. Each technique offers unique benefits, with the Fourier Transform being adept at identifying dominant cycles and frequencies, and the HP Filter excelling at isolating long-term trends from short-term noise. By combining these methodologies, traders can develop a deeper understanding of the underlying dynamics of financial time series, leading to more informed investment decisions and improved trading strategies. As the financial markets continue to evolve, the combined application of these techniques will undoubtedly remain an essential aspect of modern financial analysis.
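A hedged sketch of this combined pipeline, using statsmodels' HP filter for the smoothing step and a plain FFT (standing in here for the adaptive Quinn-Fernandes step) to read off the dominant cycle length:

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

def dominant_cycle(y: np.ndarray, lam: float = 1600.0) -> float:
    cycle, trend = hpfilter(y, lamb=lam)       # HP-smooth first: isolate the trend
    spec = np.abs(np.fft.rfft(trend - trend.mean()))
    freqs = np.fft.rfftfreq(len(y))
    k = 1 + int(np.argmax(spec[1:]))           # strongest component, skipping the DC bin
    return 1.0 / freqs[k]                      # dominant period, in bars
```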
█ Features
Endpointed and Non-repainting
This is an endpointed and non-repainting indicator. These are crucial factors that contribute to its usefulness and reliability in trading and investment strategies. Let us break down these concepts and discuss why they matter in the context of a financial indicator.
1. Endpoint nature: An endpoint indicator uses the most recent data points to calculate its values, ensuring that the output is timely and reflective of the current market conditions. This is in contrast to non-endpoint indicators, which may use earlier data points in their calculations, potentially leading to less timely or less relevant results. By utilizing the most recent data available, the endpoint nature of this indicator ensures that it remains up-to-date and relevant, providing traders and investors with valuable and actionable insights into the market dynamics.
2. Non-repainting characteristic: A non-repainting indicator is one that does not change its values or signals after they have been generated. This means that once a signal or a value has been plotted on the chart, it will remain there, and future data will not affect it. This is crucial for traders and investors, as it offers a sense of consistency and certainty when making decisions based on the indicator's output.
Repainting indicators, on the other hand, can change their values or signals as new data comes in, effectively "repainting" the past. This can be problematic for several reasons:
a. Misleading results: Repainting indicators can create the illusion of a highly accurate or successful trading system when backtesting, as the indicator may adapt its past signals to fit the historical price data. This can lead to overly optimistic performance results that may not hold up in real-time trading.
b. Decision-making uncertainty: When an indicator repaints, it becomes challenging for traders and investors to trust its signals, as the signal that prompted a trade may change or disappear after the fact. This can create confusion and indecision, making it difficult to execute a consistent trading strategy.
The endpoint and non-repainting characteristics of this indicator contribute to its overall reliability and effectiveness as a tool for trading and investment decision-making. By providing timely and consistent information, this indicator helps traders and investors make well-informed decisions that are less likely to be influenced by misleading or shifting data.
Inputs
Source: This input determines the source of the price data to be used for the calculations. Users can select from options like closing price, opening price, high, low, etc., based on their preferences. Changing the source of the price data (e.g., from closing price to opening price) will alter the base data used for calculations, which may lead to different patterns and cycles being identified.
Calculation Bars: This input represents the number of past bars used for the calculation. A higher value will use more historical data for the analysis, while a lower value will focus on more recent price data. Increasing the number of past bars used for calculation will incorporate more historical data into the analysis. This may lead to a more comprehensive understanding of long-term trends but could also result in a slower response to recent price changes. Decreasing this value will focus more on recent data, potentially making the indicator more responsive to short-term fluctuations.
Harmonic Period: This input represents the harmonic period, which is the number of harmonics used in the Fourier Transform. A higher value will result in more harmonics being used, potentially capturing more complex cycles in the price data. Increasing the harmonic period will include more harmonics in the Fourier Transform, potentially capturing more complex cycles in the price data. However, this may also introduce more noise and make it harder to identify clear patterns. Decreasing this value will focus on simpler cycles and may make the analysis clearer, but it might miss out on more complex patterns.
Frequency Tolerance: This input represents the frequency tolerance, which determines how close the frequencies of the harmonics must be to be considered part of the same cycle. A higher value will allow for more variation between harmonics, while a lower value will require the frequencies to be more similar. Increasing the frequency tolerance will allow for more variation between harmonics, potentially capturing a broader range of cycles. However, this may also introduce noise and make it more difficult to identify clear patterns. Decreasing this value will require the frequencies to be more similar, potentially making the analysis clearer, but it might miss out on some cycles.
Number of Bars to Render: This input determines the number of bars to render on the chart. A higher value will result in more historical data being displayed, but it may also slow down the computation due to the increased amount of data being processed. Increasing the number of bars to render on the chart will display more historical data, providing a broader context for the analysis. However, this may also slow down the computation due to the increased amount of data being processed. Decreasing this value will speed up the computation, but it will provide less historical context for the analysis.
Smoothing Mode: This input allows the user to choose between two smoothing modes for the source price data: no smoothing or Hodrick-Prescott (HP) smoothing. The choice depends on the user's preference for how the price data should be processed before the Fourier Transform is applied. Choosing between no smoothing and Hodrick-Prescott (HP) smoothing will affect the preprocessing of the price data. Using HP smoothing will remove some of the short-term fluctuations from the data, potentially making the analysis clearer and more focused on longer-term trends. Not using smoothing will retain the original price fluctuations, which may provide more detail but also introduce noise into the analysis.
Hodrick-Prescott Filter Period: This input represents the Hodrick-Prescott filter period, which is used if the user chooses to apply HP smoothing to the price data. A higher value will result in a smoother curve, while a lower value will retain more of the original price fluctuations. Increasing the Hodrick-Prescott filter period will result in a smoother curve for the price data, emphasizing longer-term trends and minimizing short-term fluctuations. Decreasing this value will retain more of the original price fluctuations, potentially providing more detail but also introducing noise into the analysis.
Alerts and signals
This indicator features alerts, signals, and bar coloring. You have the option to turn these on/off in the settings menu.
Maximum Bars Restriction
This indicator requires a large amount of processing power to render on the chart. To reduce overhead, the setting "Number of Bars to Render" is set to 500 bars by default. You can adjust this to your liking.
█ Related Indicators and Libraries
Goertzel Cycle Composite Wave
Goertzel Browser
Fourier Spectrometer of Price w/ Extrapolation Forecast
Fourier Extrapolator of 'Caterpillar' SSA of Price
Normalized, Variety, Fast Fourier Transform Explorer
Real-Fast Fourier Transform of Price Oscillator
Real-Fast Fourier Transform of Price w/ Linear Regression
Fourier Extrapolation of Variety Moving Averages
Fourier Extrapolator of Variety RSI w/ Bollinger Bands
Fourier Extrapolator of Price w/ Projection Forecast
Fourier Extrapolator of Price
STD-Stepped Fast Cosine Transform Moving Average
Variety RSI of Fast Discrete Cosine Transform
loxfft
Smoothing R-Squared Comparison
Introduction
Heyo guys, here I made a comparison between my favorite smoothing algorithms.
I chose the R-squared value as the rating factor for the comparison.
The indicator is non-repainting.
Description
In technical analysis, traders often use moving averages to smooth out the noise in price data and identify trends. While moving averages are a useful tool, they can also obscure important information about the underlying relationship between the price and the smoothed price.
One way to evaluate this relationship is by calculating the R-squared value, which represents the proportion of the variance in the price that can be explained by the smoothed price in a linear regression model.
This PineScript code implements a smoothing R-squared comparison indicator.
It provides a comparison of different smoothing techniques such as Kalman filter, T3, JMA, EMA, SMA, Super Smoother and some special combinations of them.
The Kalman filter is a mathematical algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies, and produces estimates of unknown variables that tend to be more accurate than those based on a single measurement.
The input parameters for the Kalman filter include the process noise covariance and the measurement noise covariance, which help to adjust the sensitivity of the filter to changes in the input data.
The T3 smoothing technique is a popular method used in technical analysis to remove noise from a signal.
The input parameters for the T3 smoothing method include the length of the window used for smoothing, the type of smoothing used (Normal or New), and the smoothing factor used to adjust the sensitivity to changes in the input data.
The JMA smoothing technique is another popular method used in technical analysis to remove noise from a signal.
The input parameters for the JMA smoothing method include the length of the window used for smoothing, the phase used to shift the input data before applying the smoothing algorithm, and the power used to adjust the sensitivity of the JMA to changes in the input data.
The EMA and SMA techniques are also popular methods used in technical analysis to remove noise from a signal.
The input parameters for the EMA and SMA techniques include the length of the window used for smoothing.
The indicator displays a comparison of the R-squared values for each smoothing technique, which provides an indication of how well the technique is fitting the data.
Higher R-squared values indicate a better fit. By adjusting the input parameters for each smoothing technique, the user can compare the effectiveness of different techniques in removing noise from the input data.
Usage
You can use it to find the best fitting smoothing method for the timeframe you usually use.
Just apply it on your preferred timeframe and look for the highlighted table cell.
Conclusion
It seems like the T3 works best on timeframes under 4H.
That's where I am active, so I will use this one more in the future.
Thank you for checking this out. Enjoy your day and leave me a like or comment. 🧙♂️
---
Credits to:
▪@loxx – T3
▪@balipour – Super Smoother
▪ChatGPT – Wrote 80 % of this article and helped with the research
Spectral Gating (SG)The Spectral Gating (SG) Indicator is a technical analysis tool inspired by music production techniques. It aims to help traders reduce noise in their charts by focusing on the significant frequency components of the data, providing a clearer view of market trends.
By incorporating complex number operations and Fast Fourier Transform (FFT) algorithms, the SG Indicator efficiently processes market data. The indicator transforms input data into the frequency domain and applies a threshold to the power spectrum, filtering out noise and retaining only the frequency components that exceed the threshold.
Key aspects of the Spectral Gating Indicator include:
Adjustable Window Size: Customize the window size (ranging from 2 to 6) to control the amount of data considered during the analysis, giving you the flexibility to adapt the indicator to your trading strategy.
Complex Number Arithmetic: The indicator uses complex number addition, subtraction, and multiplication, as well as radius calculations for accurate data processing.
Iterative FFT and IFFT: The SG Indicator features iterative FFT and Inverse Fast Fourier Transform (IFFT) algorithms for rapid data analysis. The FFT algorithm converts input data into the frequency domain, while the IFFT algorithm restores the filtered data back to the time domain.
Spectral Gating: At the heart of the indicator, the spectral gating function applies a threshold to the power spectrum, suppressing frequency components below the threshold (sketched after this list). This process helps to enhance the clarity of the data by reducing noise and focusing on the more significant frequency components.
Visualization: The indicator plots the filtered data on the chart with a simple blue line, providing a clean and easily interpretable representation of the results.
Although the Spectral Gating Indicator may not be a one-size-fits-all solution for all trading scenarios, it serves as a valuable tool for traders looking to reduce noise and concentrate on relevant market trends. By incorporating this indicator into your analysis toolkit, you can potentially make more informed trading decisions.
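As an illustration of the gating step alone (not the full FFT pipeline), here is a hedged Pine v5 sketch; `re` and `im` are assumed to hold the real and imaginary FFT outputs, and `threshold` the gate level:

```pine
// a minimal sketch of spectral gating, applied in place to the FFT output arrays
f_spectral_gate(float[] re, float[] im, float threshold) =>
    int n = array.size(re)
    if n > 0
        for i = 0 to n - 1
            float power = array.get(re, i) * array.get(re, i) + array.get(im, i) * array.get(im, i)
            if power < threshold
                array.set(re, i, 0.0)   // suppress this frequency component
                array.set(im, i, 0.0)
    true                                // arrays are modified in place
```

After gating, the IFFT stage restores the surviving components to the time domain.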
PSv5 3D Array/Matrix Super Hack"In a world of ever pervasive and universal deceit, telling a simple truth is considered a revolutionary act."
INTRO:
First, how about a little bit of philosophic poetry with another dimension applied to it?
The "matrix of control" is everywhere...
It is all around us, even now in the very place you reside. You can see it when you look at your digitized window outwards into the world, or when you turn on regularly scheduled television "programs" to watch news narratives and movies that subliminally influence your thoughts, feelings, and emotions. You have felt it every time you have clocked into dead end job workplaces... when you unknowingly worshiped on the conformancy altar to cultish ideologies... and when you pay your taxes to a godvernment that is poisoning you softly and quietly by injecting your mind and body with (psyOps + toxicCompounds). It is a fictitiously generated world view that has been pulled over your eyes to blindfold, censor, and mentally prostrate you from spiritually hearing the real truth.
What TRUTH you must wonder? That you are cognitively enslaved, like everyone else. You were born into mental bondage, born into an illusory societal prison complex that you are entirely incapable of smelling, tasting, or touching. It's a contrived monetary prison enterprise for your mind and eternal soul, built by pretending politicians, corporate CONartists, and NonGoverning parasitic Organizations deploying any means of infiltration and deception by using every tactic unimaginable. You are slowly being convinced into becoming a genetically altered cyborg by acclimation, socially engineered and chipped to eventually no longer be 100% human.
Unfortunately no one can be told eloquently enough in words what the matrix of control truly is. You have to experience it and witness it for yourself. This is your chance to program a future paradigm that doesn't yet exist. After visiting here, there is absolutely no turning back. You can continually take the blue pill BIGpharmacide wants you to repeatedly intake. The story ends if you continually sleep walk through a 2D hologram life, believing whatever you wish to believe until you cease to exist. OR, you can take the red pill challenge, explore "question every single thing" wonderland, program your arse off with 3D capabilities, ultimately ascertaining a new mathematical empyrean. Only then can you fully awaken to discover how deep the rabbit hole state of affairs transpire worldwide with a genuine open mind.
Remember, all I'm offering is a mathematical truth, nothing more...
PURPOSE:
With that being said above, it is now time for advanced developers to start creating their own matrix constructs in 3D, in Pine, just as the universe is created spatially. For those of you who instantly know what this script's potential is easily capable of, you already know what you have to do with it. While this is simplistically just a 3D array for either integers or floats, additional companion functions can in the future be constructed by other members to provide a more complete matrix/array library for millions of folks on TV. I do encourage the most courageous of mathemagicians on TV to do so. I have been employing very large 2D/3D array structures for quite some time, and their utility seems to be of great benefit. Discovering that for myself, I fully realized that Pine is incomplete and must be provided with this agility to process complex datasets that traders WILL use in the future. Mark my words!
CONCEPTION:
While I have long realized and theorized this code for a great duration of time, I was finally able to turn it into a Pine reality with the assistance and training of an "artificially intuitive" program while probing its aptitude. Even though it knows virtually nothing about Pine Script 4.0 or 5.0 syntax, functions, and behavior, I was able to conjure code into an identity similar to what you see now within a few minutes. Close enough for me! Many manual edits later for pine compliance, and I had it in chart, presto!
While most people consider the service to be an "AI", it didn't pass my Pine Turing test. I did have to repeatedly correct it, suffered through numerous apologies from it, was forced to use specifically tailored words, and also rationally debate AND argue with it. It is a handy helper but beware of generating Pine code from it, trust me on this one. However... this artificially intuitive service is currently available in its infancy as version 3. Version 4 most likely will have more diversity to enhance my algorithmic expertise of Pine wizardry. I do have to thank E.M. and his developers for an eye-opening experience, or NONE of this code below would be available as you now witness it today.
LIMITATIONS:
As of this initial release, Pine only supports 100,000 array elements maximum. For example, when using this code, a 50x50x40 element configuration will exceed this limit, but 50x50x39 will work. You will always have to keep that in mind during development. Running that size of an array structure on every single bar will most likely time out within 20-40 seconds. This is not the most efficient method compared to a real native 3D array in action. Ehlers adepts, this might not be 100% of what you require to "move forward". You can try, but head room with a low ceiling currently will be challenging to walk in for now, even with extremely optimized Pine code.
A few common functions are provided, but this can be extended extensively later if you choose to undertake that endeavor. Use the code as is and/or however you deem necessary. Any TV member is granted absolute freedom to do what they wish as they please. I ultimately wish to eventually see a fully equipped library version for both matrix3D AND array3D created by collaborative efforts that will probably require many Pine poets testing collectively. This is just a bare bones prototype until that day arrives. Considerably more computational server power will be required also. Anyways, I hope you shall find this code somewhat useful.
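For the curious, the essence of the trick, packing a 3D structure into Pine's flat arrays with row-major index arithmetic, can be sketched like this (dimension sizes and names are illustrative, not the script's actual code):

```pine
//@version=5
indicator("3D array emulation sketch")
// flatten (x, y, z) into a single index: row-major order
f_idx(int x, int y, int z, int dimY, int dimZ) =>
    x * dimY * dimZ + y * dimZ + z

f_get3d(float[] a, int x, int y, int z, int dimY, int dimZ) =>
    array.get(a, f_idx(x, y, z, dimY, dimZ))

f_set3d(float[] a, int x, int y, int z, int dimY, int dimZ, float v) =>
    array.set(a, f_idx(x, y, z, dimY, dimZ), v)

var cube = array.new_float(10 * 10 * 10, 0.0)  // 10×10×10 = 1,000 elements, well under the cap
f_set3d(cube, 2, 3, 4, 10, 10, 42.0)
plot(f_get3d(cube, 2, 3, 4, 10, 10))
```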
Notice: Unfortunately, I will not provide any integration support into members projects at all. I have my own projects that require too much of my time already.
POTENTIAL APPLICATIONS:
The creation of very large coefficient 3D caches/buffers specifically at bar_index==0 can dramatically increase runtime agility for thousands of bars onwards. Generating 1000s of values once and just accessing those generated values is much faster. Also, when running dozens of algorithms simultaneously, a record of performance statistics can be kept, self-analyzed, and visually presented to the developer/user. And, everything else under the sun can be created beyond a developer's wildest dreams...
EPILOGUE:
Free your mind!!! And unleash weapons of mass financial creation upon the earth for all to utilize via the "Power of Pine". Flying monkeys and minions are waging economic sabotage upon humanity, decimating markets and exchanges. You can always see it in your market charts when things go horribly wrong. This is going to be an astronomical technical challenge to continually navigate very choppy financial markets that are increasingly becoming more and more unstable and volatile. Ordinary one plot algorithms simply are not enough anymore. Statistics and analysis sit above everything imagined. This includes banking, godvernment, corporations, REAL science, technology, health, medicine, transportation, energy, food, etc... We have a unique perspective of the world that most people will never get to see, depending on where you look. With an ever increasingly complex world in constant dynamic flux, novel ways to process data intricately MUST emerge into existence in order to tackle phenomenal tasks required in the future. Achieving data analysis in 3D forms is just one lonely step of many more to come.
At this time the WesternEconomicFraudsters and the WorldHealthOrders are attempting to destroy/reset the world's financial status in order to rain chaos upon most nations, causing asset devaluation and hyper-inflation. Every form of deception, infiltration, and theft is occurring with a result of destroyed wealth in preparation to consolidate it. Open discussions, available to the public, by world leaders/moguls are fantasizing about a new dystopian system as a one-size-fits-all-nations solution of digitalID combined with programmableDemonicCurrencies to usher in a new form of obedient servitude to a unipolar digitized hegemony of monetary vampires. If they do succeed with economic conquest, as they have publicly stated, people will be converted into human cattle, herded within smart cities, you will own nothing, eat bugs for breakfast/lunch/dinner, live without heat during severe winter conditions, and be happy. They clearly haven't done the math, as they are far outnumbered by a ratio of 1 to millions. Sith Lords do not own planet Earth! The new world disorder of human exploitation will FAIL. History, my "greatest teacher" for decades reminds us over, and over, and over again, and what are time series for anyways? They are for an intense mathematical analysis of prior historical values/conditions in relation to today's values/conditions... I imagine one day we will be able to ask an all-seeing AI, "WHO IS TO BLAME AND WHY AND WHEN?", and get an answer comprised of 300 pages in great detail with images, charts, and statistics.
What are the true costs of malignant lies? I will tell you... 64-bit numbers are NOT even capable of calculating the extreme cost of pernicious lies and deceit. That's how gigantic this monstrous globalization problem has become and how awful the "matrix of control" truly is now. ALL nations need a monumental revision of their CODE OF ETHICS, and that's definitely a multi-dimensional problem that needs to be solved sooner rather than later. If it were up to me, economies and technology would be developed so extensively to eliminate scarcity and increase the standard of living so high, that the notion of war and conflict would be considered irrelevant and extremely appalling to the future generations of humanity, our grandchildren born and unborn. The future will not be owned and operated by geriatric robber barons destined to expire quickly. The future will most likely be intensely "guided" by intelligent open source algorithms that youthful generations will inherit as their birthright.
P.S. Don't give me that politco-my-diction crap speech below in comments. If they weren't meddling with economics mucking up 100% of our chart results in 100% of tickers, I wouldn't have any cause to analyze any effects generated by them, nor provide this script's code. I am performing my analytical homework, but have you? Do you know WHY international affairs are in dire jeopardy? Without why, the "Power of Pine" would have never existed as it specifically does today. I'm giving away much of my mental power generously to TV members so you are specifically empowered beyond most mathematical agilities commonly existing. I'm just a messenger of profound ideas. Loving and loathing of words is ALWAYS in the eye of beholders, and that's why the freedom of speech is enshrined as #1 in the constitutional code of the USA. Without it, this entire site might not have been allowed to exist from its founder's inceptions.
Levinson-Durbin Autocorrelation Extrapolation of Price [Loxx]Levinson-Durbin Autocorrelation Extrapolation of Price is an indicator that uses the Levinson recursion or Levinson–Durbin recursion algorithm to predict price moves. This method is commonly used in speech modeling and prediction engines.
What is Levinson recursion or Levinson–Durbin recursion?
It is a linear algebra prediction analysis performed once per bar using the autocorrelation method within a specified asymmetric window. The autocorrelation coefficients of the window are computed and converted to LP coefficients using the Levinson algorithm. The LP coefficients are then transformed to line spectrum pairs for quantization and interpolation. The interpolated quantized and unquantized filters are converted back to the LP filter coefficients to construct the synthesis and weighting filters for each bar.
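For readers who want the recursion itself, here is a hedged Pine v5 sketch of the textbook Levinson-Durbin step, which converts autocorrelation lags r[0..p] into LP coefficients; this is a generic form, not necessarily the exact code in this indicator:

```pine
// a minimal sketch, assuming `r` is a float array of autocorrelation lags r[0..p], p >= 1
f_levinson_durbin(float[] r, int p) =>
    a = array.new_float(p + 1, 0.0)     // LP coefficients, with a[0] = 1
    array.set(a, 0, 1.0)
    float e = array.get(r, 0)           // prediction error power
    for i = 1 to p
        float acc = array.get(r, i)
        if i > 1
            for j = 1 to i - 1
                acc += array.get(a, j) * array.get(r, i - j)
        float k = -acc / e              // reflection coefficient
        // in-place symmetric update: a[j] += k * a[i - j] for j = 1 .. i - 1
        int half = (i - 1) / 2
        if half >= 1
            for j = 1 to half
                float aj  = array.get(a, j)
                float aij = array.get(a, i - j)
                array.set(a, j, aj + k * aij)
                array.set(a, i - j, aij + k * aj)
        if i % 2 == 0 and i > 1
            array.set(a, i / 2, array.get(a, i / 2) * (1.0 + k))
        array.set(a, i, k)
        e := e * (1.0 - k * k)
    a
```

The resulting coefficients give the one-step-ahead prediction x̂[n] = -(a[1]·x[n-1] + ... + a[p]·x[n-p]), which is then iterated forward for the forecast bars.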
Data inputs
Source Settings: Loxx's Expanded Source Types. You typically use "open", since the open price is already fixed on the current active bar.
LastBar - bar where to start the prediction
PastBars - how many bars back to model
LPOrder - order of linear prediction model; 0 to 1
FutBars - how many bars you want to forward predict
Things to know
Normally, a simple moving average is calculated on source data. I've expanded this to 38 different averaging methods using Loxx's Moving Averages.
This indicator repaints
Included
Bar color muting
Further reading
Implementing the Levinson-Durbin Algorithm on the StarCore™ SC140/SC1400 Cores
LevinsonDurbin_G729 Algorithm, Calculates LP coefficients from the autocorrelation coefficients. Intel® Integrated Performance Primitives for Intel® Architecture Reference Manual
APA-Adaptive, Ehlers Early Onset Trend [Loxx]APA-Adaptive, Ehlers Early Onset Trend is Ehlers Early Onset Trend but with Autocorrelation Periodogram Algorithm dominant cycle period input.
What is Ehlers Early Onset Trend?
The Onset Trend Detector study is a trend analyzing technical indicator developed by John F. Ehlers, based on a non-linear quotient transform. Two of Mr. Ehlers' previous studies, the Super Smoother Filter and the Roofing Filter, were used and expanded to create this new complex technical indicator. Being a trend-following analysis technique, its main purpose is to address the problem of lag that is common among moving average type indicators.
The Onset Trend Detector first applies the Ehlers Roofing Filter to the input data in order to eliminate cyclic components with periods longer than, for example, 100 bars (default value, customizable via input parameters) as those are considered spectral dilation. Filtered data is then subjected to re-filtering by the Super Smoother Filter so that the noise (cyclic components with low length) is reduced to a minimum. The period of 10 bars is a default maximum value for a wave cycle to be considered noise; it can be customized via input parameters as well. Once the data is cleared of both noise and spectral dilation, the filter processes it with the automatic gain control algorithm which is widely used in digital signal processing. This algorithm registers the most recent peak value and normalizes it; the normalized value slowly decays until the next peak swing. The ratio of previously filtered value to the corresponding peak value is then quotiently transformed to provide the resulting oscillator. The quotient transform is controlled by the K coefficient: its allowed values are in the range from -1 to +1. K values close to 1 leave the ratio almost untouched, those close to -1 will translate it to around the additive inverse, and those close to zero will collapse small values of the ratio while keeping the higher values high.
Indicator values around 1 signify uptrend and those around -1, downtrend.
What is an adaptive cycle, and what is Ehlers Autocorrelation Periodogram Algorithm?
From Ehlers' book Cycle Analytics for Traders: Advanced Technical Trading Concepts, 2013, page 135:
"Adaptive filters can have several different meanings. For example, Perry Kaufman’s adaptive moving average ( KAMA ) and Tushar Chande’s variable index dynamic average ( VIDYA ) adapt to changes in volatility . By definition, these filters are reactive to price changes, and therefore they close the barn door after the horse is gone.The adaptive filters discussed in this chapter are the familiar Stochastic , relative strength index ( RSI ), commodity channel index ( CCI ), and band-pass filter.The key parameter in each case is the look-back period used to calculate the indicator. This look-back period is commonly a fixed value. However, since the measured cycle period is changing, it makes sense to adapt these indicators to the measured cycle period. When tradable market cycles are observed, they tend to persist for a short while.Therefore, by tuning the indicators to the measure cycle period they are optimized for current conditions and can even have predictive characteristics.
The dominant cycle period is measured using the Autocorrelation Periodogram Algorithm. That dominant cycle dynamically sets the look-back period for the indicators. I employ my own streamlined computation for the indicators that provide smoother and easier to interpret outputs than traditional methods. Further, the indicator codes have been modified to remove the effects of spectral dilation.This basically creates a whole new set of indicators for your trading arsenal."
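The adaptation step itself is straightforward to sketch: measure a dominant cycle, then feed it into the indicator's look-back. The fragment below is a generic illustration (it assumes `dc` comes from some dominant-cycle measurement; it is not Ehlers' periodogram code):

```pine
// a minimal sketch: a moving average whose look-back follows a measured cycle `dc`
f_adaptive_sma(float src, float dc) =>
    int len = math.max(1, math.round(dc))   // dynamic look-back from the cycle period
    float s = 0.0
    for i = 0 to len - 1
        s += src[i]
    s / len
```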
Jurik Composite Fractal Behavior (CFB) on EMA [Loxx]Jurik Composite Fractal Behavior (CFB) on EMA is an exponential moving average with adaptive price trend duration inputs. The purpose of this indicator is to introduce the formulas for the calculation of Composite Fractal Behavior. As you can see from the chart above, price reacts wildly to shifts in volatility--smoothing out substantially while riding a volatility wave and cutting sharp corners when volatility drops. Notice the chop zone on BTC around August 2021; this was a time of extremely low relative volatility.
This indicator uses three previous indicators from my public scripts. These are:
JCFBaux Volatility
Jurik Filter
Jurik Volty
The CFB is also related to the following indicator
Jurik Velocity ("smoother moment")
Now let's dive in...
What is Composite Fractal Behavior (CFB)?
All around you mechanisms adjust themselves to their environment. From simple thermostats that react to air temperature to computer chips in modern cars that respond to changes in engine temperature, r.p.m.'s, torque, and throttle position. It was only a matter of time before fast desktop computers applied the mathematics of self-adjustment to systems that trade the financial markets.
Unlike basic systems with fixed formulas, an adaptive system adjusts its own equations. For example, start with a basic channel breakout system that uses the highest closing price of the last N bars as a threshold for detecting breakouts on the up side. An adaptive and improved version of this system would adjust N according to market conditions, such as momentum, price volatility or acceleration.
Since many systems are based directly or indirectly on cycles, another useful measure of market condition is the periodic length of a price chart's dominant cycle (DC), the cycle with the greatest influence on price action.
The utility of this new DC measure was noted by author Murray Ruggiero in the January '96 issue of Futures Magazine. In it, Mr. Ruggiero used it to adaptively adjust the value of N in a channel breakout system. He then simulated trading 15 years of D-Mark futures in order to compare its performance to a similar system that had a fixed optimal value of N. The adaptive version produced 20% more profit!
This DC index utilized the popular MESA algorithm (a formulation by John Ehlers adapted from Burg's maximum entropy algorithm, MEM). Unfortunately, the DC approach is problematic when the market has no real dominant cycle momentum, because the mathematics will produce a value whether or not one actually exists! Therefore, we developed a proprietary indicator that does not presuppose the presence of market cycles. It's called CFB (Composite Fractal Behavior) and it works well whether or not the market is cyclic.
CFB examines price action for a particular fractal pattern, categorizes the occurrences by size, and then outputs a composite fractal size index. This index is smooth, timely and accurate.
Essentially, CFB reveals the length of the market's trending action time frame. Long trending activity produces a large CFB index and short choppy action produces a small index value. Investors have found many applications for CFB which involve scaling other existing technical indicators adaptively, on a bar-to-bar basis.
What is Jurik Volty used in the Jurik Filter?
One of the lesser known qualities of Jurik smoothing is that the Jurik smoothing process is adaptive. "Jurik Volty" (a sort of market volatility) is what makes Jurik smoothing adaptive. The Jurik Volty calculation can be used as both a standalone indicator and to smooth other indicators that you wish to make adaptive.
What is the Jurik Moving Average?
Have you noticed how moving averages add some lag (delay) to your signals? ... especially when price gaps up or down in a big move, and you are waiting for your moving average to catch up? Wait no more! JMA eliminates this problem forever and gives you the best of both worlds: low lag and smooth lines.
Ideally, you would like a filtered signal to be both smooth and lag-free. Lag causes delays in your trades, and increasing lag in your indicators typically result in lower profits. In other words, late comers get what's left on the table after the feast has already begun.
Modifications and improvements
1. Jurik's original calculation for CFB only allowed for depth lengths of 24, 48, 96, and 192. For theoretical purposes, this indicator allows for up to 20 different depth inputs to sample volatility. These depth lengths are
2, 3, 4, 6, 8, 12, 16, 24, 32, 48, 64, 96, 128, 192, 256, 384, 512, 768, 1024, 1536
Including these additional length inputs is arguably useless, but they are included for completeness of the algorithm.
2. The result of the CFB calculation is forced to be an integer greater than or equal to 1.
3. The result of the CFB calculation is double filtered using an advanced, (and adaptive itself) filtering algorithm called the Jurik Filter. This filter and accompanying internal algorithm are discussed above.
Customizable Non-Repainting HTF MACD MFI Scalper Bot Strategy v2
This script was originally shared by Wunderbit as a free open source script for the community to work with. This is my second published iteration of this idea.
WHAT THIS SCRIPT DOES:
It is intended for use on an algorithmic bot trading platform but can be used for scalping and manual trading.
This strategy is based on the trend-following momentum indicator. It includes the Money Flow Index as an additional point for entry.
This is a new and improved version geared for lower timeframes (5-15 minutes), but it can be run on larger ones as well. I am testing it live as my high frequency trader.
HOW IT DOES IT:
It uses a combination of MACD and MFI indicators to create entry signals. Parameters for each indicator have been surfaced for user configurability.
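A hedged sketch of what such a combined entry can look like in Pine v5 (lengths and the MFI threshold below are illustrative, not the strategy's shipped defaults):

```pine
//@version=5
strategy("MACD + MFI entry sketch", overlay=true)
[macdLine, signalLine, histLine] = ta.macd(close, 12, 26, 9)  // trend-following momentum
mfi = ta.mfi(hlc3, 14)                                        // money flow as the extra entry filter
longEntry = ta.crossover(macdLine, signalLine) and mfi < 60   // illustrative threshold
if longEntry
    strategy.entry("Long", strategy.long)
```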
Take profits are now trailing profits, and the stop loss is now fixed. Why? I found that the trailing stop loss with ATR in the previous version yields very good results for back tests but becomes very difficult to deploy live due to transaction fees. As you can see the average trade is a higher profit percentage than the previous version.
HOW IS MY VERSION ORIGINAL:
Now instead of using an ATR stop loss, we have a fixed stop loss. Counterintuitive as it may seem, this performs better in live trading scenarios since it gives the strategy room to move. I noticed that the ATR trailing stop was stopping out too fast and was eating away at the balance due to transaction fees.
The take profit on the other hand is now a trailing profit with a customizable deviation. This ensures that you can have a minimum profit you want to take in order to exit.
I have deprecated the old ATR trailing stop as it became too confusing to have those as different options. I kept the old version for others who want to experiment with it. The source code still requires some cleanup, but it's fully functional.
I added a way to show RSI values and ATR values with a checkbox so that you can use the new and improved ATR Filter (and grab the right RSI values for the RSI filter). This will help to filter out times of very low volatility where we are unlikely to find a profitable trade. Use the "Show Data" checkbox to see what the values are on the indicator pane, then use those values to gauge what you want to filter out.
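In Pine v5 terms, the exit logic described above could be sketched roughly as follows, assuming a strategy script with an existing "Long" entry (percentages and identifiers are illustrative):

```pine
// a minimal sketch of a fixed stop loss plus a trailing take profit
stopPct = input.float(6.0, "Stop loss %") / 100
trigPct = input.float(3.0, "Trailing profit trigger %") / 100
offPct  = input.float(0.5, "Trailing deviation %") / 100
if strategy.position_size > 0
    avg = strategy.position_avg_price
    strategy.exit("Long exit", from_entry="Long",
         stop=avg * (1 - stopPct),                     // fixed stop loss
         trail_price=avg * (1 + trigPct),              // price that arms the trailing profit
         trail_offset=avg * offPct / syminfo.mintick)  // trailing deviation, in ticks
```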
Both versions
Delayed Signals: The script has been refactored to use a timeframe dropdown. The higher timeframe can be run on a faster chart (recommended on a one-minute chart for the fastest signal confirmation and relay to an algotrading platform).
Repainting Issues: All indicators have been recoded to use a security function that checks whether the current calculation is happening in realtime; if it is, the previous bar is used for the calculation (see the sketch after this list). If you are still experiencing repainting issues based on intended (or unintended) use, please provide a report with a screenshot and explanation so I can try to address it.
Filtering: I have added two additional filters, an ABOVE EMA Filter and a BELOW RSI Filter (both can be turned on and off).
Customizable Long and Close Messages: This allows someone to use the script for algorithmic trading without having to alter code. It also means you can use one indicator for all of the different alerts required for your bots.
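The non-repainting request pattern mentioned under "Repainting Issues" is, in essence, the common community idiom below (shown generically; the strategy's own wrapper may differ):

```pine
// a minimal sketch: on realtime bars, request the previous (closed) bar's value
// so the higher-timeframe series never changes mid-bar
tf = input.timeframe("60", "Higher timeframe")  // illustrative default
htfClose = request.security(syminfo.tickerid, tf, close[barstate.isrealtime ? 1 : 0])
```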
HOW TO USE IT:
It is intended to be used in the 5-30 minute time frames, but you might be able to get a good configuration for higher time frames. I welcome feedback from other users on what they have found.
Find a pair with high volatility (example KUCOIN:ETH3LUSDT) - I have found it works particularly well with 3L and 3S tokens for crypto, although the limitation is that configurations I have found to work typically have a low R/R ratio but a very high win rate and profit factor.
Ideally, set a one-minute chart for bots, but you can use other charts for manual trading. The signal will be delayed by one bar, but I have found configurations that still test well.
Select a time frame in configuration for your indicator calculations.
Select the strategy config for the timeframe (resolution). I like to use 5 and 15 minutes for scalping scenarios, but I am interested in hearing back from other community members.
Optimize your indicator without filters: customize your settings for MACD and MFI so they are profitable with your chart and selected timeframe calculation. Try different take profits (about 2-5%) and stop losses (about 5-8%). See if your backtest is profitable and continue to optimize.
Use the Trend, RSI, and ATR Filters to further refine your signals for entry. You will get fewer entries, but you can increase your win ratio.
You can use the open and close messages for a platform integration, but I choose to set mine up on the destination platform and let the platform close it. With certain platforms you cannot be sure what your entry point actually was compared to Trading View due to slippage and timing, so I let the platform decide when it is actually profitable.
Limitations: this works rather well for the short term and does some good forward testing, but backtesting large data sets is a problem when switching from a very small timeframe to a large one. For instance, finding a configuration that works on a one-minute chart and then changing to a one-hour chart means you lose some of your intra-bar calculations. There are some new features in Pine Script which might be able to address this, but I have not had a chance to work on that issue.
Relative Strength Super Smoother by lastguruA better version of Apirine's RS EMA by using a superior MA: Ehlers Super Smoother.
In the January 2022 edition of TASC, Vitaly Apirine introduced his Relative Strength Exponential Moving Average. A concept not entirely new, as Tushar Chande used a similar calculation for his VIDYA moving average. Both are based on the idea of changing the EMA length depending on the absolute RSI value, so the moving average speeds up when RSI moves up or down away from the center value (when there is a significant directional price movement), and slows down when RSI returns to the center value (when there is a neutral or sideways movement). That way, EMA responsiveness increases where it matters most but decreases where there is a high probability of whipsaw.
There are only two main differences between VIDYA and RS EMA:
RSI internal smoothing - VIDYA uses SMA, as Chande's CMO is an RSI with SMA; RS EMA uses EMA
Change direction - VIDYA sets the fastest length; RS EMA sets the slowest length
Both algorithms use EMA as the base of their calculation. As John F. Ehlers has shown in his article "Predictive and Successful Indicators" (January 2014 issue of TASC), EMA is not a very efficient filter, as it introduces a significant lag if sufficient smoothing is required. He describes a new smoothing filter called SuperSmoother, "that sharply attenuates aliasing noise while minimizing filtering lag." In other words, it provides better smoothing with lower lag than EMA.
In this script, I try to get the best of all these approaches and present to you the Relative Strength Super Smoother. It uses the RS EMA algorithm to calculate the SuperSmoother length. Unlike the original RS EMA algorithm, which has an abstract "multiplier" setting to scale the period variance (without this parameter, RSI would only allow it to speed up twice; Vitaly Apirine sets the multiplier to 10 by default), my implementation has an explicit lower-bound setting, so you can specify the exact range of the calculated length.
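One plausible reading of that mapping, sketched in Pine v5; this assumes the RSI is centered by subtracting 50 and scaled to roughly ±100, and the script's exact normalization may differ:

```pine
// hypothetical sketch of the bounded length mapping described above
lowerBound = input.int(8,  "Lower Bound")   // fastest length, at RSI extremes
upperBound = input.int(50, "Upper Bound")   // slowest length, at neutral RSI
rsiLen     = input.int(50, "RSI Length")
centered = (ta.rsi(close, rsiLen) - 50) * 2            // roughly -100 .. +100
strength = math.min(1.0, math.abs(centered) / 100)     // 0 = neutral, 1 = extreme
ssLen    = upperBound - (upperBound - lowerBound) * strength  // feeds the SuperSmoother
```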
Settings:
Lower Bound - fastest SuperSmoother length (when RSI is +100 or -100)
Upper Bound - slowest SuperSmoother length (when RSI is 0)
RSI Length - underlying RSI length. Unlike the original RSI that uses RMA as an internal smoothing algorithm, Vitaly Apirine uses EMA, which is approximately twice as fast (that is needed because he uses a generally long RSI length and RMA would be too slow for this). It is the same as the Upper Bound by default (0), as in the original implementation
The original RS EMA is also shown on the chart for comparison. The default multiplier of 10 for RS EMA means that the fastest EMA period is around 4. I use the fastest period of 8 by default. It does not introduce too much of a lag in comparison, but the curve is much smoother.
This script is just an interface for my public libraries. Check them out for more information.
Bogdan Ciocoiu - MakaveliDescription
This indicator integrates the functionality of multiple volume price analysis algorithms whilst aligning their scales to fit in a single chart.
Having such indicators loaded enables traders to take advantage of potential divergences between the price action and volume related volatility.
Users will have to enable or disable alternative algorithms depending on their choice.
Uniqueness
This indicator is unique because it combines multiple algorithm-specific volume analyses with price volatility.
This indicator is also unique because it amends different algorithms to show output on a similar scale enabling traders to observe various volume-analysis tools simultaneously whilst allocating different colour codes.
Open source re-use
This indicator utilises the following open-source scripts:
Bogdan Ciocoiu - Sniper EntryWhat is Sniper Entry
Sniper Entry is a set indicator that encapsulates a collection of pre-configured scripts using specific variables that enable users to extract signals by interpreting market behaviour quickly, suitable for 1-3 minute scalping. This instrument is a tool that acts as a confluence for traders to make decisions concerning current market conditions. This indicator is not tied to a single asset.
What Sniper Entry is not
Sniper Entry does not interpret fundamental analysis and will not provide out-of-the-box market signals. Instead, it provides a collection of integrated and significantly improved open-source subscripts designed to help traders speculate on market trends. Traders must apply their own strategies and configure Sniper Entry accordingly to maximise the script's output.
Originality and usefulness
The collection of subscripts encapsulated in this tool makes it unique in the Trading View ecosystem. This indicator enables traders to consider entry positions or exit positions by comparing similar algorithms at once.
Its usefulness also emerges from the unique configurations embedded in the indicator's settings, which are different from those of the original scripts.
This indicator's originality is also reflected in how its modules are integrated, including the integration of the settings.
Open-source reuse
I used the following open-source resources, which I simplified significantly and pre-configured for short-term scalping. The source code for the items below is already in the public domain, at the links listed.
www.tradingview.com (open source)
(open source and generic algorithm)
www.tradingview.com (open source)
(open source)
(open source)
www.tradingview.com (generic MA algorithm and open source)
(generic VWAP algorithm and open source)
Acrypto - Weighted StrategyHello traders!
I have been developing a fully customizable algo over the last year. The algorithm is based on a set of different strategies, each with its own weight (weighted strategy). I currently use five strategies:
MACD
Stochastic RSI
RSI
Supertrend
MA crossover
Moreover, the algo includes stop-loss criteria and a take-profit strategy. The algo must be optimized for the desired asset to achieve its full potential. The 1H and 4H timeframes give good results. The algo has been tested on several assets (same timeframe, different optimization values).
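The weighting idea can be sketched generically (the signal votes, weights, and threshold below are placeholders, not the author's tuned values):

```pine
//@version=5
strategy("Weighted vote sketch", overlay=true)
wMacd     = input.float(1.0, "MACD weight")
wRsi      = input.float(1.0, "RSI weight")
wTrend    = input.float(1.5, "Supertrend weight")
threshold = input.float(2.0, "Entry threshold")
[m, s, h] = ta.macd(close, 12, 26, 9)
macdVote  = m > s ? 1 : -1
rsiVote   = ta.rsi(close, 14) > 50 ? 1 : -1
[st, dir] = ta.supertrend(3.0, 10)
trendVote = dir < 0 ? 1 : -1   // in Pine v5, direction < 0 means uptrend
score = wMacd * macdVote + wRsi * rsiVote + wTrend * trendVote
if score >= threshold
    strategy.entry("Long", strategy.long)
```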
Important note:
Backtest the algorithm over different date ranges to avoid overfitting the results.
Best,
Alberto
FunctionArrayMaxSubKadanesAlgorithmLibrary "FunctionArrayMaxSubKadanesAlgorithm"
Implements Kadane's maximum-sum subarray algorithm.
size(samples) Kadane's algorithm.
Parameters:
samples : float array, sample data values.
Returns: float.
indices(samples) Kadane's algorithm with indices.
Parameters:
samples : float array, sample data values.
Returns: tuple with format .
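For reference, the core scan might look like this in Pine v5 (a generic sketch of Kadane's algorithm mirroring the `size(samples)` entry point, assuming a non-empty array):

```pine
// a minimal sketch of Kadane's maximum-sum subarray scan
f_kadane(float[] samples) =>
    float best = array.get(samples, 0)
    float cur  = best
    int n = array.size(samples)
    if n > 1
        for i = 1 to n - 1
            float v = array.get(samples, i)
            cur  := math.max(v, cur + v)   // extend the current run or restart at i
            best := math.max(best, cur)    // best sum seen so far
    best
```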
MathSearchDijkstraLibrary "MathSearchDijkstra"
Shortest Path Tree Search Methods using Dijkstra Algorithm.
min_distance(distances, flagged_vertices) Find the lowest cost/distance.
Parameters:
distances : float array, data set with distance costs to start index.
flagged_vertices : bool array, data set with visited vertices flags.
Returns: int, lowest cost/distance index.
dijkstra(matrix_graph, dim_x, dim_y, start) Dijkstra Algorithm: performs a greedy tree search to calculate the cost/distance from the selected start node to each vertex.
Parameters:
matrix_graph : int array, matrix holding the graph adjacency list and costs/distances.
dim_x : int, x dimension of matrix_graph.
dim_y : int, y dimension of matrix_graph.
start : int, the vertex index to start search.
Returns: int array, set with costs/distances to each vertex from the start vertex.
shortest_path(start, end, matrix_graph, dim_x, dim_y) Retrieves the shortest path between 2 vertices in a graph using Dijkstra Algorithm.
Parameters:
start : int, the vertex index to start search.
end : int, the vertex index to end search.
matrix_graph : int array, matrix holding the graph adjacency list and costs/distances.
dim_x : int, x dimension of matrix_graph.
dim_y : int, y dimension of matrix_graph.
Returns: int array, set with vertex indices to the shortest path.
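As an illustration of the greedy selection step behind `min_distance` (a generic sketch, not the library's verbatim code):

```pine
// pick the unvisited vertex with the lowest tentative cost/distance
f_min_distance(float[] distances, bool[] visited) =>
    int best    = -1
    float bestD = na
    int n = array.size(distances)
    if n > 0
        for i = 0 to n - 1
            if not array.get(visited, i)
                float d = array.get(distances, i)
                if na(bestD) or d < bestD
                    bestD := d
                    best  := i
    best   // -1 once every vertex has been visited
```

Dijkstra's main loop repeatedly calls this, marks the returned vertex as visited, and relaxes the distances of its neighbors.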
P-Square - Estimation of the Nth percentile of a series
When working with built-in functions in TradingView, we have to limit our length parameters to a maximum of 4999. If we want to apply a function to the whole available series (bar 0 all the way to the current bar), we usually cannot do so without implementing the calculation manually in our code. For things like the mean or standard deviation, this is quite trivial, but for things like percentiles, it is usually very costly. In more complex scripts, this becomes impossible because of resource restrictions on the Pine Script execution servers.
One solution is to use an estimation algorithm to get close to the true percentile value. Therefore, I have ported this implementation of the P-Square algorithm to Pine Script. P-Square is a fast algorithm that does a good job of estimating percentiles in data streams. Here's the algorithm's original paper.
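To see why an estimator is attractive, here is the brute-force alternative it replaces: keeping every value and re-sorting on each bar, which grows without bound in both memory and time (a hedged sketch for contrast, not part of the P-Square code):

```pine
//@version=5
indicator("Exact percentile, brute force")
var float[] vals = array.new_float()
array.push(vals, close)                        // keep the entire series
sorted = array.copy(vals)
array.sort(sorted, order.ascending)            // re-sort every bar: the costly part
idx = int(0.841 * (array.size(sorted) - 1))    // nearest-rank style index
plot(array.get(sorted, idx), "84.1th percentile (exact)")
```

P-Square avoids all of this by maintaining only five markers that it nudges toward the target quantile as each new observation arrives.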
The chart
On the chart we see:
The returns of the series (blue scatter plot)
The mean of the returns of the series (orange line)
The standard deviation of the returns of the series (yellow line)
The actual 84.1th percentile of the returns (white line)
The estimated 84.1th percentile of the returns using the P-Square algorithm (green line)
Note: We can see that the returns are not normally distributed, as one standard deviation is higher than the 84.1th percentile. One standard deviation above the mean should equal the 84.1th percentile if the data were normally distributed.
Machine Learning: Logistic RegressionMulti-timeframe Strategy based on Logistic Regression algorithm
Description:
This strategy uses a classic machine learning algorithm that came from statistics - Logistic Regression (LR).
The first and most important thing about logistic regression is that it is not a 'Regression' but a 'Classification' algorithm. The name itself is somewhat misleading. Regression gives a continuous numeric output but most of the time we need the output in classes (i.e. categorical, discrete). For example, we want to classify emails into “spam” or 'not spam', classify treatment into “success” or 'failure', classify statement into “right” or 'wrong', classify election data into 'fraudulent vote' or 'non-fraudulent vote', classify market move into 'long' or 'short' and so on. These are the examples of logistic regression having a binary output (also called dichotomous).
You can also think of logistic regression as a special case of linear regression when the outcome variable is categorical, where we are using log of odds as dependent variable. In simple words, it predicts the probability of occurrence of an event by fitting data to a logit function.
Basically, the theory behind Logistic Regression is very similar to the one from Linear Regression, where we seek to draw a best-fitting line over data points, but in Logistic Regression, we don’t directly fit a straight line to our data like in linear regression. Instead, we fit a S shaped curve, called Sigmoid, to our observations, that best SEPARATES data points. Technically speaking, the main goal of building the model is to find the parameters (weights) using gradient descent.
In this script the LR algorithm is retrained on each new bar trying to classify it into one of the two categories. This is done via the logistic_regression function by updating the weights w in the loop that continues for iterations number of times. In the end the weights are passed through the sigmoid function, yielding a prediction.
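In outline, the per-bar retraining is a gradient-descent loop like the generic single-feature sketch below (the script's actual logistic_regression function, feature set, and update rule may differ):

```pine
// a minimal sketch: logistic regression trained by batch gradient descent,
// assuming `xs` (features) and `ys` (0/1 labels) are non-empty arrays of equal size
f_sigmoid(float z) =>
    1.0 / (1.0 + math.exp(-z))

f_train(float[] xs, float[] ys, int iterations, float lr) =>
    float w = 0.0
    float b = 0.0
    int n = array.size(xs)
    for it = 1 to iterations
        float gw = 0.0
        float gb = 0.0
        for i = 0 to n - 1
            float p   = f_sigmoid(w * array.get(xs, i) + b)  // predicted probability
            float err = p - array.get(ys, i)                 // gradient of the log-loss
            gw += err * array.get(xs, i)
            gb += err
        w := w - lr * gw / n
        b := b - lr * gb / n
    [w, b]
```

A new bar is then classified into one of the two categories according to whether f_sigmoid(w * x + b) is above or below 0.5.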
Mind that some assets require modifying the script's input parameters. For instance, when used with BTCUSD and USDJPY, the 'Normalization Lookback' parameter should be set down to 4 (2,...,5..), and optionally the 'Use Price Data for Signal Generation?' parameter should be checked. The defaults were tested with EURUSD.
Note: TradingView's playback feature helps to see this strategy in action.
Warning: Signals ARE repainting.
Style tags: Trend Following, Trend Analysis
Asset class: Equities, Futures, ETFs, Currencies and Commodities
Dataset: FX Minutes/Hours/Days