Machine Learning: Lorentzian Classification
█ OVERVIEW
A Lorentzian Distance Classifier (LDC) is a Machine Learning classification algorithm capable of categorizing historical data from a multi-dimensional feature space. This indicator demonstrates how Lorentzian Classification can also be used to predict the direction of future price movements when used as the distance metric for a novel implementation of an Approximate Nearest Neighbors (ANN) algorithm.
█ BACKGROUND
In physics, Lorentzian space is perhaps best known for its role in describing the curvature of space-time in Einstein's theory of General Relativity (2). Interestingly, however, this abstract concept from theoretical physics also has tangible real-world applications in trading.
Recently, it was hypothesized that Lorentzian space was also well-suited for analyzing time-series data (4), (5). This hypothesis has been supported by several empirical studies that demonstrate that Lorentzian distance is more robust to outliers and noise than the more commonly used Euclidean distance (1), (3), (6). Furthermore, Lorentzian distance was also shown to outperform dozens of other highly regarded distance metrics, including Manhattan distance, Bhattacharyya similarity, and Cosine similarity (1), (3). Outside of Dynamic Time Warping based approaches, which are unfortunately too computationally intensive for PineScript at this time, the Lorentzian Distance metric consistently scores the highest mean accuracy over a wide variety of time series data sets (1).
Euclidean distance is commonly used as the default distance metric for NN-based search algorithms, but it may not always be the best choice when dealing with financial market data. This is because financial market data can be significantly impacted by proximity to major world events such as FOMC Meetings and Black Swan events. This event-based distortion of market data can be framed as similar to the gravitational warping caused by a massive object on the space-time continuum. For financial markets, the analogous continuum that experiences warping can be referred to as "price-time".
Below is a side-by-side comparison of how neighborhoods of similar historical points appear in three-dimensional Euclidean Space and Lorentzian Space:
This figure demonstrates how Lorentzian space can better accommodate the warping of price-time since the Lorentzian distance function compresses the Euclidean neighborhood in such a way that the new neighborhood distribution in Lorentzian space tends to cluster around each of the major feature axes in addition to the origin itself. This means that, even though some nearest neighbors will be the same regardless of the distance metric used, Lorentzian space will also allow for the consideration of historical points that would otherwise never be considered with a Euclidean distance metric.
Intuitively, the advantage inherent in the Lorentzian distance metric makes sense. For example, it is logical that the price action that occurs in the hours after Chairman Powell finishes delivering a speech would resemble at least some of the previous times when he finished delivering a speech. This may be true regardless of other factors, such as whether or not the market was overbought or oversold at the time or if the macro conditions were more bullish or bearish overall. These historical reference points are extremely valuable for predictive models, yet the Euclidean distance metric would miss these neighbors entirely, often in favor of irrelevant data points from the day before the event. By using Lorentzian distance as a metric, the ML model is instead able to consider the warping of price-time caused by the event and, ultimately, transcend the temporal bias imposed on it by the time series.
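To make the metric itself concrete, below is a minimal Pine sketch comparing a Lorentzian distance with a Euclidean distance over a two-feature example. The log(1 + |difference|) form is one common way the Lorentzian metric is written for feature vectors in this context; the function and variable names here are illustrative and are not taken from the indicator's source code.
//@version=5
indicator("Distance metric sketch")
// Hypothetical two-feature distance functions; feature choices are illustrative
lorentzianDist(float f1, float f2, float h1, float h2) =>
    math.log(1 + math.abs(f1 - h1)) + math.log(1 + math.abs(f2 - h2))
euclideanDist(float f1, float f2, float h1, float h2) =>
    math.sqrt(math.pow(f1 - h1, 2) + math.pow(f2 - h2, 2))
// Current-bar features vs. a historical bar's features (e.g., an RSI value and a CCI value)
rsiNow = ta.rsi(close, 14)
cciNow = ta.cci(close, 20)
plot(lorentzianDist(rsiNow, cciNow, rsiNow[50], cciNow[50]), "Lorentzian distance to bar 50 back")
plot(euclideanDist(rsiNow, cciNow, rsiNow[50], cciNow[50]), "Euclidean distance to bar 50 back")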
For more information on the implementation details of the Approximate Nearest Neighbors (ANN) algorithm used in this indicator, please refer to the detailed comments in the source code.
█ HOW TO USE
Below is an explanatory breakdown of the different parts of this indicator as it appears in the interface:
Below is an explanation of the different settings for this indicator:
General Settings:
Source - This has a default value of "hlc3" and is used to control the input data source.
Neighbors Count - This has a default value of 8, a minimum value of 1, a maximum value of 100, and a step of 1. It is used to control the number of neighbors to consider.
Max Bars Back - This has a default value of 2000.
Feature Count - This has a default value of 5, a minimum value of 2, and a maximum value of 5. It controls the number of features to use for ML predictions.
Color Compression - This has a default value of 1, a minimum value of 1, and a maximum value of 10. It is used to control the compression factor for adjusting the intensity of the color scale.
Show Exits - This has a default value of false. It controls whether to show the exit threshold on the chart.
Use Dynamic Exits - This has a default value of false. It is used to control whether to attempt to let profits ride by dynamically adjusting the exit threshold based on kernel regression.
Feature Engineering Settings:
Note: The Feature Engineering section is for fine-tuning the features used for ML predictions. The default values are optimized for the 4H to 12H timeframes for most charts, but they should also work reasonably well for other timeframes. By default, the model can support features that accept two parameters (Parameter A and Parameter B, respectively). Even though there are only 4 features provided by default, the same feature with different settings counts as two separate features. If the feature only accepts one parameter, then the second parameter will default to EMA-based smoothing with a default value of 1. These features represent the most effective combination I have encountered in my testing, but additional features may be added as additional options in the future.
Feature 1 - This has a default value of "RSI" and options are: "RSI", "WT", "CCI", "ADX".
Feature 2 - This has a default value of "WT" and options are: "RSI", "WT", "CCI", "ADX".
Feature 3 - This has a default value of "CCI" and options are: "RSI", "WT", "CCI", "ADX".
Feature 4 - This has a default value of "ADX" and options are: "RSI", "WT", "CCI", "ADX".
Feature 5 - This has a default value of "RSI" and options are: "RSI", "WT", "CCI", "ADX".
Filters Settings:
Use Volatility Filter - This has a default value of true. It is used to control whether to use the volatility filter.
Use Regime Filter - This has a default value of true. It is used to control whether to use the trend detection filter.
Use ADX Filter - This has a default value of false. It is used to control whether to use the ADX filter.
Regime Threshold - This has a default value of -0.1, a minimum value of -10, a maximum value of 10, and a step of 0.1. It is used to control the Regime Detection filter for detecting Trending/Ranging markets.
ADX Threshold - This has a default value of 20, a minimum value of 0, a maximum value of 100, and a step of 1. It is used to control the threshold for detecting Trending/Ranging markets.
Kernel Regression Settings:
Trade with Kernel - This has a default value of true. It is used to control whether to trade with the kernel.
Show Kernel Estimate - This has a default value of true. It is used to control whether to show the kernel estimate.
Lookback Window - This has a default value of 8 and a minimum value of 3. It is used to control the number of bars used for the estimation. Recommended range: 3-50
Relative Weighting - This has a default value of 8 and a step size of 0.25. It is used to control the relative weighting of time frames. Recommended range: 0.25-25
Start Regression at Bar - This has a default value of 25. It is used to control the bar index on which to start regression. Recommended range: 0-25
Display Settings:
Show Bar Colors - This has a default value of true. It is used to control whether to show the bar colors.
Show Bar Prediction Values - This has a default value of true. It controls whether to show the ML model's evaluation of each bar as an integer.
Use ATR Offset - This has a default value of false. It controls whether to use the ATR offset instead of the bar prediction offset.
Bar Prediction Offset - This has a default value of 0 and a minimum value of 0. It is used to control the offset of the bar predictions as a percentage from the bar high or close.
Backtesting Settings:
Show Backtest Results - This has a default value of true. It is used to control whether to display the win rate of the given configuration.
█ WORKS CITED
(1) R. Giusti and G. E. A. P. A. Batista, "An Empirical Comparison of Dissimilarity Measures for Time Series Classification," 2013 Brazilian Conference on Intelligent Systems, Oct. 2013, DOI: 10.1109/bracis.2013.22.
(2) Y. Kerimbekov, H. Ş. Bilge, and H. H. Uğurlu, "The use of Lorentzian distance metric in classification problems," Pattern Recognition Letters, vol. 84, 170–176, Dec. 2016, DOI: 10.1016/j.patrec.2016.09.006.
(3) A. Bagnall, A. Bostrom, J. Large, and J. Lines, "The Great Time Series Classification Bake Off: An Experimental Evaluation of Recently Proposed Algorithms." ResearchGate, Feb. 04, 2016.
(4) H. Ş. Bilge, Yerzhan Kerimbekov, and Hasan Hüseyin Uğurlu, "A new classification method by using Lorentzian distance metric," ResearchGate, Sep. 02, 2015.
(5) Y. Kerimbekov and H. Şakir Bilge, "Lorentzian Distance Classifier for Multiple Features," Proceedings of the 6th International Conference on Pattern Recognition Applications and Methods, 2017, DOI: 10.5220/0006197004930501.
(6) V. Surya Prasath et al., "Effects of Distance Measure Choice on KNN Classifier Performance - A Review."
█ ACKNOWLEDGEMENTS
@veryfid - For many invaluable insights, discussions, and advice that helped to shape this project.
@capissimo - For open sourcing his interesting ideas regarding various KNN implementations in PineScript, several of which helped inspire my original undertaking of this project.
@RikkiTavi - For many invaluable physics-related conversations and for helping me develop a mechanism for visualizing various distance algorithms in 3D using JavaScript.
@jlaurel - For invaluable literature recommendations that helped me to understand the underlying subject matter of this project.
@annutara - For help in beta-testing this indicator and for sharing many helpful ideas and insights early on in its development.
@jasontaylor7 - For helping to beta-test this indicator and for many helpful conversations that helped to shape my backtesting workflow.
@meddymarkusvanhala - For helping to beta-test this indicator
@dlbnext - For incredibly detailed backtesting of this indicator and for sharing numerous ideas on how the user experience could be improved.
Adaptive MA constructor [lastguru]
Adaptive Moving Averages are nothing new; however, most of them use EMA as their MA of choice once the preferred smoothing length is determined. I have decided to make an experiment and separate length generation from smoothing, offering multiple alternatives to be combined. Some of the combinations are widely known, some are not. This indicator is based on my previously published public libraries and also serves as a usage demonstration for them. I will try to expand the collection (suggestions are welcome), but it is not meant as an encyclopaedic resource, so you are encouraged to experiment yourself: by looking at the source code of this indicator, I am sure you will see how trivial it is to use the provided libraries and expand them with your own ideas and combinations. I give no recommendation on what settings to use, but if you find some useful settings, combinations or application ideas (or bugs in my code), I would be happy to read about them in the comments section.
The indicator works in three stages: Prefiltering, Length Adaptation and Moving Averages.
Prefiltering is a fast smoothing to get rid of high-frequency (2, 3 or 4 bar) noise.
Adaptation algorithms are roughly subdivided into two categories: classic Length Adaptations and Cycle Estimators (they are also implemented in separate libraries); all are selected in the Adaptation dropdown. Length Adaptations, used in Adaptive Moving Averages and Adaptive Oscillators, try to follow price movements and accelerate/decelerate accordingly (usually quite rapidly and over a huge range). Cycle Estimators, on the other hand, try to measure the cycle period of the current market, which does not reflect price movement or the rate of change (the rate of change may also differ depending on the cycle phase, but the cycle period itself usually changes slowly).
Chande (Price) - based on Chande's Dynamic Momentum Index (CDMI or DYMOI), which is dynamic RSI with this length
Chande (Volume) - a variant of Chande's algorithm, where volume is used instead of price
VIDYA - based on VIDYA algorithm. The period oscillates from the Lower Bound up (slow)
VIDYA-RS - based on Vitali Apirine's modification of VIDYA algorithm (he calls it Relative Strength Moving Average). The period oscillates from the Upper Bound down (fast)
Kaufman Efficiency Scaling - based on Efficiency Ratio calculation originally used in KAMA
Deviation Scaling - based on DSSS by John F. Ehlers
Median Average - based on Median Average Adaptive Filter by John F. Ehlers
Fractal Adaptation - based on FRAMA by John F. Ehlers
MESA MAMA Alpha - based on MESA Adaptive Moving Average by John F. Ehlers
MESA MAMA Cycle - based on MESA Adaptive Moving Average by John F. Ehlers, but unlike Alpha calculation, this adaptation estimates cycle period
Pearson Autocorrelation* - based on Pearson Autocorrelation Periodogram by John F. Ehlers
DFT Cycle* - based on Discrete Fourier Transform Spectrum estimator by John F. Ehlers
Phase Accumulation* - based on Dominant Cycle from Phase Accumulation by John F. Ehlers
Length Adaptations usually take two parameters: Bound From (lower bound) and To (upper bound). These are the limits for Adaptation values. Note that the Cycle Estimators marked with asterisks (*) are very computationally intensive, so the bounds should not be set much higher than 50, otherwise you may receive a timeout error (also, it does not seem to be a useful thing to do, but you may correct me if I'm wrong).
The Cycle Estimators marked with asterisks(*) also have 3 checkboxes: HP (Highpass Filter), SS (Super Smoother) and HW (Hann Window). These enable or disable their internal prefilters, which are recommended by their author - John F. Ehlers. I do not know, which combination works best, so you can experiment.
Chande's Adaptations also have 3 additional parameters: SD Length (lookback length of Standard deviation), Smooth (smoothing length of Standard deviation) and Power (exponent of the length adaptation - lower is smaller variation). These are internal tweaks for the calculation.
The Moving Averages section offers a choice of Moving Average algorithms. Most of the Adaptations were originally used with EMA, so that is a good starting point for exploration.
SMA - Simple Moving Average
RMA - Running Moving Average
EMA - Exponential Moving Average
HMA - Hull Moving Average
VWMA - Volume Weighted Moving Average
2-pole Super Smoother - 2-pole Super Smoother by John F. Ehlers
3-pole Super Smoother - 3-pole Super Smoother by John F. Ehlers
Filt11 - a variant of 2-pole Super Smoother with error averaging for zero-lag response by John F. Ehlers
Triangle Window - Triangle Window Filter by John F. Ehlers
Hamming Window - Hamming Window Filter by John F. Ehlers
Hann Window - Hann Window Filter by John F. Ehlers
Lowpass - removes cyclic components shorter than length (Price - Highpass)
DSSS - Deviation Scaled Super Smoother by John F. Ehlers
There are two Moving Averages drawn on the chart, so a length needs to be selected for each. If no Adaptation is selected (the None option), you can set Fast Length and Slow Length directly. If an Adaptation is selected, a Cycle multiplier can be set for the Fast and Slow MA. A minimal sketch of how an adaptation can feed into a smoothing algorithm follows below.
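As an illustration of how a Length Adaptation can be combined with a smoothing algorithm, here is a minimal Pine sketch that maps the Kaufman Efficiency Ratio onto a length range and feeds the result into an EMA-style filter. The variable names and the exact mapping are illustrative and are not taken from the published libraries.
//@version=5
indicator("Adaptive length sketch", overlay=true)
lower = input.int(10, "Bound From")
upper = input.int(50, "Bound To")
n = input.int(20, "ER lookback")
// Kaufman Efficiency Ratio: net price movement divided by the sum of bar-to-bar movement
signalMove = math.abs(close - close[n])
noiseMove = math.sum(math.abs(nz(ta.change(close))), n)
er = nz(signalMove / noiseMove)
// Efficient (trending) markets get a short length, noisy markets a long one
len = upper - er * (upper - lower)
alpha = 2.0 / (len + 1.0)
var float ama = na
ama := na(ama) ? close : ama + alpha * (close - ama)
plot(ama, "Adaptive MA")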
More information on the algorithms is given in the code for the libraries used. I am also very grateful to other TradingView community members (they are also mentioned in the library code) without whom this script would not have been possible.
Exotic SMA Explorations Treasure Trove
This is my "Exotic SMA Explorations Treasure Trove," intended for educational purposes, yet these functions will also have utility in special applications with other algorithms. Firstly, the Pine built-in sma() is exceedingly more efficient computationally on TV servers than these functions will be. I just wanted to make that crystal clear; my notes in the code elaborate on this blatantly.
Anyhow, the simple moving average (SMA) is one of the most common averaging filters used in a wide variety of algorithms. "Simply put," its name says a lot about it. The purpose of this script is to demonstrate variations of its calculation in a multitude of exotic forms. In certain scenarios our algorithms may require a specific mathemagical touch that is pertinent to our intended goals. Like screwdrivers, we often need different types depending on the objective we are trying to attain. The SMA also serves as the most basic of finite impulse response (FIR) algorithms. For example, things like weighted moving averages can be constructed from the foundational code of the SMA.
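As a sketch of what "foundational code of the SMA" means in practice, the loop below computes a simple average, and the weighted variant reuses the same skeleton with per-bar weights. It is written from scratch for illustration; the built-in ta.sma() and ta.wma() remain far more efficient.
//@version=5
indicator("SMA skeleton sketch", overlay=true)
f_sma(src, int len) =>
    float total = 0.0
    for i = 0 to len - 1
        total += src[i]
    total / len
f_wma(src, int len) =>
    float total = 0.0
    float wsum = 0.0
    for i = 0 to len - 1
        w = len - i          // most recent bar gets the largest weight
        total += src[i] * w
        wsum += w
    total / wsum
plot(f_sma(close, 20), "Loop SMA")
plot(f_wma(close, 20), "Loop WMA")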
One other intended demonstration of this script is running multiple functions for comparison. I have had to use this from time to time for my own performance comparisons. Also, embedded in this code is a method to generically (and, in this case, recklessly) adapt an algorithm. I will warn you: RSI was NEVER intended to adapt an algorithm. It only serves as a crude way to display the versatility of these different algorithms, whether it be a benefit or hindrance concerning dynamic adaptability.
Lastly, this script shows the versatility of TV's new input(group=) and input(inline=) additions in action. The "Immense Power of Pine" is always evolving and will continue to do so, I assure you of that. We can now categorize our input()s without using the input(type=input.bool) hackTrick. Although, that still has its enduring versatility, at least for myself.
NOTICE: You have absolute freedom to use this source code any way you see fit within your new Pine projects. You don't have to ask for my permission to reuse these functions in your published scripts, simply because I have better things to do than answer requests for the reuse of these functions. Sufficient accreditation regarding this script and compliance with "TV's House Rules" regarding code reuse, is as easy as copying the functions in their entirety as is. Fair enough? Good!
When available time provides itself, I will consider your inquiries, thoughts, and concepts presented below in the comments section, should you have any questions or comments regarding this indicator. When my indicators achieve more prevalent use by TV members, I may implement more ideas when they present themselves as worthy additions. Have a profitable future everyone!
Machine Learning Gaussian Mixture Model | AlphaNatt
A revolutionary oscillator that uses Gaussian Mixture Models (GMM) with unsupervised machine learning to identify market regimes and automatically adapt momentum calculations - bringing statistical pattern recognition techniques to trading.
"Markets don't follow a single distribution - they're a mixture of different regimes. This oscillator identifies which regime we're in and adapts accordingly."
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🤖 THE MACHINE LEARNING
Gaussian Mixture Models (GMM):
Unlike K-means clustering, which assigns hard boundaries, GMM uses probabilistic clustering:
Models data as coming from multiple Gaussian distributions
Each market regime is a different Gaussian component
Provides probability of belonging to each regime
More sophisticated than simple clustering
Expectation-Maximization Algorithm:
The indicator continuously learns and adapts using the E-M algorithm (a simplified sketch follows this list):
E-step: Calculate probability of current market belonging to each regime
M-step: Update regime parameters based on new data
Continuous learning without repainting
Adapts to changing market conditions
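To make the E-M loop concrete, below is a heavily simplified, hypothetical Pine sketch of an online E-M update for a one-dimensional, three-component mixture over bar returns. The variable names, learning-rate update rules, and plotted output are illustrative and are not taken from the indicator's source.
//@version=5
indicator("GMM E-M sketch")
x = nz(ta.roc(close, 1))                          // observation: one-bar return
var float[] mu = array.from(-1.0, 0.0, 1.0)       // component means
var float[] v  = array.from(1.0, 1.0, 1.0)        // component variances
var float[] w  = array.from(0.333, 0.333, 0.334)  // mixture weights
lr = 0.3                                          // learning rate
// E-step: responsibility of each component for the current observation
float[] r = array.new_float(3, 0.0)
float total = 0.0
for i = 0 to 2
    m = array.get(mu, i)
    s = math.sqrt(math.max(array.get(v, i), 0.0001))
    pdf = math.exp(-math.pow(x - m, 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
    array.set(r, i, array.get(w, i) * pdf)
    total += array.get(r, i)
for i = 0 to 2
    array.set(r, i, total > 0 ? array.get(r, i) / total : 1.0 / 3)
// M-step (online form): nudge each component toward the observation, weighted by its responsibility
for i = 0 to 2
    ri = array.get(r, i)
    mNew = array.get(mu, i) + lr * ri * (x - array.get(mu, i))
    array.set(mu, i, mNew)
    array.set(v, i, math.max(array.get(v, i) + lr * ri * (math.pow(x - mNew, 2) - array.get(v, i)), 0.0001))
    array.set(w, i, array.get(w, i) + lr * (ri - array.get(w, i)))
plot(array.get(r, 2) * 100, "P(high-volatility regime) %")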
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🎯 THREE MARKET REGIMES
The GMM identifies three distinct market states:
Regime 1 - Low Volatility:
Quiet, ranging markets
Uses RSI-based momentum calculation
Reduces false signals in choppy conditions
Background: Pink tint
Regime 2 - Normal Market:
Standard trending conditions
Uses Rate of Change momentum
Balanced sensitivity
Background: Gray tint
Regime 3 - High Volatility:
Strong trends or volatility events
Uses Z-score based momentum
Captures extreme moves
Background: Cyan tint
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
💡 KEY INNOVATIONS
1. Probabilistic Regime Detection:
Instead of binary regime assignment, provides probabilities:
30% Regime 1, 60% Regime 2, 10% Regime 3
Smooth transitions between regimes
No sudden indicator jumps
2. Weighted Momentum Calculation (a sketch follows this list):
Combines three different momentum formulas
Weights based on regime probabilities
Automatically adapts to market conditions
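As a sketch of the weighting idea, the snippet below blends three momentum measures by regime probabilities. The probabilities and momentum formulas are illustrative placeholders, not the indicator's internals; in the described design the probabilities would come from the GMM E-step.
//@version=5
indicator("Regime-weighted momentum sketch")
p1 = 0.30   // illustrative probability of the low-volatility regime
p2 = 0.60   // illustrative probability of the normal regime
p3 = 0.10   // illustrative probability of the high-volatility regime
moLow  = ta.rsi(close, 14) - 50                                                        // RSI-centered momentum
moNorm = ta.roc(close, 10)                                                             // rate-of-change momentum
moHigh = (close - ta.sma(close, 20)) / math.max(ta.stdev(close, 20), syminfo.mintick)  // z-score momentum
blended = p1 * moLow + p2 * moNorm + p3 * moHigh
plot(blended, "Blended momentum")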
3. Confidence Indicator:
Shows how certain the model is (white line)
High confidence = strong regime identification
Low confidence = transitional market state
Line transparency changes with confidence
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⚙️ PARAMETER OPTIMIZATION
Training Period (50-500):
50-100: Quick adaptation to recent conditions
100: Balanced (default)
200-500: Stable regime identification
Number of Components (2-5):
2: Simple bull/bear regimes
3: Low/Normal/High volatility (default)
4-5: More granular regime detection
Learning Rate (0.1-1.0):
0.1-0.3: Slow, stable learning
0.3: Balanced (default)
0.5-1.0: Fast adaptation
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📊 TRADING STRATEGIES
Visual Signals:
Cyan gradient: Bullish momentum
Magenta gradient: Bearish momentum
Background color: Current regime
Confidence line: Model certainty
1. Regime-Based Trading:
Regime 1 (pink): Expect mean reversion
Regime 2 (gray): Standard trend following
Regime 3 (cyan): Strong momentum trades
2. Confidence-Filtered Signals:
Only trade when confidence > 70%
High confidence = clearer market state
Avoid transitions (low confidence)
3. Adaptive Position Sizing:
Regime 1: Smaller positions (choppy)
Regime 2: Normal positions
Regime 3: Larger positions (trending)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🚀 ADVANTAGES OVER OTHER ML INDICATORS
vs K-Means Clustering:
Soft clustering (probabilities) vs hard boundaries
Captures uncertainty and transitions
More mathematically robust
vs KNN (K-Nearest Neighbors):
Unsupervised learning (no historical labels needed)
Continuous adaptation
Lower computational complexity
vs Neural Networks:
Interpretable (know what each regime means)
No overfitting issues
Works with limited data
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📈 PERFORMANCE CHARACTERISTICS
Best Market Conditions:
Markets with clear regime shifts
Volatile to trending transitions
Multi-timeframe analysis
Cryptocurrency markets (high regime variation)
Key Strengths:
Automatically adapts to market changes
No manual parameter adjustment needed
Smooth transitions between regimes
Probabilistic confidence measure
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🔬 TECHNICAL BACKGROUND
Gaussian Mixture Models are used extensively in:
Speech recognition (Google Assistant)
Computer vision (facial recognition)
Astronomy (galaxy classification)
Genomics (gene expression analysis)
Finance (risk modeling at investment banks)
The E-M algorithm was formalized in 1977 and is one of the most important algorithms in unsupervised machine learning.
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
💡 PRO TIPS
Watch regime transitions: Best opportunities often occur when regimes change
Combine with volume: High volume + regime change = strong signal
Use confidence filter: Avoid low confidence periods
Multi-timeframe: Compare regimes across timeframes
Adjust position size: Scale based on identified regime
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⚠️ IMPORTANT NOTES
Machine learning adapts but doesn't predict the future
Best used with other confirmation indicators
Allow time for model to learn (100+ bars)
Not financial advice - educational purposes
Backtest thoroughly on your instruments
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
🏆 CONCLUSION
The GMM Momentum Oscillator brings institutional-grade machine learning to retail trading. By identifying market regimes probabilistically and adapting momentum calculations accordingly, it provides:
Automatic adaptation to market conditions
Clear regime identification with confidence levels
Smooth, professional signal generation
True unsupervised machine learning
This isn't just another indicator with "ML" in the name - it's a genuine implementation of Gaussian Mixture Models with the Expectation-Maximization algorithm, the same technology used in:
Google's speech recognition
Tesla's computer vision
NASA's data analysis
Wall Street risk models
"Let the machine learn the market regimes. Trade with statistical confidence."
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Developed by AlphaNatt | Machine Learning Trading Systems
Version: 1.0
Algorithm: Gaussian Mixture Model with E-M
Classification: Unsupervised Learning Oscillator
Not financial advice. Always DYOR.
SMC - Institutional Confidence Oscillator [PhenLabs]
📊 Institutional Confidence Oscillator
Version: PineScript™v6
📌 Description
The Institutional Confidence Oscillator (ICO) revolutionizes market analysis by automatically detecting and evaluating institutional activity at key support and resistance levels using our own in-house detection system. This sophisticated indicator combines volume analysis, volatility measurements, and mathematical confidence algorithms to provide real-time readings of institutional sentiment and zone strength.
Using our advanced thin liquidity detection, the ICO identifies high-volume, narrow-range bars that signal institutional zone formation, then tracks how these zones perform under market pressure. The result is a dual-wave confidence oscillator that shows traders when institutions are actively defending price levels versus when they’re abandoning positions.
The indicator transforms complex institutional behavior patterns into clear, actionable confidence percentiles, helping traders align with smart money movements and avoid common retail trading pitfalls.
🚀 Points of Innovation
Automated thin liquidity zone detection using volume threshold multipliers and zone size filtering
Dual-sided confidence tracking for both support and resistance levels simultaneously
Sigmoid function processing for enhanced mathematical accuracy in confidence calculations
Real-time institutional defense pattern analysis through complete test cycles
Advanced visual smoothing options with multiple algorithmic methods (EMA, SMA, WMA, ALMA)
Integrated momentum indicators and gradient visualization for enhanced signal clarity
🔧 Core Components
Volume Threshold System: Analyzes volume ratios against baseline averages to identify institutional activity spikes
Zone Detection Algorithm: Automatically identifies thin liquidity zones based on customizable volume and size parameters
Confidence Lifecycle Engine: Tracks institutional defense patterns through complete observation windows
Mathematical Processing Core: Uses sigmoid functions to convert raw market data into normalized confidence percentiles
Visual Enhancement Suite: Provides multiple smoothing methods and customizable display options for optimal chart interpretation
🔥 Key Features
Auto-Detection Technology: Automatically scans for institutional zones without manual intervention, saving analysis time
Dual Confidence Tracking: Simultaneously monitors both support and resistance institutional activity for comprehensive market view
Smart Zone Validation: Evaluates zone strength through volume analysis, adverse excursion measurement, and defense success rates
Customizable Parameters: Extensive input options for volume thresholds, observation windows, and visual preferences
Real-Time Updates: Continuously processes market data to provide current institutional confidence readings
Enhanced Visualization: Features gradient fills, momentum indicators, and information panels for clear signal interpretation
🎨 Visualization
Dual Oscillator Lines: Support confidence (cyan) and resistance confidence (red) plotted as percentage values 0-100%
Gradient Fill Areas: Color-coded regions showing confidence dominance and strength levels
Reference Grid Lines: Horizontal markers at 25%, 50%, and 75% levels for easy interpretation
Information Panel: Real-time display of current confidence percentiles with color-coded dominance indicators
Momentum Indicators: Rate of change visualization for confidence trends
Background Highlights: Extreme confidence level alerts when readings exceed 80%
📖 Usage Guidelines
Auto-Detection Settings
Use Auto-Detection
Default: true
Description: Enables automatic thin liquidity zone identification based on volume and size criteria
Volume Threshold Multiplier
Default: 6.0, Range: 1.0+
Description: Controls sensitivity of volume spike detection for zone identification, higher values require more significant volume increases
Volume MA Length
Default: 15, Range: 1+
Description: Period for volume moving average baseline calculation, affects volume spike sensitivity
Max Zone Height %
Default: 0.5%, Range: 0.05%+
Description: Filters out wide price bars, keeping only thin liquidity zones as percentage of current price
Confidence Logic Settings
Test Observation Window
Default: 20 bars, Range: 2+
Description: Number of bars to monitor zone tests for confidence calculation, longer windows provide more stable readings
Clean Break Threshold
Default: 1.5 ATR, Range: 0.1+
Description: ATR multiple required for zone invalidation, higher values make zones more persistent
Visual Settings
Smoothing Method
Default: EMA, Options: SMA/EMA/WMA/ALMA
Description: Algorithm for signal smoothing, EMA responds faster while SMA provides more stability
Smoothing Length
Default: 5, Range: 1-50
Description: Period for smoothing calculation, higher values create smoother lines with more lag
✅ Best Use Cases
Trending market analysis where institutional zones provide reliable support/resistance levels
Breakout confirmation by validating zone strength before position entry
Divergence analysis when confidence shifts between support and resistance levels
Risk management through identification of high-confidence institutional backing
Market structure analysis for understanding institutional sentiment changes
⚠️ Limitations
Performs best in liquid markets with clear institutional participation
May produce false signals during low-volume or holiday trading periods
Requires sufficient price history for accurate confidence calculations
Confidence readings can fluctuate rapidly during high-impact news events
Manual fallback zones may not reflect actual institutional activity
💡 What Makes This Unique
Automated Detection: First Pine Script indicator to automatically identify thin liquidity zones using sophisticated volume analysis
Dual-Sided Analysis: Simultaneously tracks institutional confidence for both support and resistance levels
Mathematical Precision: Uses sigmoid functions for enhanced accuracy in confidence percentage calculations
Real-Time Processing: Continuously evaluates institutional defense patterns as market conditions change
Visual Innovation: Advanced smoothing options and gradient visualization for superior chart clarity
🔬 How It Works
1. Zone Identification Process:
Scans for high-volume bars that exceed the volume threshold multiplier
Filters bars by maximum zone height percentage to identify thin liquidity conditions
Stores qualified zones with proximity threshold filtering for relevance
2. Confidence Calculation Process:
Monitors price interaction with identified zones during observation windows
Measures volume ratios and adverse excursions during zone tests
Applies sigmoid function processing to normalize raw data into confidence percentiles (a minimal sketch follows the process list below)
3. Real-Time Analysis Process:
Continuously updates confidence readings as new market data becomes available
Tracks institutional defense success rates and zone validation patterns
Provides visual and numerical feedback through the oscillator display
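As a minimal sketch of the sigmoid step described above: a raw zone-defense score is squashed into a 0-100% confidence value. The raw-score ingredients and variable names below are illustrative and are not the script's actual internals.
//@version=5
indicator("Sigmoid confidence sketch")
volRatio = volume / ta.sma(volume, 15)             // test-bar volume vs. its baseline average
adverseAtr = math.abs(close - open) / ta.atr(14)   // adverse excursion expressed in ATRs
raw = volRatio - adverseAtr                        // higher raw score = stronger zone defense
confidence = 100.0 / (1.0 + math.exp(-raw))        // sigmoid maps the raw score into a 0-100% percentile
plot(confidence, "Confidence %")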
💡 Note:
The ICO works best when combined with traditional technical analysis and proper risk management. Higher confidence readings indicate stronger institutional backing but should be confirmed with price action and volume analysis. Consider using multiple timeframes for comprehensive market structure understanding.
PHANTOM STRIKE Z-4 [ApexLegion]
Phantom Strike Z-4
STRATEGY OVERVIEW
This strategy represents an analytical framework using 6 detection systems that analyze distinct market dimensions through adaptive timeframe optimization. Each system targets specific market inefficiencies - automated parameter adjustment, market condition filtering, phantom strike pattern detection, SR exit management, order block identification, and volatility-aware risk management - with results processed through a multi-component scoring calculation that determines signal generation and position management decisions.
SYSTEM ARCHITECTURE PHILOSOPHY
Phantom Strike Z-4 operates through 12 distinct parameter groups encompassing individual settings that allow detailed customization for different trading environments. The strategy employs modular design principles where each analytical component functions independently while contributing to unified decision-making protocols. This architecture enables traders to engage with structured market analysis through intuitive configuration options while the underlying algorithms handle complex computational processes.
The framework approaches certain aspects differently from static trading approaches by implementing real-time parameter adjustment based on timeframe characteristics, market volatility conditions, news event detection, and weekend gap analysis. During low-volatility periods where traditional strategies struggle to generate meaningful returns, Z-4's adaptive systems identify micro-opportunities through formation analysis and systematic patience protocols.
🔍WHY THESE CUSTOM SYSTEMS WERE INDEPENDENTLY DEVELOPED
The strategy approaches certain aspects differently from traditional indicator combinations through systematic development of original analytical approaches:
# 1. Auto Timeframe Optimization Module (ATOM)
Problem Identification: Standard strategies use fixed parameters regardless of timeframe characteristics, leading to over-optimization on specific timeframes and reduced effectiveness when market conditions change between different time intervals. Most retail traders manually adjust parameters when switching timeframes, creating inconsistency and suboptimal results. Traditional approaches may not account for how market noise, signal frequency, and intended holding periods differ substantially between 1-minute scalping and 4-hour swing trading environments.
Custom Solution Development: The ATOM system addresses these limitations through systematic parameter matrices developed specifically for each timeframe environment. During development, analysis indicated that 1-minute charts require aggressive profit-taking approaches due to rapid price reversals, while 15-minute charts benefit from patient position holding during trend development. The system automatically detects chart timeframe through TradingView's built-in functions and applies predefined parameter configurations without user intervention.
Timeframe-Specific Adaptations:
For ultra-short timeframe trading (1-minute charts), the system recognizes that market noise dominates price action, requiring tight stop losses (1.0%) and rapid profit realization (25% at TP1, 35% at TP2, 40% at TP3). Position sizes automatically reduce to 3% of equity to accommodate the higher trading frequency, while a 20-bar mission-duration limit prevents extended exposure during unsuitable conditions.
Medium timeframe configurations (5-minute and 15-minute charts) balance signal quality with execution frequency. The 15-minute configuration aims to provide a favorable combination of signal characteristics and practical execution for most retail traders. Formation thresholds increase to 2.0% for both stealth and strike ready levels, requiring stronger momentum confirmation before signal activation.
Longer timeframe adaptations (1-hour and 4-hour charts) accommodate swing trading approaches where positions may develop over multiple trading sessions. Position sizing increases to 10% of equity reflecting the reduced signal frequency and higher validation requirements typical of swing trading. Take profit targets extend considerably (TP1: 2.0%, TP2: 4.0%, TP3: 8.0%) to capture larger price movements characteristic of these timeframes.
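Below is a minimal sketch of the timeframe-detection idea: the chart timeframe is converted to minutes and used to pick parameters from a small lookup. The specific thresholds and any values not quoted in the text above are illustrative assumptions, not the strategy's actual matrices, and higher timeframes are only approximated.
//@version=5
indicator("Timeframe parameter sketch", overlay=true)
// Convert the chart timeframe to minutes (weekly/monthly only approximated for this sketch)
tfMin = timeframe.isintraday ? timeframe.multiplier : timeframe.isdaily ? timeframe.multiplier * 1440 : timeframe.multiplier * 10080
slPct  = tfMin <= 1 ? 1.0 : tfMin <= 15 ? 1.5 : tfMin <= 60 ? 2.0 : 3.0   // tighter stops on faster charts
posPct = tfMin <= 1 ? 3.0 : tfMin <= 15 ? 5.0 : 10.0                      // larger sizing on slower charts
tp1Pct = tfMin <= 15 ? 1.0 : tfMin <= 60 ? 1.5 : 2.0                      // first take-profit distance
plot(slPct, "SL %")
plot(posPct, "Position size %")
plot(tp1Pct, "TP1 %")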
# 2. Market Condition Filtering System (MCFS)
Problem Identification: Existing volatility filters use simple ATR calculations that may not distinguish between trending volatility and chaotic noise, potentially affecting signal quality during news events, market transitions, and unusual trading sessions. Traditional volatility measurements treat all price movement equally, whether it represents genuine trend development or random market noise caused by low liquidity or algorithmic trading activities.
Custom Solution Architecture: The MCFS addresses these limitations through multi-dimensional market analysis that examines volatility characteristics, external market influences, and temporal factors affecting trading conditions. Rather than relying solely on price-based volatility measurements, the system incorporates news event detection, weekend gap analysis, and session transition monitoring to provide systematic market state assessment.
Volatility Classification and Response Framework (a classification sketch follows this list):
• EXTREME Volatility Conditions (>2.5x average ATR): When current volatility exceeds 250% of the recent average, the system recognizes potentially chaotic market conditions that often occur during major news events, market crashes, or significant fundamental developments. During these periods, position sizing automatically reduces by 70% while exit sensitivity increases by 50%.
• HIGH Volatility Conditions (1.8-2.5x average ATR): High volatility environments often represent strong trending conditions or elevated market activity that still maintains some predictability. Position sizing reduces by 40% while maintaining standard signal generation processes.
• NORMAL Volatility Conditions (1.2-1.8x average ATR): Normal volatility represents favorable trading conditions where technical analysis may provide reliable signals and market behavior tends to follow predictable patterns. All strategy parameters operate at standard settings.
• LOW Volatility Conditions (0.8-1.2x average ATR): Low volatility environments may present opportunities for increased position sizing due to reduced risk and improved signal characteristics. Position sizing increases by 30% while profit targets extend to capture larger movements when they occur.
• DEAD Volatility Conditions (<0.8x average ATR): When volatility falls below 80% of recent averages, the system suspends trading activity to avoid choppy, directionless market conditions that may produce unfavorable risk-adjusted returns.
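As a sketch of the classification logic, the snippet below maps the ratio of current ATR to its recent average onto the five regimes and a position-size multiplier. The ATR lengths and the averaging window are illustrative assumptions; the threshold values and sizing adjustments follow the list above.
//@version=5
indicator("Volatility regime sketch")
atrNow = ta.atr(14)
atrAvg = ta.sma(atrNow, 50)                        // recent-average volatility baseline
ratio = atrAvg > 0 ? atrNow / atrAvg : 1.0
regime = ratio > 2.5 ? "EXTREME" : ratio > 1.8 ? "HIGH" : ratio > 1.2 ? "NORMAL" : ratio > 0.8 ? "LOW" : "DEAD"
// Position-size multiplier per regime: -70%, -40%, standard, +30%, and no trading respectively
sizeMult = regime == "EXTREME" ? 0.3 : regime == "HIGH" ? 0.6 : regime == "NORMAL" ? 1.0 : regime == "LOW" ? 1.3 : 0.0
plot(ratio, "ATR ratio")
plot(sizeMult, "Size multiplier")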
# 3. Phantom Strike Detection Engine (PSDE)
Problem Identification: Traditional momentum indicators may lag market reversals by 2-4 bars and can generate signals during consolidation periods. Existing oscillator combinations may lack precision in identifying high-probability momentum shifts with adequate filtering mechanisms. Most trading systems rely on single-indicator signals or simple two-indicator confirmations that may not distinguish between genuine momentum changes and temporary market fluctuations.
Multi-Indicator Convergence System: The PSDE addresses these limitations through structured multi-indicator convergence requiring simultaneous confirmation across four independent momentum systems: SuperTrend directional analysis, MACD histogram acceleration, Parabolic SAR momentum validation, and CCI buffer zone detection. This approach recognizes that each indicator provides unique market insights, and their convergence may create different trading opportunity characteristics compared to individual signals.
Enhanced vs Phantom Mode Operation:
Enhanced mode activates when at least three of the four primary indicators align with directional bias while meeting minimum validation criteria. Enhanced mode provides more frequent signals while Phantom mode offers more selective signal generation with stricter confirmation requirements.
Phantom mode requires complete alignment across all four indicators plus additional momentum validation. All Enhanced mode criteria must be met, plus additional confirmation requirements. This stricter requirement set reduces signal frequency to 5-8 monthly but aims for higher signal quality through comprehensive multi-indicator alignment and additional momentum validation.
# 4. Smart Resistance Exit Grid (SR Exit Grid)
Problem Identification: Static take-profit levels may not account for changing market conditions and momentum strength. Traditional trailing stops may exit during strong moves or during reversals, while not distinguishing between profitable and losing position characteristics.
Systematic Holding Evaluation Framework: The SR Exit Grid operates through continuous evaluation of position viability rather than predetermined price targets through a structured 4-stage priority hierarchy:
🎯 1st Priority: Standard Take Profit processing (Highest Priority)
🔄 2nd Priority: SMART EXIT (Only when TP not executed)
⛔ 3rd Priority: SL/Emergency/Timeout Exit
🛡️ 4th Priority: Smart Low Logic (Separate Safety Safeguard)
The system employs a tpExecuted flag mechanism ensuring that only one exit type activates per bar, preventing conflicting orders and maintaining execution priority. Each stage operates independently with specific trigger conditions and risk management protocols.
Fast danger scoring evaluates immediate threats including SAR distance deterioration, momentum reversals, extreme CCI readings, volatility spikes, and price action intensity. When combined scores exceed specified thresholds (8.0+ danger with <2.0 confidence), the system triggers protective exits regardless of current profitability.
# 5. Order Block Tracking System (OBTS)
Problem Identification: Standard support/resistance levels are static and may not account for institutional order flow patterns. Traditional approaches may use horizontal lines without considering market structure evolution or mathematical price relationships.
Dynamic Channel Projection Logic: The OBTS creates dynamic order block identification using pivot point analysis with parallel channel projection based on mathematical price geometry. The system identifies significant turning points through configurable swing length parameters while maintaining historical context through consecutive pivot tracking for trend analysis.
Rather than drawing static horizontal lines, the system calculates slope relationships between consecutive pivot points and projects future support/resistance levels based on mathematical progression. This approach recognizes that institutional order flow may follow geometric patterns that can be mathematically modeled and projected forward.
# 6. Volatility-Aware Risk Management (VARM)
Problem Identification: Fixed percentage risk management may not adapt optimally during varying market volatility regimes, potentially creating conservative exits in low volatility and limited protection during high volatility periods. Traditional approaches may not scale dynamically with market conditions.
Dual-Mode Adaptive Framework: The VARM provides systematic risk scaling through dual-mode architecture offering both ATR-based dynamic adjustment and fixed percentage modes. Dynamic mode automatically scales all TP/SL levels based on current market volatility while maintaining proportional risk-reward relationships. Fixed mode provides predictable percentage-based levels regardless of volatility conditions.
Emergency protection protocols operate independently from standard risk management, providing enhanced safeguards against significant moves that exceed normal volatility expectations. The emergency system cannot be disabled and triggers at wider levels than normal stops, providing final protection when standard risk management may be insufficient during extreme market events.
## Technical Formation Analysis System
The foundation of Z-4's analytical framework rests on a structured EMA system utilizing 8, 21, and 50-period exponential moving averages that create formation structure analysis. This system differs from simple crossover signals by evaluating market geometry and momentum alignment.
Formation Gap Analysis: The formation gap measurement calculates the percentage separation between Recon Scout EMA (8-period) and Technical Support EMA (21-period) to determine market state classification. When gap percentage falls below the Stealth Mode Threshold (default 1.5%), the market enters consolidation phase requiring enhanced patience. When gap exceeds Strike Ready Threshold (1.5%), conditions become favorable for momentum-based entries.
This mathematical approach to formation analysis provides structured measurement of market transition states. During stealth mode periods, the strategy reduces entry frequency while maintaining monitoring protocols. Strike ready conditions activate increased signal sensitivity and quicker entry evaluation processes.
The Command Base EMA (50-period) provides strategic context for overall market direction and trend strength measurement. Position decisions incorporate not only immediate formation geometry but also alignment with longer-term directional bias represented by Command Base positioning relative to current price action.
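Below is a minimal sketch of the formation-gap measurement and state classification described above. The variable names are illustrative; the EMA periods and thresholds follow the text.
//@version=5
indicator("Formation gap sketch", overlay=true)
recon   = ta.ema(close, 8)     // Recon Scout EMA
support = ta.ema(close, 21)    // Technical Support EMA
base    = ta.ema(close, 50)    // Command Base EMA (strategic context)
gapPct = math.abs(recon - support) / support * 100
stealthMode = gapPct < 1.5     // consolidation: reduce entry frequency, keep monitoring
strikeReady = gapPct > 1.5     // separation sufficient for momentum-based entries
bullContext = close > base     // longer-term directional bias
plot(recon, "Recon Scout EMA")
plot(support, "Technical Support EMA")
plot(base, "Command Base EMA")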
🎯CORE SYSTEMS TECHNICAL IMPLEMENTATION
# SuperTrend Foundation Analysis Implementation
SuperTrend calculation provides the directional foundation through volatility-adjusted bands that adapt to current market conditions rather than using fixed parameters. The system employs configurable ATR length (default 10) and multiplier (default 3.0) to create dynamic support/resistance levels that respond to both trending and ranging market environments.
Volatility-Adjusted Band Calculation:
st_atr = ta.atr(stal)                    // ATR over the configured SuperTrend ATR length
st_hl2 = (high + low) / 2                // HL2 price reference
st_ub = st_hl2 + stm * st_atr            // upper volatility band
st_lb = st_hl2 - stm * st_atr            // lower volatility band
stb = close > st and ta.rising(st, 3)    // bullish: price above the SuperTrend value (st, computed elsewhere) and st rising for 3 bars
The HL2 methodology (high+low)/2 aims to provide stable price reference compared to closing prices alone, reducing sensitivity to intraday price spikes that can distort traditional SuperTrend calculations. ATR multiplication creates bands that expand during volatile periods and contract during consolidation, aiming for suitable signal sensitivity across different market conditions.
Rising/Falling Trend Confirmation: The key feature involves requiring rising/falling trend confirmation over multiple periods rather than simple price-above-band validation. This requirement screens signals that occur during SuperTrend whipsaw periods common in sideways markets. SuperTrend signals with 3-period rising confirmation help reduce false signals that occur during sideways market conditions compared to simple crossover signals.
Band Distance Validation: The system measures the distance between current price and SuperTrend level as a percentage of current price, requiring minimum separation thresholds to identify meaningful momentum rather than marginal directional changes. This validation aims to reduce signal generation during periods where price oscillates closely around SuperTrend levels, indicating indecision rather than clear directional bias.
# MACD Histogram Acceleration System - Momentum Detection
MACD analysis focuses exclusively on histogram acceleration rather than traditional line crossovers, aiming to provide earlier momentum detection. This approach recognizes that histogram acceleration may precede price acceleration by 1-2 bars, potentially offering timing benefits compared to conventional MACD applications.
Acceleration-Based Signal Generation:
mf = ta.ema(close, mfl)        // fast EMA
ms = ta.ema(close, msl)        // slow EMA
ml = mf - ms                   // MACD line
msg = ta.ema(ml, msgl)         // signal line
mh = ml - msg                  // histogram
mb = mh > 0 and mh > mh[1] and mh[1] > mh[2]   // bullish: positive histogram rising over two consecutive bars
The requirement for positive histogram values that increase over two consecutive periods aims to identify genuine momentum expansion rather than temporary fluctuations. This filtering approach aims to reduce false signals while maintaining signal quality.
Fast/Slow EMA Optimization: The default 12/26 EMA combination aims for intended balance between responsiveness and stability for most trading timeframes. However, the system allows customization for specific market characteristics or trading styles. Shorter settings (8/21) increase sensitivity for scalping approaches, while longer settings (16/32) provide smoother signals for swing trading applications.
Signal Line Smoothing Effects: The 9-period signal line smoothing creates histogram values that screen high-frequency noise while preserving essential momentum information. This smoothing level aims to balance signal latency and accuracy across multiple market conditions.
# Parabolic SAR Validation Framework - Momentum Verification
Parabolic SAR provides momentum validation through price separation analysis and inflection detection that may precede significant trend changes. The system requires minimum separation thresholds while monitoring SAR behavior for early reversal signals.
Separation-Based Validation:
sar = ta.sar(ss, si, sm)                                         // Parabolic SAR (start, increment, maximum)
sarb = close > sar and (close - sar) / close > 0.005             // bullish: at least 0.5% separation above the SAR
sardp = math.abs(close - sar) / close * 100                      // SAR distance as a percentage of price
sariu = sarm > 0 and sarm[1] < 0 and math.abs(sarmc) > saris     // upward inflection: SAR momentum flips positive with sufficient magnitude
The 0.5% minimum separation requirement screens out marginal directional changes that may reverse within 1-3 bars.
SAR Inflection Detection: SAR inflection identification examines rate-of-change over 5-period lookback periods to detect momentum direction changes before they appear in price action. Inflection sensitivity (default 1.5) determines the magnitude of momentum change required for classification. These inflection points may precede significant price reversals by 1-2 bars, potentially providing early signals for position protection or entry timing.
Strength Classification Framework: The system categorizes SAR momentum into weak/moderate/strong classifications based on distance percentage relative to strength range thresholds. Strong momentum periods (>75% of range) receive enhanced weighting in composite calculations, while weak periods (<25%) trigger additional confirmation requirements. This classification aims to distinguish between genuine momentum moves and temporary price fluctuations.
# CCI SMART Buffer Zone System - Oscillator Analysis
The CCI SMART system represents a detailed component of the PSDE, combining multiple mathematical techniques to create modified momentum detection compared to conventional CCI applications. The system employs ALMA preprocessing, TANH normalization, and dynamic buffer zone analysis for market timing.
ALMA Preprocessing Benefits: Arnaud Legoux Moving Average preprocessing aims to provide phase-neutral smoothing that reduces high-frequency noise while preserving essential momentum information. The configurable offset (0.85) and sigma (6.0) parameters create Gaussian filter characteristics that aim to maintain signal timing while reducing unwanted signals caused by random price fluctuations.
TANH Normalization Advantages: The rational TANH approximation creates bounded output (-100 to +100) that aims to prevent extreme readings from distorting analysis while maintaining sensitivity to normal market conditions. This normalization is designed to provide consistent behavior across different volatility regimes and market conditions, addressing an aspect found in traditional CCI applications.
Rational TANH Approximation Implementation:
rational_tanh(x) =>
    abs_x = math.abs(x)
    if abs_x >= 4.0
        x >= 0 ? 1.0 : -1.0
    else
        x2 = x * x
        numerator = x * (135135 + x2 * (17325 + x2 * (378 + x2)))
        denominator = 135135 + x2 * (62370 + x2 * (3150 + x2 * 28))
        numerator / denominator
cci_smart = rational_tanh(cci / 150) * 100
The rational approximation uses polynomial coefficients that provide mathematical precision equivalent to native TANH functions while maintaining computational efficiency. The 4.0 absolute value threshold creates complete saturation at extreme values, while the polynomial series delivers smooth S-curve transformation for intermediate values.
Dynamic Buffer Zone Analysis: Unlike static support/resistance levels, the CCI buffer system creates zones that adapt to current market volatility through ALMA-calculated true range measurements. Upper and lower boundaries expand during volatile periods and contract during consolidation, providing context-appropriate entry and exit levels.
CCI Buffer System Implementation:
cci = ta.cci(close, ccil)                  // raw CCI
cci_atr = ta.alma(ta.tr, al, ao, asig)     // ALMA-smoothed true range
cci_bu = low - ccim * cci_atr              // buffer boundary projected below price
cci_bd = high + ccim * cci_atr             // buffer boundary projected above price
ccitu = cci > 50 and cci > cci[1]          // CCI trending up: above 50 and rising
CCI buffer analysis creates dynamic support/resistance zones using ALMA-smoothed true range calculations rather than fixed levels. Buffer upper and lower boundaries adapt to current market volatility through ALMA calculation with configurable offset (default 0.85) and sigma (default 6.0) parameters.
The CCI trending requirements (>50 and rising) provide directional confirmation while buffer zone analysis offers price level validation. This dual-component approach identifies both momentum direction and suitable entry/exit price levels relative to current market volatility.
# Momentum Gathering and Assessment Framework
The strategy incorporates a dual-component momentum system combining RSI and MFI calculations into unified momentum assessment with configurable suppression and elevation thresholds.
Composite Momentum Calculation:
ri = ta.rsi(close, mgp)
mi = ta.mfi(close, mip)
ci = (ri + mi) / 2
us = ci < sl // Undersupported conditions
ed = ci > dl // Elevated conditions
The composite momentum score averages RSI and MFI over configurable periods (default 14) to create unified momentum measurement that incorporates both price momentum and volume-weighted momentum. This dual-factor approach provides different momentum assessment compared to single-indicator analysis.
Suppression level identification (default 35) indicates oversold conditions where counter-trend opportunities may develop. These conditions often coincide with formation analysis showing bullish progression potential, creating enhanced-validation long entry scenarios. Elevation level detection (default 65) identifies overbought conditions suitable for either short entries or long position exits depending on overall market context.
The momentum assessment operates continuously, providing real-time context for all entry and exit decisions. Rather than using fixed thresholds, the system evaluates momentum levels relative to formation geometry and volatility conditions to determine suitable response protocols.
Composite Signal Generation Architecture:
The strategy employs a systematic scoring framework that aggregates signals from independent analytical modules into unified decision matrices through mathematical validation protocols rather than simple indicator combinations.
Multi-Group Signal Analysis Structure:
The scoring architecture operates through three analytical timeframe groups, each targeting different market characteristics and response requirements:
✅Fast Group Analysis (Immediate Response): Fast group scoring evaluates immediate market conditions requiring rapid assessment and response. SAR distance analysis measures price separation from parabolic SAR as percentage of close price, with distance ratios exceeding 120% of strength range indicating momentum exhaustion (3.0 points). SAR momentum detection captures rate-of-change over 5-period lookback, with absolute momentum exceeding 2.0% indicating notable acceleration or deceleration (1.0 point).
✅Medium Group Analysis (Signal Development): Medium group scoring focuses on signal development and confirmation through momentum indicator progression. Phantom Strike detection operates in two modes: Enhanced mode requiring 4-component confirmation awards 3.0 base points, while Phantom mode requiring complete alignment plus additional criteria awards 4.0 base points.
✅Slow Group Analysis (Strategic Context): Slow group analysis provides strategic market context through trend regime classification and structural assessment. Trend classification scoring awards top points (3.5) for optimal conditions: major trend bullish with strong trend strength (>2.0% EMA spread), 2.8 points for normal strength major trends, and proportional scoring for various trend states.
Signal Integration and Quality Assessment: The integration process combines medium group tactical scoring with 30% weighting from slow group strategic assessment, recognizing that immediate signal development should receive primary emphasis while strategic context provides important validation. Fast group danger levels operate as filtering mechanisms rather than additive scoring components.
Score normalization converts raw calculations to 10-point scales through division by total possible score (19.6) and multiplication by 10. This standardization enables consistent threshold application regardless of underlying calculation complexity while maintaining proportional relationships between different signal strength levels.
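For illustration, the weighting and normalization step can be written as a short calculation. The variable names and example raw scores below are illustrative only; how the published script actually assembles its entry scores (cs_les / cs_ses) is not shown in this description, so the arithmetic here is an assumption.
// Minimal sketch of the integration and normalization described above (illustrative names and example values)
med_raw    = 6.0                          // example medium group (tactical) raw score
slow_raw   = 3.5                          // example slow group (strategic) raw score
raw_long   = med_raw + slow_raw * 0.30    // 30% strategic weighting added to tactical scoring
long_score = raw_long / 19.6 * 10         // divide by total possible score (19.6), scale to 10 points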
Conflict Resolution and Priority Logic:
sc = math.abs(cs_les - cs_ses) < 1.5                // signal conflict: long and short scores within 1.5 points
hqls = sql and not sc and (cs_les > cs_ses * 1.15)  // high-quality long: qualified, no conflict, 15% margin
hqss = sqs and not sc and (cs_ses > cs_les * 1.15)  // high-quality short: qualified, no conflict, 15% margin
Signal conflict detection identifies situations where competing long/short signals occur simultaneously within 1.5-point differential. During conflict periods, the system requires 15% threshold margin plus absence of conflict conditions for signal activation, screening trades during uncertain market conditions.
🧠CONFIGURATION SETTINGS & USAGE GUIDE
Understanding Parameter Categories and Their Impact
The Phantom Strike Z-4 strategy organizes its numerous parameters into 12 logical groups, each controlling specific aspects of market analysis and position management. Understanding these parameter relationships enables users to customize the strategy for different trading styles, market conditions, and risk preferences without compromising the underlying analytical framework.
Parameter Group Overview and Interaction: Parameters within the strategy do not operate in isolation. Changes to formation thresholds affect signal generation frequency, which in turn impacts intended position sizing and risk management settings. Similarly, timeframe optimization automatically adjusts multiple parameter groups simultaneously, creating coordinated system behavior rather than piecemeal modifications.
Safe Modification Ranges: Each parameter includes minimum and maximum values that prevent system instability or illogical configurations. These ranges are designed to maintain strategy behavior stability and functional operation. Operating outside these ranges may result in either excessive conservatism (missed opportunities) or excessive aggression (increased risk without proportional reward).
# Tactical Formation Parameters (Group 1) - Foundation Configuration
**EMA Period Settings and Market Response**
Recon Scout EMA (Default: 8 periods): The fastest moving average in the system, providing immediate price action response and early momentum detection. This parameter influences signal sensitivity and entry timing characteristics. Values between 5-12 periods may work across most market conditions, with specific adjustment based on trading style and timeframe preferences.
-Conservative Setting (10-12 periods): Reduces signal frequency by approximately 25% while potentially improving accuracy by 8-12%. Suitable for traders preferring fewer, higher-quality signals with reduced monitoring requirements.
-Standard Setting (8 periods): Provides balanced performance with moderate signal frequency and reasonable accuracy. Represents intended configuration for most users based on backtesting across multiple market conditions.
-Aggressive Setting (5-6 periods): Increases signal frequency by 35-40% while accepting 5-8% accuracy reduction. Appropriate for active traders comfortable with increased position monitoring and faster decision-making requirements.
Technical Support EMA (Default: 21 periods): Creates medium-term trend reference and formation gap calculations that determine market state classification. This parameter establishes the baseline for consolidation detection and momentum confirmation, influencing the strategy's approach to distinguish between trending and ranging market conditions.
Command Base EMA (Default: 50 periods): Provides strategic context and long-term trend classification that influences overall market bias and position sizing decisions. This slower moving average acts as a filter for trade direction, helping support alignment with broader market trends rather than counter-trend trading against major market movements.
**Formation Threshold Configuration**
Stealth Mode Threshold (Default: 1.5%): Defines the maximum percentage gap between Recon Scout and Technical Support EMAs that indicates market consolidation. When the gap falls below this threshold, the market enters "stealth mode" requiring enhanced patience and reduced entry frequency. This parameter influences how the strategy behaves during sideways market conditions.
-Tight Threshold (0.8-1.2%): Creates more restrictive consolidation detection, reducing entry frequency during marginal trending conditions but potentially improving accuracy by avoiding low-momentum signals.
-Standard Threshold (1.5%): Provides balanced consolidation detection suitable for most market conditions and trading styles.
-Loose Threshold (2.0-3.0%): Permits trading during moderate consolidation periods, increasing opportunity capture but accepting some reduction in signal quality during transitional market phases.
Strike Ready Threshold (Default: 1.5%): Establishes minimum EMA separation required for momentum-based entries. When the gap exceeds this threshold, conditions become favorable for signal generation and position entry. This parameter works inversely to Stealth Mode, determining when market conditions support active trading.
# Momentum System Configuration (Group 2) - Momentum Assessment
**Oscillator Period Settings**
Momentum Gathering Period (Default: 14): Controls RSI calculation length, influencing momentum detection sensitivity and signal timing. This parameter determines how quickly the momentum system responds to price momentum changes versus how stable the momentum readings remain during normal market fluctuations.
-Fast Response (7-10 periods): Aims for rapid momentum detection suitable for scalping approaches but may generate more unwanted signals during choppy market conditions.
-Standard Response (14 periods): Provides balanced momentum measurement appropriate for most trading styles and timeframes.
-Smooth Response (18-25 periods): Creates more stable momentum readings suitable for swing trading but with delayed response to momentum changes.
Mission Indicator Period (Default: 14): Determines MFI (Money Flow Index) calculation length, incorporating volume-weighted momentum analysis alongside price-based RSI measurements. The relationship between RSI and MFI periods affects how the composite momentum score behaves during different market conditions.
**Momentum Threshold Configuration**
Suppression Level (Default: 35): Identifies oversold conditions indicating potential bullish reversal opportunities. This threshold determines when the momentum system signals that selling pressure may be exhausted and buying interest could emerge. Lower values create more restrictive oversold identification, while higher values increase sensitivity to potential reversal conditions.
Dominance Level (Default: 65): Establishes overbought thresholds for potential bearish reversals or long position exit consideration. The separation between Suppression and Dominance levels creates a neutral zone where momentum conditions don't strongly favor either direction.
# Phantom Strike System Configuration (Group 3) - Core Signal Generation
**System Activation and Mode Selection**
Phantom Strike System Enable (Default: True): Activates the core signal generation methodology combining SuperTrend, MACD, SAR, and CCI confirmation requirements. Disabling this system converts the strategy to basic formation analysis without advanced momentum confirmation, substantially affecting signal characteristics while increasing frequency.
Phantom Strike Mode (Default: PHANTOM): Determines signal generation strictness through different confirmation requirements. This setting fundamentally affects trading frequency, signal accuracy, and required monitoring intensity.
ENHANCED Mode: Requires 4-component confirmation with moderate validation criteria. Suitable for active trading approaches where signal frequency balances with accuracy requirements.
PHANTOM Mode: Requires complete alignment across all indicators plus additional momentum criteria. Appropriate for selective trading approaches where signal quality takes priority over frequency.
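As a rough sketch, the ENHANCED four-component gate could be assembled from standard Pine built-ins as shown below. The variable names, the minimum SAR separation, and the exact acceleration tests are assumptions; the published script's internal logic is not reproduced here.
// Illustrative four-component confirmation gate (SuperTrend + MACD + SAR + CCI); identifiers are assumptions
[st, st_dir] = ta.supertrend(3.0, 10)                      // default multiplier 3.0, ATR length 10
st_ok   = st_dir < 0 and close > st                        // bullish SuperTrend: direction up, price above band
[macd_line, signal_line, hist] = ta.macd(close, 12, 26, 9)
macd_ok = hist > 0 and hist > hist[1]                      // positive and accelerating histogram
psar    = ta.sar(0.02, 0.02, 0.2)
sar_ok  = close > psar and (close - psar) / close > 0.001  // bullish SAR with an assumed 0.1% minimum separation
cci_ok  = ta.cci(close, 20) > 50                           // CCI trending above its midline
enhanced_long = st_ok and macd_ok and sar_ok and cci_ok    // ENHANCED mode: all four components confirm
PHANTOM mode would additionally require the extra momentum criteria described above before awarding its 4.0 base points.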
**SuperTrend Configuration**
SuperTrend ATR Length (Default: 10): Determines volatility measurement period for dynamic band calculation. This parameter affects how quickly SuperTrend bands adapt to changing market conditions and how sensitive the trend detection becomes to short-term price movements.
SuperTrend Multiplier (Default: 3.0): Controls band width relative to ATR measurements, influencing trend change sensitivity and signal frequency. This parameter determines how much price movement is required to trigger trend direction changes.
**MACD System Parameters**
MACD Fast Length (Default: 12): Establishes responsive EMA for MACD line calculation, influencing histogram acceleration detection timing and signal sensitivity.
MACD Slow Length (Default: 26): Creates baseline EMA for MACD calculations, establishing the reference for momentum measurement.
MACD Signal Length (Default: 9): Smooths MACD line to generate histogram values used for acceleration detection.
**Parabolic SAR Settings**
SAR Start (Default: 0.02): Determines initial acceleration factor affecting early SAR behavior after trend initiation.
SAR Increment (Default: 0.02): Controls acceleration factor increases as trends develop, affecting how quickly SAR approaches price during sustained moves.
SAR Maximum (Default: 0.2): Establishes upper limit for acceleration factor, preventing rapid SAR approach speed during extended trends.
**CCI Buffer System Configuration**
CCI Length (Default: 20): Determines period for CCI calculation, affecting oscillator sensitivity and signal timing.
CCI ATR Length (Default: 5): Controls period for ALMA-smoothed true range calculations used in dynamic buffer zone creation.
CCI Multiplier (Default: 1.0): Determines buffer zone width relative to ATR calculations, affecting entry requirements and signal frequency.
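Collected as inputs, the Group 3 defaults listed above might be declared roughly as follows. The identifiers reuse the abbreviations from the CCI buffer snippet earlier; the remaining names are illustrative, not the script's own.
// Possible Group 3 input declarations for the defaults above (illustrative)
st_len    = input.int(10,     "SuperTrend ATR Length", minval = 1)
st_mult   = input.float(3.0,  "SuperTrend Multiplier", step = 0.1)
macd_fast = input.int(12,     "MACD Fast Length")
macd_slow = input.int(26,     "MACD Slow Length")
macd_sig  = input.int(9,      "MACD Signal Length")
sar_start = input.float(0.02, "SAR Start", step = 0.01)
sar_inc   = input.float(0.02, "SAR Increment", step = 0.01)
sar_max   = input.float(0.2,  "SAR Maximum", step = 0.05)
ccil      = input.int(20,     "CCI Length")
al        = input.int(5,      "CCI ATR Length")
ccim      = input.float(1.0,  "CCI Multiplier", step = 0.1)
ao        = input.float(0.85, "CCI ALMA Offset", step = 0.05)
asig      = input.float(6.0,  "CCI ALMA Sigma",  step = 0.5)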
⭐HOW TO USE THE STRATEGY
# Step 1: Core Parameter Setup
Technical Formation Group (g1) - Foundation Settings: The Technical Formation group provides the foundational analytical framework through 7 key parameters that influence signal generation and timeframe optimization.
Auto Optimization Controls:
enable_auto_tf = input.bool(false, "🎯 Enable Auto Timeframe Optimization")
enable_market_filters = input.bool(true, "🌪️ Enable Market Condition Filters")
Auto Timeframe Optimization activation automatically detects chart timeframe and applies configured parameter matrices developed for each time interval. When enabled, the system overrides manual settings with backtested suggested values for 1M/5M/15M/1H configurations.
Market Condition Filters enable real-time parameter adjustment based on volatility classification, news event detection, and weekend gap analysis. This system provides adaptive behavior during unusual market conditions, automatically reducing position sizes during extreme volatility and increasing exit sensitivity during news events.
# Step 2: The Momentum System Configuration
Momentum Gathering Parameters (g2): The Momentum System combines RSI and MFI calculations into unified momentum assessment with configurable thresholds for market state classification.
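These Group 2 settings correspond to the mgp, mip, sl, and dl identifiers used in the composite momentum snippet earlier; the declarations below are a plausible sketch of those inputs rather than the published source.
// Sketch of the Group 2 momentum inputs (defaults per the text above)
mgp = input.int(14,   "Momentum Gathering Period (RSI)", minval = 2)
mip = input.int(14,   "Mission Indicator Period (MFI)",  minval = 2)
sl  = input.float(35, "Suppression Level", minval = 0,  maxval = 50)
dl  = input.float(65, "Dominance Level",   minval = 50, maxval = 100)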
# Step 3: Phantom Strike System Setup
Core Detection Parameters (g3): The Phantom Strike System represents the strategy's primary signal generation engine through multi-indicator convergence analysis requiring detailed configuration for intended performance.
Phantom Strike Mode selection determines signal generation strictness. Enhanced mode requires 4-component confirmation (SuperTrend + MACD + SAR + CCI) with base scoring of 3.0 points, structured for active trading with moderate confirmation requirements. Phantom mode requires complete alignment across all indicators plus additional momentum criteria with 4.0 base scoring, creating enhanced-validation signals for selective trading approaches.
# Step 4: SR Exit Grid Configuration
Position Management Framework (g6): The SR Exit Grid system manages position lifecycle through progressive profit-taking and adaptive holding evaluation based on market condition analysis.
esr = input.bool(true, "Enable SR Exit Grid")
ept = input.bool(true, "Enable Partial Take Profit")
ets = input.bool(true, "Enable Technical Trailing Stop")
📊MULTI-TIMEFRAME SYSTEM & ADAPTIVE FEATURES
Auto Timeframe Optimization Architecture: The Auto Timeframe Optimization system automatically configures strategy behavior based on chart timeframe characteristics, reducing the need for manual parameter adjustment.
1-Minute Ultra Scalping Configuration:
get_1M_params() =>
    StrategyParams.new(
     smt = 0.8, srt = 1.0, mcb = 2, mmd = 20,
     smartThreshold = 0.1, consecutiveLimit = 20,
     positionSize = 3.0, enableQuickEntry = true,
     ptp1 = 25, ptp2 = 35, ptp3 = 40,
     tm1 = 1.5, tm2 = 3.0, tm3 = 4.5, tmf = 6.0,
     isl = 1.0, esl = 2.0, tsd = 0.5, dsm = 1.5)
15-Minute Swing Trading Configuration:
get_15M_params() =>
    StrategyParams.new(
     smt = 2.0, srt = 2.0, mcb = 8, mmd = 100,
     smartThreshold = 0.3, consecutiveLimit = 12,
     positionSize = 7.0, enableQuickEntry = false,
     ptp1 = 15, ptp2 = 25, ptp3 = 35,
     tm1 = 4.0, tm2 = 8.0, tm3 = 12.0, tmf = 18.0,
     isl = 2.0, esl = 3.5, tsd = 1.2, dsm = 2.5)
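The presets rely on a StrategyParams user-defined type. The published type definition is not shown, but one consistent with the constructor calls above would look roughly like this; the field comments are inferred from the parameter descriptions elsewhere in this guide and should be treated as assumptions.
// Assumed shape of the StrategyParams user-defined type (inferred, not the published definition)
type StrategyParams
    float smt                // stealth mode threshold (%)
    float srt                // strike ready threshold (%)
    int   mcb                // presumably minimum confirmation bars
    int   mmd                // presumably maximum mission duration (bars)
    float smartThreshold     // Smart Exit sensitivity
    int   consecutiveLimit   // consecutive-bar limit for Smart Exit
    float positionSize       // position size (% of equity)
    bool  enableQuickEntry
    float ptp1               // partial take-profit allocations (%)
    float ptp2
    float ptp3
    float tm1                // ATR target multipliers (TP1/TP2/TP3/final)
    float tm2
    float tm3
    float tmf
    float isl                // initial stop loss (%)
    float esl                // presumably emergency stop loss (%)
    float tsd                // presumably trailing stop distance (%)
    float dsm                // presumably dynamic stop multiplier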
Market Condition Filter Integration:
if enable_market_filters
    vol_condition = get_volatility_condition()
    is_news = is_news_time()
    is_gap = is_weekend_gap()
    step1 = adjust_for_volatility(base_params, vol_condition)
    step2 = adjust_for_news(step1, is_news)
    final_params = adjust_for_gap(step2, is_gap)
Market condition filters operate in conjunction with timeframe optimization to provide systematic parameter adaptation based on both temporal and market state characteristics. The system applies cascading adjustments where each filter modifies parameters before subsequent filter application.
Volatility Classification Thresholds:
- EXTREME: >2.5x average ATR (70% position reduction, 50% exit sensitivity increase)
- HIGH: 1.8-2.5x average (40% position reduction, increased monitoring)
- NORMAL: 1.2-1.8x average (standard operations)
- LOW: 0.8-1.2x average (30% position increase, extended targets)
- DEAD: <0.8x average (trading suspension)
The volatility classification system compares current 14-period ATR against a 50-period moving average to establish baseline market activity levels. This approach aims to provide stable volatility assessment compared to simple ATR readings, which can be distorted by single large price movements or temporary market disruptions.
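Under those thresholds, the get_volatility_condition() helper referenced in the filter snippet above could be sketched as follows; the exact ATR ratio construction is an assumption based on the 14-period/50-period description.
// Hedged sketch of the volatility regime classification (labels match the table above)
get_volatility_condition() =>
    atr_now  = ta.atr(14)
    atr_base = ta.sma(ta.atr(14), 50)
    ratio    = atr_base > 0 ? atr_now / atr_base : 1.0
    ratio > 2.5 ? "EXTREME" : ratio > 1.8 ? "HIGH" : ratio > 1.2 ? "NORMAL" : ratio > 0.8 ? "LOW" : "DEAD"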
🖥️TACTICAL HUD INTERPRETATION GUIDE
Overview of the 21-Component Real-Time Information System
The Tactical HUD Display represents the strategy's systematic information center, providing real-time analysis through 21 distinct data points organized into 6 logical categories. This system converts complex market analysis into actionable insights, enabling traders to make informed decisions based on systematic market assessment.
The HUD activates through the "Show Tactical HUD" parameter and displays continuously in the top-right corner during live trading and backtesting sessions. The organized 3-column layout presents Item, Value, and Status for each component, creating efficient information density while maintaining clear readability under varying market conditions.
# Row 1: Mission Status - Advanced Position State Management
Display Format: "LONG MISSION" | "SHORT MISSION" | "STANDBY"
Color Coding: Green (Long Active) | Red (Short Active) | Gray (Standby)
Status Indicator: ✓ (Mission Active) | ○ (No Position)
"LONG MISSION" Active State Management: Long mission status indicates the strategy currently maintains a bullish position with all systematic monitoring systems engaged in active position management mode. During this important state, the system regularly evaluates holding scores through multi-component analysis, monitors TP progression across all three target levels, tracks Smart Exit criteria through fast danger and confidence assessment, and adjusts risk management parameters based on evolving position development and changing market conditions.
"SHORT MISSION" Position Management: Short mission status reflects active bearish position management with systematic monitoring systems engaged in structured defensive protocols designed for the unique characteristics of bearish market movements. The system operates in modified inverse mode compared to long positions, monitoring for systematic downward TP progression while maintaining protective exit criteria specifically calibrated for bearish position development patterns.
"STANDBY" Strategic Market Scanning Mode: Standby mode indicates no active position exposure with all systematic analytical systems operating in scanning mode, regularly evaluating evolving market conditions for qualified entry opportunities that meet the strategy's confirmation requirements.
# Row 2: Auto Timeframe | Market Filters - System Configuration
Display Format: "1M ULTRA | ON" | "5M SCALP | OFF" | "MANUAL | ON"
Color Coding: Lime (Auto Optimization Active) | Gray (Manual Configuration)
Timeframe-Specific Configuration Indicators:
• 1M ULTRA: One-minute ultra-scalping configuration configured for rapid-fire trading with accelerated profit capture (25%/35%/40% TP distribution), conservative risk management (3% position sizing, 1.0% initial stops), and increased Smart Exit sensitivity (0.1 threshold, 20-bar consecutive limit).
• 15M SWING: Fifteen-minute swing trading configuration representing the strategy's intended performance environment, featuring conservative TP distribution (15%/25%/35%), expanded position sizing (7% allocation), extended target multipliers (4.0/8.0/12.0/18.0 ATR).
• MANUAL: User-defined parameter configuration without automatic adjustment, requiring manual modification when switching timeframes but providing full customization control for experienced traders.
Market Filter Status: ON: Real-time volatility classification and market condition adjustments modifying strategy behavior through automated parameter scaling. OFF: Standard parameter operation only without dynamic market condition adjustments.
# Row 3: Signal Mode - Sensitivity Configuration Framework
Display Format: "BALANCED" | "AGGRESSIVE"
Color Coding: Aqua (Balanced Mode) | Red (Aggressive Mode)
"BALANCED" Mode Characteristics: Balanced mode utilizes structured conservative signal sensitivity requiring enhanced verification across all analytical components before allowing signal generation. This rigorous configuration requires Medium Group scoring ≥5.5 points, Slow Group confirmation ≥3.5 points, and Fast Danger levels ≤2.0 points.
"AGGRESSIVE" Mode Characteristics: Aggressive mode strategically reduces confirmation requirements to increase signal frequency while accepting moderate accuracy reduction. Threshold requirements decrease to Medium Group ≥4.5 points, Slow Group ≥2.5 points, and Fast Danger ≤1.0 points.
# Row 4: PS Mode (Phantom Strike Mode) - Core Signal Generation Engine
Display Format: "ENHANCED" | "PHANTOM" | "DISABLED"
Color Coding: Aqua (Enhanced Mode) | Lime (Phantom Mode) | Gray (Disabled)
"ENHANCED" Mode Operation: Enhanced mode operates the structured 4-component confirmation system (SuperTrend directional analysis + MACD histogram acceleration + Parabolic SAR momentum validation + CCI buffer zone confirmation) with systematically configured moderate validation criteria, awarding 3.0 base points for signal strength calculation.
"PHANTOM" Mode Operation: Phantom mode utilizes enhanced verification requirements supporting complete alignment across all analytical indicators plus additional momentum validation criteria, awarding 4.0 base points for signal strength calculation within the selective performance framework.
# Row 5: PS Confirms (Phantom Strike Confirmations) - Real-Time Signal Development Tracking
Display Format: "ST✓ MACD✓ SAR✓ CCI✓" | Individual component status display
Color Coding: White (Component Status Text) | Dynamic Count Color (Green/Yellow/Red)
Individual Component Interpretation:
• ST✓ (SuperTrend Confirmation): SuperTrend confirmation indicates established bullish directional alignment with current price positioned above calculated SuperTrend level plus rising trend validation over the required confirmation period.
• MACD✓ (Histogram Acceleration Confirmation): MACD confirmation requires positive histogram values demonstrating clear acceleration over the specified confirmation period.
• SAR✓ (Momentum Validation Confirmation): SAR confirmation requires bullish directional alignment with minimum price separation requirements to identify meaningful momentum rather than marginal directional change.
• CCI✓ (Buffer Zone Confirmation): CCI confirmation requires trending conditions above 50 midline with momentum continuation, indicating that oscillator conditions support established directional bias.
# Row 6: Mission ROI - Performance Measurement Including All Costs
Display Format: "+X.XX%" | "-X.XX%" | "0.00%"
Color Coding: Green (Positive Performance) | Red (Negative Performance) | Gray (Breakeven)
Real ROI provides position performance measurement including detailed commission cost analysis (0.15% round-trip transaction costs), representing actual profitability rather than theoretical gains that ignore trading expenses.
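A minimal long-side sketch of that calculation, assuming the stated 0.15% round-trip cost, would be:
// Commission-adjusted ROI sketch (long positions; short positions would invert the price difference)
gross_roi = strategy.position_size > 0 ? (close - strategy.position_avg_price) / strategy.position_avg_price * 100 : 0.0
real_roi  = gross_roi - 0.15    // subtract the 0.15% round-trip transaction cost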
# Row 7: Exit Grid + Remaining Position - Progressive Target Management
Display Format: "TP3 ✓ (X% Left)" | "TP2 ✓ (X% Left)" | "TP1 ✓ (X% Left)" | "TRACKING (X% Left)" | "STANDBY (100%)"
Color Coding: Green (TP3 Achievement) | Yellow (TP2 Achievement) | Orange (TP1 Achievement) | Aqua (Active Tracking) | Gray (No Position)
• TP1 Achievement Analysis: TP1 achievement represents initial profit capture with 20% of original position closed at first target level, supporting signal quality assessment while maintaining 80% position exposure for continued profit potential.
• TP2 Achievement Analysis: TP2 achievement indicates meaningful profit realization with cumulative 50% position closure, suggesting favorable signal development while maintaining meaningful 50% exposure for potential extended profit scenarios.
• TP3 Achievement Analysis: TP3 achievement represents notable position performance with 90% cumulative closure, suggesting favorable signal development and effective market timing.
# Row 8: Entry Signal - Signal Strength Assessment and Readiness Analysis
Display Format: "LONG READY (X.X/10)" | "SHORT READY (X.X/10)" | "WAITING (X.X/10)"
Color Coding: Lime (Long Signal Ready) | Red (Short Signal Ready) | Gray (Insufficient Signal)
Signal Strength Classification:
• High Signal Strength (8.0-10.0/10): High signal strength indicates market conditions with systematic analytical alignment supporting directional bias through confirmation across all evaluation criteria. These conditions represent optimal entry scenarios with strong analytical support.
• Strong Signal Quality (6.0-7.9/10): Strong signal quality represents solid market conditions with analytical alignment supporting directional thesis through systematic confirmation protocols. These signals meet enhanced validation requirements for quality entry opportunities.
• Moderate Signal Strength (4.5-5.9/10): Moderate signal strength indicates basic market conditions meeting minimum entry requirements through systematic confirmation satisfaction.
# Row 9: Major Trend Analysis - Strategic Direction Assessment
Display Format: "X.X% STRONG BULL" | "X.X% BULL" | "X.X% BEAR" | "X.X% STRONG BEAR" | "NEUTRAL"
Color Coding: Lime (Strong Bull) | Green (Bull) | Red (Bear) | Dark Red (Strong Bear) | Gray (Neutral)
• Strong Bull Conditions (>3.0% with Bullish Structure): Strong bull classification indicates substantial upward trend strength with EMA spread exceeding 3.0% combined with favorable bullish structure alignment. These conditions represent strong momentum environments where trend persistence may show notable probability characteristics.
• Standard Bull Conditions (1.5-3.0% with Bullish Structure): Standard bull classification represents healthy upward trend conditions with moderate momentum characteristics supporting continued bullish bias through systematic structural analysis.
# Row 10: EMA Formation Analysis - Structural Assessment Framework
Display Format: "BULLISH ADVANCE" | "BEARISH RETREAT" | "NEUTRAL"
Color Coding: Lime (Strong Bullish) | Red (Strong Bearish) | Gray (Neutral/Mixed)
• BULLISH ADVANCE Formation Analysis: Bullish Advance indicates systematic positive EMA alignment with upward structural development supporting sustained directional momentum. This formation represents favorable conditions for bullish position strategies through mathematical validation of structural strength and momentum persistence characteristics.
• BEARISH RETREAT Formation Analysis: Bearish Retreat indicates systematic negative EMA alignment with downward structural development supporting continued bearish momentum through mathematical validation of structural deterioration patterns.
# Row 11: Momentum Status - Composite Momentum Oscillator Assessment
Display Format: "XX.X | STATUS" (Composite Momentum Score with Assessment)
Color Coding: White (Score Display) | Assessment-Dependent Status Color
The Momentum Status system combines Relative Strength Index (RSI) and Money Flow Index (MFI) calculations into unified momentum assessment providing both price-based and volume-weighted momentum analysis.
• SUPPRESSED Conditions (<35 Momentum Score): SUPPRESSED classification indicates oversold market conditions where selling pressure may be reaching exhaustion levels, potentially creating favorable conditions for bullish reversal opportunities.
• ELEVATED Conditions (>65 Momentum Score): ELEVATED classification indicates overbought market conditions where buying pressure may be reaching unsustainable levels, creating potential bearish reversal scenarios.
# Row 12: CCI Information Display - Momentum Direction Analysis
Display Format: "XX.X | UP" | "XX.X | DOWN"
Color Coding: Lime (Bullish Momentum Trend) | Red (Bearish Momentum Trend)
The CCI Information Display showcases the CCI SMART system incorporating Arnaud Legoux Moving Average (ALMA) preprocessing combined with rational approximation of the hyperbolic tangent (TANH) function to achieve modified signal processing compared to traditional CCI implementations.
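A rough sketch of that idea is shown below: price is pre-smoothed with ALMA before the CCI calculation, and the result is soft-clamped with a rational tanh approximation. The smoothing length, the order of operations, and the scaling constants are assumptions, not the published CCI SMART internals.
// Illustrative ALMA-preprocessed CCI with a rational tanh approximation (constants are assumptions)
alma_src = ta.alma(close, 9, 0.85, 6.0)             // ALMA-smoothed price input
cci_raw  = ta.cci(alma_src, 20)
tanh_approx(float x) =>
    x * (27.0 + x * x) / (27.0 + 9.0 * x * x)       // rational (Pade-style) approximation of tanh
cci_smart = tanh_approx(cci_raw / 100.0) * 100.0    // softly compresses extreme readings toward a roughly ±100 band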
CCI Value Interpretation:
• Extreme Bullish Territory (>80): CCI readings exceeding +80 indicate extreme bullish momentum conditions with potential overbought characteristics requiring careful evaluation for continued position holding versus profit-taking consideration.
• Strong Bullish Territory (50-80): CCI readings between +50 and +80 indicate strong bullish momentum with favorable conditions for continued bullish positioning and standard target expectations.
• Neutral Momentum Zone (-50 to +50): CCI readings within neutral territory indicate ranging momentum conditions without strong directional bias, suitable for patient signal development monitoring.
• Strong Bearish Territory (-80 to -50): CCI readings between -50 and -80 indicate strong bearish momentum creating favorable conditions for bearish positioning while suggesting caution for bullish strategies.
• Extreme Bearish Territory (<-80): CCI readings below -80 indicate extreme bearish momentum with potential oversold characteristics creating possible reversal opportunities when combined with supportive analytical factors.
# Row 13: SAR Network - Multi-Component Momentum Analysis
Display Format: "X.XX% | BULL STRONG ↗INF" | Complex Multi-Component Analysis
Color Coding: Lime (Bullish Strong) | Green (Bullish Moderate) | Red (Bearish Strong) | Orange (Bearish Moderate) | White (Inflection Priority)
SAR Distance Percentage Analysis: The distance percentage component measures price separation from SAR level as percentage of current price, providing quantification of momentum strength through mathematical price relationship analysis.
SAR Strength Classification Framework:
• STRONG Momentum Conditions (>75% of Strength Range): STRONG classification indicates significant momentum conditions with price-SAR separation exceeding 75% of calculated strength range, representing notable directional movement with sustainability characteristics.
• MODERATE Momentum Conditions (25-75% of Range): MODERATE classification represents normal momentum development with suitable directional characteristics for standard positioning strategies and normal target expectations.
• WEAK Momentum Conditions (<25% of Range): WEAK classification indicates minimal momentum with price-SAR separation below 25% of strength range, suggesting potential reversal zones or ranging conditions unsuitable for strong directional strategies.
Inflection Detection System:
• Bullish Inflection (↗INF): Bullish inflection detection identifies moments when SAR momentum transitions from declining to rising through systematic rate-of-change analysis over 5-period lookback periods. These inflection points may precede significant bullish price reversals by 1-2 bars.
• Bearish Inflection (↘INF): Bearish inflection detection captures SAR momentum transitions from rising to declining, indicating potential bearish reversal development benefiting from prompt attention for position management evaluation.
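Putting the distance measurement and the 5-bar inflection check described above into code, a simplified version might read as follows; the variable names and the sign-change test are illustrative.
// SAR distance percentage and inflection detection sketch
psar      = ta.sar(0.02, 0.02, 0.2)
sar_dist  = (close - psar) / close * 100     // signed price-SAR separation as a % of price
sar_mom   = sar_dist - sar_dist[5]           // rate of change over a 5-period lookback
bull_inf  = sar_mom > 0 and sar_mom[1] <= 0  // momentum turns from declining to rising
bear_inf  = sar_mom < 0 and sar_mom[1] >= 0  // momentum turns from rising to declining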
# Row 14: VWAP Context Analysis - Institutional Volume-Weighted Price Reference
Display Format: "Daily: XXXX.XX (+X.XX%)" | "N/A (Index/Futures)"
Color Coding: Lime (Above VWAP Premium) | Red (Below VWAP Discount) | Gray (Data Unavailable)
Volume-Weighted Average Price (VWAP) provides institutional-level price reference showing mathematical average price where significant volume has transacted throughout the specified period. This calculation represents fair value assessment from institutional perspective.
• Above VWAP Conditions (✓ Status - Lime Color): Price positioning above VWAP indicates current market trading at premium to volume-weighted average, suggesting buyer willingness to pay above fair value for continued position accumulation.
• Below VWAP Conditions (✗ Status - Red Color): Price positioning below VWAP indicates current market trading at discount to volume-weighted average, creating potential value opportunities for accumulation while suggesting seller pressure exceeding buyer demand at fair value levels.
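For reference, the premium/discount reading can be reproduced with the built-in session-anchored VWAP; whether the published script uses this exact call is an assumption, and symbols without volume data (the "N/A" case above) will not produce a value.
// Daily VWAP premium/discount sketch (requires volume data on the symbol)
dvwap    = ta.vwap(hlc3)                       // session-anchored volume-weighted average price
vwap_pct = (close - dvwap) / dvwap * 100       // positive = trading at a premium, negative = discount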
# Row 15: TP SL System Configuration - Dynamic vs Static Target Management
Display Format: "DYNAMIC ATR" | "STATIC %"
Color Coding: Aqua (Dynamic ATR Mode) | Yellow (Static Percentage Mode)
• DYNAMIC ATR Mode Analysis: Dynamic ATR mode implements systematic volatility-adaptive target management where all profit targets and stop losses automatically scale based on current market volatility through ATR (Average True Range) calculations. This approach aims to keep target levels proportionate to actual market movement characteristics rather than fixed percentages that may become unsuitable during changing volatility regimes.
• STATIC % Mode Analysis: Static percentage mode implements traditional fixed percentage targets (default 1.0%/2.5%/3.8%/4.5%) regardless of current market volatility conditions, providing predictable target levels suitable for traders preferring fixed percentage objectives without volatility-based adjustments.
# Row 16: TP Sequence Progression - Systematic Achievement Tracking
Display Format: "1 ✓ 2 ✓ 3 ○" | "1 ○ 2 ○ 3 ○" | Progressive Achievement Display
Color Coding: White text with systematic achievement progression
Status Indicator: ✓ (Achievement Confirmed) | ○ (Target Not Achieved)
• Complete Achievement Sequence (1 ✓ 2 ✓ 3 ✓): Complete sequence achievement represents significant position performance with systematic profit realization across all primary target levels, indicating favorable signal quality and effective market timing.
• Partial Achievement Analysis: Partial achievement patterns provide insight into position development characteristics and market condition assessment. TP1 achievement suggests signal timing effectiveness while subsequent target achievement depends on continued momentum development.
• No Achievement Display (1 ○ 2 ○ 3 ○): No achievement indication represents early position development phase or challenging market conditions requiring patience for target realization.
# Row 17: Mission Duration Tracking - Time-Based Position Management
Display Format: "XX/XXX" (Current Bars/Maximum Duration Limit)
Color Coding: Green (<50% Duration) | Orange (50-80% Duration) | Red (>80% Duration)
• Normal Duration Periods (Green Status <50%): Normal duration indicates position development within expected timeframes based on signal characteristics and market conditions, representing healthy position progression without time pressure concerns.
• Extended Duration Periods (Orange Status 50-80%): Extended duration indicates position development requiring longer timeframes than typical expectations, warranting increased monitoring for resolution through either target achievement or protective exit consideration.
• Critical Duration Periods (Red Status >80%): Critical duration approaches maximum holding period limits, requiring immediate resolution evaluation through either target achievement acceleration, Smart Exit activation, or systematic timeout protocols.
# Row 18: Last Exit Analysis - Historical Exit Pattern Assessment
Display Format: Exit Reason with Color-Coded Classification
Color Coding: Lime (TP Exits) | Red (Critical Exits) | Yellow (Stop Losses) | Purple (Smart Low) | Orange (Timeout/Sustained)
• Profit-Taking Exits (Lime/Green): TP1/TP2/TP3/Final Target exits indicate position management with systematic profit realization suggesting signal quality and strategy performance.
• Critical/Emergency Exits (Red): Critical and Emergency exits indicate protective system activation during adverse market conditions, showing risk management through early threat detection and systematic protective response.
• Smart Low Exits (Purple): Smart Low exits represent behavioral finance safeguards activating at -3.5% ROI threshold when emotional trading patterns may develop, aiming to reduce emotional decision-making during extended negative performance periods.
# Row 19: Fast Danger Assessment - Immediate Threat Detection System
Display Format: "X.X/10" (Danger Score out of 10)
Color Coding: Green (<3.0 Safe) | Yellow (3.0-5.0 Moderate) | Red (>5.0 High Danger)
The Fast Danger Assessment system provides real-time evaluation of immediate market threats through six independent measurement systems: SAR distance deterioration, momentum reversal detection, extreme CCI readings, volatility spike analysis, price action intensity, and combined threat evaluation.
• Safe Conditions (Green <3.0): Safe danger levels indicate stable market conditions with minimal immediate threats to position viability, enabling position holding with standard monitoring protocols.
• Moderate Concern (Yellow 3.0-5.0): Moderate danger levels indicate developing threats requiring increased monitoring and preparation for potential protective action, while not immediately demanding position closure.
• High Danger (Red >5.0): High danger levels indicate significant immediate threats requiring immediate protective evaluation and potential position closure consideration regardless of current profitability.
# Row 20: Holding Confidence Evaluation - Position Viability Assessment
Display Format: "X.X/10" (Confidence Score out of 10)
Color Coding: Green (>6.0 High Confidence) | Yellow (3.0-6.0 Moderate Confidence) | Red (<3.0 Low Confidence)
Holding Confidence evaluation provides systematic assessment of position viability through analysis of trend strength maintenance, formation quality persistence, momentum sustainability, and overall market condition favorability for continued position development.
• High Confidence (Green >6.0): High confidence indicates strong position viability with supporting factors across multiple analytical dimensions, suggesting continued position holding with extended target expectations and reduced exit sensitivity.
• Moderate Confidence (Yellow 3.0-6.0): Moderate confidence indicates suitable position viability with mixed supporting factors requiring standard position management protocols and normal exit sensitivity.
• Low Confidence (Red <3.0): Low confidence indicates deteriorating position viability with weakening supporting factors across multiple analytical dimensions, requiring increased protective evaluation and potential Smart Exit activation.
# Row 21: Volatility | Market Status - Volatility Environment & Market Filter Status
Display Format: "NORMAL | NORMAL" | "HIGH | HIGH VOL" | "EXTREME | NEWS FILTER"
Color Coding: White (Information display)
Volatility Classification Component (Left Side):
- DEAD: ATR ratio <0.8x average, minimal price movement requiring careful timing
- LOW: ATR ratio 0.8-1.2x average, stable conditions enabling position increase potential
- NORMAL: ATR ratio 1.2-1.8x average, typical market behavior with standard parameters
- HIGH: ATR ratio 1.8-2.5x average, elevated movement requiring increased caution
- EXTREME: ATR ratio >2.5x average, chaotic conditions triggering enhanced protection
Market Status Component (Right Side):
- NORMAL: Standard market conditions, no special filters active
- HIGH VOL: High volatility detected, position reduction and exit sensitivity increased
- EXTREME VOL: Extreme volatility confirmed, enhanced protective protocols engaged
- NEWS FILTER: Major economic event detected, 80% position reduction active
- GAP MODE: Weekend gap identified, increased caution until normal flow resumes
Combined Status Interpretation:
- NORMAL | NORMAL: Suitable trading conditions, standard strategy operation
- HIGH | HIGH VOL: Elevated volatility confirmed by both systems, 40% position reduction
- EXTREME | EXTREME VOL: Extreme volatility confirmed by both systems, 70% position reduction active
📊VISUAL SYSTEM INTEGRATION
Chart Analysis & Market Visualization
CCI SMART Buffer Zone Visualization System - Dynamic Support/Resistance Framework
Dynamic Zone Architecture: The CCI SMART buffer system represents systematic visual integration creating adaptive support and resistance zones that automatically expand and contract based on current market volatility through ALMA-smoothed true range calculations. These dynamic zones provide real-time support and resistance levels that adapt to evolving market conditions rather than static horizontal lines that quickly become obsolete.
Adaptive Color Intensity Algorithm: The buffer visualization employs color intensity algorithms where transparency and saturation automatically adjust based on CCI momentum strength and directional persistence. Stronger momentum conditions produce more opaque visual representations with increased saturation, while weaker momentum creates subtle transparency indicating reduced prominence or significance.
Color Interpretation Framework for Strategic Decision Making:
• Intense Blue/Purple (High Opacity): Strong CCI readings exceeding ±80 with notable momentum strength indicating support/resistance zones suitable for increased position management decisions
• Moderate Blue/Purple (Medium Opacity): Standard CCI readings ranging ±40-80 with normal momentum indicating support/resistance areas for standard position management protocols
• Faded Blue/Purple (High Transparency): Weak CCI readings below ±40 with minimal momentum suggesting cautious interpretation and conservative position management approaches
• Dynamic Color Transitions: Automatic real-time shifts between bullish (blue spectrum) and bearish (purple spectrum) based on CCI trend direction and momentum persistence characteristics
CCI Inflection Circle System - Momentum Reversal Identification: The inflection detection system creates distinctive visual alerts through dual-circle design combining solid cores with transparent glow effects for enhanced visibility across different chart backgrounds and timeframe configurations.
Inflection Circle Classification:
• Neon Green Circles: CCI extreme bullish inflection detected (>80 threshold) with systematic core + glow effect indicating bearish reversal warning for position management evaluation
• Hot Pink Circles: CCI extreme bearish inflection detected (<-80 threshold) with dual-layer visualization indicating bullish reversal opportunity for strategic entry consideration
• Dual-Circle Design Architecture: Solid tiny core providing location identification with large transparent glow ensuring visibility without chart obstruction across multiple timeframe analyses
SAR Visual Network - Multi-Layer Momentum Display Architecture
SAR Visualization Framework: The SAR visual system implements structured multi-layer display architecture incorporating trend lines, strength classification markers, and momentum analysis through various visual elements that automatically adapt to current momentum conditions and strength characteristics.
SAR Strength Visual Classification System:
• Bright Triangles (High Intensity): Strong SAR momentum exceeding 75% of calculated strength range, indicating significant momentum quality suitable for increased positioning considerations and extended target scenarios
• Standard Circles (Medium Intensity): Moderate SAR momentum within 25-75% strength range, representing normal momentum development appropriate for standard positioning approaches and regular target expectations
• Faded Markers (Low Intensity): Weak SAR momentum below 25% strength range, suggesting caution and conservative positioning during minimal momentum conditions with increased exit sensitivity
⚠️IMPORTANT DISCLAIMERS AND RISK WARNINGS
Past Performance Limitations: The backtesting results presented represent hypothetical performance based on historical market data and do not guarantee future results. All trading involves substantial risk of loss. This strategy is provided for informational purposes and does not constitute financial advice. No trading strategy can guarantee 100% success or eliminate the risk of loss.
Users must approach trading with appropriate caution, never risking more than they can afford to lose.
Users are responsible for their own trading decisions, risk management, and compliance with applicable regulations in their jurisdiction.
Categorical Market Morphisms (CMM) - Where Abstract Algebra Transcends Reality
A Revolutionary Application of Category Theory and Homotopy Type Theory to Financial Markets
Bridging Pure Mathematics and Market Analysis Through Functorial Dynamics
Theoretical Foundation: The Mathematical Revolution
Traditional technical analysis operates on Euclidean geometry and classical statistics. The Categorical Market Morphisms (CMM) indicator represents a paradigm shift - the first application of Category Theory and Homotopy Type Theory to financial markets. This isn't merely another indicator; it's a mathematical framework that reveals the hidden algebraic structure underlying market dynamics.
Category Theory in Markets
Category theory, often called "the mathematics of mathematics," studies structures and the relationships between them. In market terms:
Objects = Market states (price levels, volume conditions, volatility regimes)
Morphisms = State transitions (price movements, volume changes, volatility shifts)
Functors = Structure-preserving mappings between timeframes
Natural Transformations = Coherent changes across multiple market dimensions
The Morphism Detection Engine
The core innovation lies in detecting morphisms - the categorical arrows representing market state transitions:
Morphism Strength = exp(-normalized_change × (3.0 / sensitivity))
Threshold = 0.3 - (sensitivity - 1.0) × 0.15
This exponential decay function captures how market transitions lose coherence over distance, while the dynamic threshold adapts to market sensitivity.
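In Pine terms, the two formulas above can be sketched as follows; normalizing the bar-to-bar change by ATR is an assumption about how normalized_change is obtained.
// Morphism strength and adaptive threshold sketch (normalization method assumed)
sensitivity       = 1.0
normalized_change = math.abs(close - close[1]) / ta.atr(14)
morph_strength    = math.exp(-normalized_change * (3.0 / sensitivity))
morph_threshold   = 0.3 - (sensitivity - 1.0) * 0.15
morphism_exists   = morph_strength > morph_threshold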
Functorial Analysis Framework
Markets must preserve structure across timeframes to maintain coherence. Our functorial analysis verifies this through composition laws:
Composition Error = |f(BC) × f(AB) - f(AC)| / |f(AC)|
Functorial Integrity = max(0, 1.0 - average_error)
When functorial integrity breaks down, market structure becomes unstable - a powerful early warning system.
Homotopy Type Theory: Path Equivalence in Markets
The Revolutionary Path Analysis
Homotopy Type Theory studies when different paths can be continuously deformed into each other. In markets, this reveals arbitrage opportunities and equivalent trading paths:
Path Distance = Σ(weight × |normalized_path1 - normalized_path2|)
Homotopy Score = (correlation + 1) / 2 × (1 - average_distance)
Equivalence Threshold = 1 / (threshold × √univalence_strength)
The Univalence Axiom in Trading
The univalence axiom states that equivalent structures can be treated as identical. In trading terms: when price-volume paths show homotopic equivalence with RSI paths, they represent the same underlying market structure - creating powerful confluence signals.
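As a concrete, simplified example of that confluence idea, the sketch below compares a normalized price path with a normalized RSI path using the formulas above; the min-max normalization, equal weights, and 21-bar window are assumptions.
// Path distance and homotopy score sketch for a price path vs. an RSI path (window and weights assumed)
len = 21
norm(src) =>
    lo = ta.lowest(src, len)
    hi = ta.highest(src, len)
    (src - lo) / math.max(hi - lo, 1e-10)
p1 = norm(close)
p2 = norm(ta.rsi(close, 14))
path_dist = math.abs(p1 - p2)                   // per-bar distance term with weight = 1
avg_dist  = ta.sma(path_dist, len)
corr      = ta.correlation(p1, p2, len)
homotopy_score = (corr + 1) / 2 * (1 - avg_dist)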
Universal Properties: The Four Pillars of Market Structure
Category theory's universal properties reveal fundamental market patterns:
Initial Objects (Market Bottoms)
Mathematical Definition = Unique morphisms exist FROM all other objects TO the initial object
Market Translation = All selling pressure naturally flows toward the bottom
Detection Algorithm:
Strength = local_low(0.3) + oversold(0.2) + volume_surge(0.2) + momentum_reversal(0.2) + morphism_flow(0.1)
Signal = strength > 0.4 AND morphism_exists
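Read as code, the weighted detection above could look like the sketch below. Each component test (local-low window, RSI oversold level, volume surge factor, reversal pattern) is an assumption standing in for the script's internal definitions, and morph_strength / morphism_exists reuse the morphism sketch shown earlier.
// Initial Object (bottom) strength sketch; component definitions are illustrative
local_low  = low == ta.lowest(low, 20) ? 1.0 : 0.0                  // fresh 20-bar low
oversold   = ta.rsi(close, 14) < 30 ? 1.0 : 0.0                     // oversold momentum
vol_surge  = volume > ta.sma(volume, 20) * 1.5 ? 1.0 : 0.0          // participation spike
mom_rev    = close > close[1] and close[1] < close[2] ? 1.0 : 0.0   // simple V-shaped reversal bar
initial_strength = local_low * 0.3 + oversold * 0.2 + vol_surge * 0.2 + mom_rev * 0.2 + morph_strength * 0.1
initial_signal   = initial_strength > 0.4 and morphism_exists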
Terminal Objects (Market Tops)
Mathematical Definition = Unique morphisms exist FROM the terminal object TO all others
Market Translation = All buying pressure naturally flows away from the top
Product Objects (Market Equilibrium)
Mathematical Definition = Universal property combining multiple objects into balanced state
Market Translation = Price, volume, and volatility achieve multi-dimensional balance
Coproduct Objects (Market Divergence)
Mathematical Definition = Universal property representing branching possibilities
Market Translation = Market bifurcation points where multiple scenarios become possible
Consciousness Detection: Emergent Market Intelligence
The most groundbreaking feature detects market consciousness - when markets exhibit self-awareness through fractal correlations:
Consciousness Level = Σ(correlation_levels × weights) × fractal_dimension
Fractal Score = log(range_ratio) / log(memory_period)
Multi-Scale Awareness:
Micro = Short-term price-SMA correlations
Meso = Medium-term structural relationships
Macro = Long-term pattern coherence
Volume Sync = Price-volume consciousness
Volatility Awareness = ATR-change correlations
When consciousness_level > threshold, markets display emergent intelligence - self-organizing behavior that transcends simple mechanical responses.
Advanced Input System: Precision Configuration
Categorical Universe Parameters
Universe Level (Type_n) = Controls categorical complexity depth
Type 1 = Price only (pure price action)
Type 2 = Price + Volume (market participation)
Type 3 = + Volatility (risk dynamics)
Type 4 = + Momentum (directional force)
Type 5 = + RSI (momentum oscillation)
Sector Optimization:
Crypto = 4-5 (high complexity, volume crucial)
Stocks = 3-4 (moderate complexity, fundamental-driven)
Forex = 2-3 (low complexity, macro-driven)
Morphism Detection Threshold = Golden ratio optimized (φ⁻¹ ≈ 0.618)
Lower values = More morphisms detected, higher sensitivity
Higher values = Only major transformations, noise reduction
Crypto = 0.382-0.618 (high volatility accommodation)
Stocks = 0.618-1.0 (balanced detection)
Forex = 1.0-1.618 (macro-focused)
Functoriality Tolerance = φ⁻⁴ ≈ 0.146 (golden-ratio derived)
Controls = composition error tolerance
Trending markets = 0.1-0.2 (strict structure preservation)
Ranging markets = 0.2-0.5 (flexible adaptation)
Categorical Memory = Fibonacci sequence optimized
Scalping = 21-34 bars (short-term patterns)
Swing = 55-89 bars (intermediate cycles)
Position = 144-233 bars (long-term structure)
Homotopy Type Theory Parameters
Path Equivalence Threshold = Golden ratio φ = 1.618
Volatile markets = 2.0-2.618 (accommodate noise)
Normal conditions = 1.618 (balanced)
Stable markets = 0.786-1.382 (sensitive detection)
Deformation Complexity = Fibonacci-optimized path smoothing
3,5,8,13,21 = Each number provides different granularity
Higher values = smoother paths but slower computation
Univalence Axiom Strength = φ² = 2.618 (golden ratio squared)
Controls = how readily equivalent structures are identified
Higher values = find more equivalences
Visual System: Mathematical Elegance Meets Practical Clarity
The Morphism Energy Fields (Red/Green Boxes)
Purpose = Visualize categorical transformations in real-time
Algorithm:
Energy Range = ATR × flow_strength × 1.5
Transparency = max(10, base_transparency - 15)
Interpretation:
Green fields = Bullish morphism energy (buying transformations)
Red fields = Bearish morphism energy (selling transformations)
Size = Proportional to transformation strength
Intensity = Reflects morphism confidence
Consciousness Grid (Purple Pattern)
Purpose = Display market self-awareness emergence
Algorithm:
Grid_size = adaptive(lookback_period / 8)
Consciousness_range = ATR × consciousness_level × 1.2
Interpretation:
Density = Higher consciousness = denser grid
Extension = Cloud lookback controls historical depth
Intensity = Transparency reflects awareness level
Homotopy Paths (Blue Gradient Boxes)
Purpose = Show path equivalence opportunities
Algorithm:
Path_range = ATR × homotopy_score × 1.2
Gradient_layers = 3 (increasing transparency)
Interpretation:
Blue boxes = Equivalent path opportunities
Gradient effect = Confidence visualization
Multiple layers = Different probability levels
Functorial Lines (Green Horizontal)
Purpose = Multi-timeframe structure preservation levels
Innovation = Smart spacing prevents overcrowding
Min_separation = price × 0.001 (0.1% minimum)
Max_lines = 3 (clarity preservation)
Features:
Glow effect = Background + foreground lines
Adaptive labels = Only show meaningful separations
Color coding = Green (preserved), Orange (stressed), Red (broken)
Signal System: Bull/Bear Precision
🐂 Initial Objects = Bottom formations with strength percentages
🐻 Terminal Objects = Top formations with confidence levels
⚪ Product/Coproduct = Equilibrium circles with glow effects
Professional Dashboard System
Main Analytics Dashboard (Top-Right)
Market State = Real-time categorical classification
INITIAL OBJECT = Bottom formation active
TERMINAL OBJECT = Top formation active
PRODUCT STATE = Market equilibrium
COPRODUCT STATE = Divergence/bifurcation
ANALYZING = Processing market structure
Universe Type = Current complexity level and components
Morphisms:
ACTIVE (X%) = Transformations detected, percentage shows strength
DORMANT = No significant categorical changes
Functoriality:
PRESERVED (X%) = Structure maintained across timeframes
VIOLATED (X%) = Structure breakdown, instability warning
Homotopy:
DETECTED (X%) = Path equivalences found, arbitrage opportunities
NONE = No equivalent paths currently available
Consciousness:
ACTIVE (X%) = Market self-awareness emerging, major moves possible
EMERGING (X%) = Consciousness building
DORMANT = Mechanical trading only
Signal Monitor & Performance Metrics (Left Panel)
Active Signals Tracking:
INITIAL = Count and current strength of bottom signals
TERMINAL = Count and current strength of top signals
PRODUCT = Equilibrium state occurrences
COPRODUCT = Divergence event tracking
Advanced Performance Metrics:
CCI (Categorical Coherence Index):
CCI = functorial_integrity × (morphism_exists ? 1.0 : 0.5)
STRONG (>0.7) = High structural coherence
MODERATE (0.4-0.7) = Adequate coherence
WEAK (<0.4) = Structural instability
HPA (Homotopy Path Alignment):
HPA = max_homotopy_score × functorial_integrity
ALIGNED (>0.6) = Strong path equivalences
PARTIAL (0.3-0.6) = Some equivalences
WEAK (<0.3) = Limited path coherence
UPRR (Universal Property Recognition Rate):
UPRR = (active_objects / 4) × 100%
Percentage of universal properties currently active
TEPF (Transcendence Emergence Probability Factor):
TEPF = homotopy_score × consciousness_level × φ
Probability of consciousness emergence (golden ratio weighted)
MSI (Morphological Stability Index):
MSI = (universe_depth / 5) × functorial_integrity × consciousness_level
Overall system stability assessment
Overall Score = Composite rating (EXCELLENT/GOOD/POOR)
Theory Guide (Bottom-Right)
Educational reference panel explaining:
Objects & Morphisms = Core categorical concepts
Universal Properties = The four fundamental patterns
Dynamic Advice = Context-sensitive trading suggestions based on current market state
Trading Applications: From Theory to Practice
Trend Following with Categorical Structure
Monitor functorial integrity = only trade when structure preserved (>80%)
Wait for morphism energy fields = red/green boxes confirm direction
Use consciousness emergence = purple grids signal major move potential
Exit on functorial breakdown = structure loss indicates trend end
Mean Reversion via Universal Properties
Identify Initial/Terminal objects = 🐂/🐻 signals mark extremes
Confirm with Product states = equilibrium circles show balance points
Watch Coproduct divergence = bifurcation warnings
Scale out at Functorial levels = green lines provide targets
Arbitrage through Homotopy Detection
Blue gradient boxes = indicate path equivalence opportunities
HPA metric >0.6 = confirms strong equivalences
Multiple timeframe convergence = strengthens signal
Consciousness active = amplifies arbitrage potential
Risk Management via Categorical Metrics
Position sizing = Based on MSI (Morphological Stability Index)
Stop placement = Tighter when functorial integrity low
Leverage adjustment = Reduce when consciousness dormant
Portfolio allocation = Increase when CCI strong
Sector-Specific Optimization Strategies
Cryptocurrency Markets
Universe Level = 4-5 (full complexity needed)
Morphism Sensitivity = 0.382-0.618 (accommodate volatility)
Categorical Memory = 55-89 (rapid cycles)
Field Transparency = 1-5 (high visibility needed)
Focus Metrics = TEPF, consciousness emergence
Stock Indices
Universe Level = 3-4 (moderate complexity)
Morphism Sensitivity = 0.618-1.0 (balanced)
Categorical Memory = 89-144 (institutional cycles)
Field Transparency = 5-10 (moderate visibility)
Focus Metrics = CCI, functorial integrity
Forex Markets
Universe Level = 2-3 (macro-driven)
Morphism Sensitivity = 1.0-1.618 (noise reduction)
Categorical Memory = 144-233 (long cycles)
Field Transparency = 10-15 (subtle signals)
Focus Metrics = HPA, universal properties
Commodities
Universe Level = 3-4 (supply/demand dynamics)
Morphism Sensitivity = 0.618-1.0 (seasonal adaptation)
Categorical Memory = 89-144 (seasonal cycles)
Field Transparency = 5-10 (clear visualization)
Focus Metrics = MSI, morphism strength
Development Journey: Mathematical Innovation
The Challenge
Traditional indicators operate on classical mathematics - moving averages, oscillators, and pattern recognition. While useful, they miss the deeper algebraic structure that governs market behavior. Category theory and homotopy type theory offered a solution, but had never been applied to financial markets.
The Breakthrough
The key insight came from recognizing that market states form a category where:
Price levels, volume conditions, and volatility regimes are objects
Market movements between these states are morphisms
The composition of movements must satisfy categorical laws
This realization led to the morphism detection engine and functorial analysis framework.
Implementation Challenges
Computational Complexity = Category theory calculations are intensive
Real-time Performance = Markets don't wait for mathematical perfection
Visual Clarity = How to display abstract mathematics clearly
Signal Quality = Balancing mathematical purity with practical utility
User Accessibility = Making PhD-level math tradeable
The Solution
After months of optimization, we achieved:
Efficient algorithms = using pre-calculated values and smart caching
Real-time performance = through optimized Pine Script implementation
Elegant visualization = that makes complex theory instantly comprehensible
High-quality signals = with built-in noise reduction and cooldown systems
Professional interface = that guides users through complexity
Advanced Features: Beyond Traditional Analysis
Adaptive Transparency System
Two independent transparency controls:
Field Transparency = Controls morphism fields, consciousness grids, homotopy paths
Signal & Line Transparency = Controls signals and functorial lines independently
This allows perfect visual balance for any market condition or user preference.
Smart Functorial Line Management
Prevents visual clutter through:
Minimum separation logic = Only shows meaningfully separated levels
Maximum line limit = Caps at 3 lines for clarity
Dynamic spacing = Adapts to market volatility
Intelligent labeling = Clear identification without overcrowding
Consciousness Field Innovation
Adaptive grid sizing = Adjusts to lookback period
Gradient transparency = Fades with historical distance
Volume amplification = Responds to market participation
Fractal dimension integration = Shows complexity evolution
Signal Cooldown System
Prevents overtrading through:
20-bar default cooldown = Configurable 5-100 bars
Signal-specific tracking = Independent cooldowns for each signal type
Counter displays = Shows historical signal frequency
Performance metrics = Track signal quality over time
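As a rough illustration of the cooldown idea (not the indicator's actual Pine Script), a per-signal-type cooldown can be sketched in Python as follows; the 20-bar default comes from the description above, while the class and method names are hypothetical.

```python
# Illustrative sketch of a per-signal-type cooldown filter.
# The 20-bar default is taken from the description above; names are hypothetical.

class SignalCooldown:
    def __init__(self, cooldown_bars=20):
        self.cooldown_bars = cooldown_bars
        self.last_fired = {}   # signal type -> bar index of last emission
        self.counters = {}     # signal type -> total signals emitted

    def allow(self, signal_type, bar_index):
        """Return True if enough bars have passed since the last signal of this type."""
        last = self.last_fired.get(signal_type)
        if last is not None and bar_index - last < self.cooldown_bars:
            return False
        self.last_fired[signal_type] = bar_index
        self.counters[signal_type] = self.counters.get(signal_type, 0) + 1
        return True

cooldown = SignalCooldown(cooldown_bars=20)
print(cooldown.allow("initial_object", 100))   # True  (first signal of this type)
print(cooldown.allow("initial_object", 110))   # False (only 10 bars later)
print(cooldown.allow("terminal_object", 110))  # True  (independent per signal type)
```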
Performance Metrics: Quantifying Excellence
Signal Quality Assessment
Initial Object Accuracy = >78% in trending markets
Terminal Object Precision = >74% in overbought/oversold conditions
Product State Recognition = >82% in ranging markets
Consciousness Prediction = >71% for major moves
Computational Efficiency
Real-time processing = <50ms calculation time
Memory optimization = Efficient array management
Visual performance = Smooth rendering at all timeframes
Scalability = Handles multiple universes simultaneously
User Experience Metrics
Setup time = <5 minutes to productive use
Learning curve = Accessible to intermediate+ traders
Visual clarity = No information overload
Configuration flexibility = 25+ customizable parameters
Risk Disclosure and Best Practices
Important Disclaimers
The Categorical Market Morphisms indicator applies advanced mathematical concepts to market analysis but does not guarantee profitable trades. Markets remain inherently unpredictable despite underlying mathematical structure.
Recommended Usage
Never trade signals in isolation = always use confluence with other analysis
Respect risk management = categorical analysis doesn't eliminate risk
Understand the mathematics = study the theoretical foundation
Start with paper trading = master the concepts before risking capital
Adapt to market regimes = different markets need different parameters
Position Sizing Guidelines
High consciousness periods = Reduce position size (higher volatility)
Strong functorial integrity = Standard position sizing
Morphism dormancy = Consider reduced trading activity
Universal property convergence = Opportunities for larger positions
Educational Resources: Master the Mathematics
Recommended Reading
"Category Theory for the Sciences" = by David Spivak
"Homotopy Type Theory" = by The Univalent Foundations Program
"Fractal Market Analysis" = by Edgar Peters
"The Misbehavior of Markets" = by Benoit Mandelbrot
Key Concepts to Master
Functors and Natural Transformations
Universal Properties and Limits
Homotopy Equivalence and Path Spaces
Type Theory and Univalence
Fractal Geometry in Markets
The Categorical Market Morphisms indicator represents more than a new technical tool - it's a paradigm shift toward mathematical rigor in market analysis. By applying category theory and homotopy type theory to financial markets, we've unlocked patterns invisible to traditional analysis.
This isn't just about better signals or prettier charts. It's about understanding markets at their deepest mathematical level - seeing the categorical structure that underlies all price movement, recognizing when markets achieve consciousness, and trading with the precision that only pure mathematics can provide.
Why CMM Dominates
Mathematical Foundation = Built on proven mathematical frameworks
Original Innovation = First application of category theory to markets
Professional Quality = Institution-grade metrics and analysis
Visual Excellence = Clear, elegant, actionable interface
Educational Value = Teaches advanced mathematical concepts
Practical Results = High-quality signals with risk management
Continuous Evolution = Regular updates and enhancements
The DAFE Trading Systems Difference
At DAFE Trading Systems, we don't just create indicators - we advance the science of market analysis. Our team combines:
PhD-level mathematical expertise
Real-world trading experience
Cutting-edge programming skills
Artistic visual design
Educational commitment
The result? Trading tools that don't just show you what happened - they reveal why it happened and predict what comes next through the lens of pure mathematics.
"In mathematics you don't understand things. You just get used to them." - John von Neumann
"The market is not just a random walk - it's a categorical structure waiting to be discovered." - DAFE Trading Systems
Trade with Mathematical Precision. Trade with Categorical Market Morphisms.
Created with a passion for mathematical excellence and a commitment to empowering traders through mathematical innovation.
— Dskyz, Trade with insight. Trade with anticipation.
Kijun Shifting Band Oscillator | QuantMAC
🎯 Kijun Shifting Band Oscillator | QuantMAC
📊 **Revolutionary Technical Analysis Tool Combining Ancient Ichimoku Wisdom with Cutting-Edge Statistical Methods**
🌟 Overview
The Kijun Shifting Band Oscillator represents a sophisticated fusion of traditional Japanese technical analysis and modern statistical theory. Built upon the foundational concepts of the Ichimoku Kinko Hyo system, this indicator transforms the classic Kijun-sen (base line) into a dynamic, multi-dimensional analysis tool that provides traders with unprecedented market insights.
This advanced oscillator doesn't just show you where price has been – it reveals the underlying momentum dynamics and volatility patterns that drive market movements, giving you a statistical edge in your trading decisions.
🔥 Key Features & Innovations
Dual Trading Modes for Maximum Flexibility: 🚀
Long/Short Mode: Full bidirectional trading capability for aggressive traders seeking to capitalize on both bullish and bearish market conditions
Long/Cash Mode: Conservative approach perfect for risk-averse traders, taking long positions during uptrends and moving to cash during downtrends (avoiding short exposure)
Advanced Visual Intelligence: 🎨
9 Professional Color Schemes: From classic blue/navy to vibrant orange/purple combinations, each optimized for different chart backgrounds and personal preferences
Dynamic Gradient Histogram: Color intensity reflects oscillator strength, providing instant visual feedback on momentum magnitude
Intelligent Overlay Bands: Semi-transparent fills create clear visual boundaries without cluttering your chart
Smart Candle Coloring: Real-time color changes reflect current market state and trend direction
Customizable Threshold Lines: Clearly marked entry and exit levels with contrasting colors
Professional-Grade Analytics: 📊
Real-Time Performance Metrics: Live calculation of 9 key performance indicators
Risk-Adjusted Returns: Sharpe, Sortino, and Omega ratios for comprehensive performance evaluation
Position Sizing Guidance: Half-Kelly percentage for optimal risk management
Drawdown Analysis: Maximum drawdown tracking for risk assessment
📈 Deep Technical Foundation
Kijun-Based Mathematical Framework: 🧮
The indicator begins with the traditional Kijun-sen calculation but extends it significantly:
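Below is a minimal Python sketch of that starting point, assuming the published defaults (Kijun length 30, SD length 36, SD multiplier 2.1, oscillator multiplier 100). The Kijun-sen midpoint and the standard-deviation bands follow the text directly; the oscillator line is only one plausible construction (the price's position inside the bands), since the indicator's exact formula is not published.

```python
# Sketch of the Kijun-sen midpoint plus standard-deviation bands described above.
# The oscillator formula here is an assumption, not the published implementation.
import pandas as pd

def kijun_band_oscillator(df, kijun_len=30, sd_len=36, sd_mult=2.1, osc_mult=100):
    kijun = (df["high"].rolling(kijun_len).max() + df["low"].rolling(kijun_len).min()) / 2
    sd = df["close"].rolling(sd_len).std()
    upper = kijun + sd_mult * sd
    lower = kijun - sd_mult * sd
    # Assumed oscillator: relative position of price inside the bands, scaled to 0..osc_mult
    osc = osc_mult * (df["close"] - lower) / (upper - lower)
    return pd.DataFrame({"kijun": kijun, "upper": upper, "lower": lower, "osc": osc})

# Usage with OHLC data in a DataFrame that has "high", "low", and "close" columns:
# bands = kijun_band_oscillator(ohlc_df)
```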
Statistical Enhancements: 📉
Adaptive Volatility: Bands expand and contract based on market volatility
Momentum Filtering: EMA smoothing of oscillator for trend confirmation
State Management: Intelligent signal filtering prevents whipsaws and false signals
Multi-Timeframe Compatibility: Optimized algorithms work across all timeframes
⚙️ Comprehensive Parameter Control
Kijun Core Settings: 🎛️
Kijun Length (Default: 30): Controls the lookback period for the base calculation. Shorter periods = more responsive, longer periods = smoother signals
Source Selection: Choose from Close, Open, High, Low, or HL2. Close price recommended for most applications
Calculation Method: Uses traditional Ichimoku methodology ensuring compatibility with classic analysis
Advanced Oscillator Configuration: 📊
Standard Deviation Length (Default: 36): Determines volatility measurement period. Affects band width and sensitivity
SD Multiplier (Default: 2.1): Fine-tune band distance from basis line. Higher values = wider bands, lower values = tighter bands
Oscillator Multiplier (Default: 100): Scales the final oscillator output. Useful for matching other indicators or personal preference
Smoothing Algorithm: Built-in EMA smoothing prevents noise while maintaining responsiveness
Signal Threshold Optimization: 🎯
Long Threshold (Default: 83): Oscillator level that triggers long entries. Higher values = fewer but stronger signals
Short Threshold (Default: 42): Oscillator level that triggers short entries. Lower values = fewer but stronger signals
Threshold Logic: Crossover-based system with state management prevents signal overlap
Customization Range: Fully adjustable to match your trading style and risk tolerance
Precision Date Control: 📅
Start Date/Month/Year: Precise backtesting control down to the day
Historical Analysis: Test strategies on specific market periods or events
Strategy Validation: Isolate performance during different market conditions
📊 Professional Metrics Dashboard
Risk Assessment Metrics: 💼
Maximum Drawdown %: Largest peak-to-trough decline in portfolio value. Critical for understanding worst-case scenarios and position sizing
Sortino Ratio: Risk-adjusted return measure focusing only on downside volatility. Superior to Sharpe ratio for asymmetric return distributions
Sharpe Ratio: Classic risk-adjusted performance metric. Values above 1.0 considered good, above 2.0 excellent
Omega Ratio: Probability-weighted ratio capturing all moments of return distribution. More comprehensive than Sharpe or Sortino
Performance Analytics: 📈
Profit Factor: Gross Profit ÷ Gross Loss. Values above 1.0 indicate profitability, above 2.0 considered excellent
Win Rate %: Percentage of profitable trades. Consider alongside average win/loss size for complete picture
Net Profit %: Total return on initial capital. Accounts for compounding effects
Total Trades: Sample size for statistical significance assessment
Advanced Position Sizing: 🎯
Half Kelly %: Optimal position size based on Kelly Criterion, reduced by 50% for safety margin
Risk Management: Helps determine appropriate position size relative to account equity
Mathematical Foundation: Based on win probability and profit factor calculations
Practical Application: Directly usable percentage for position sizing decisions
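For reference, a hedged Python sketch of the Kelly math behind this metric is shown below. The standard Kelly fraction uses the win rate and the average win/loss ratio, and the dashboard reports half of it as a safety margin; how the indicator derives the ratio internally (for example from the profit factor) is an assumption here.

```python
# Standard Kelly fraction: f* = p - (1 - p) / b, where p is the win rate and
# b is the average win / average loss ratio. Half-Kelly is simply f* / 2.

def half_kelly(win_rate, avg_win, avg_loss):
    """Return the half-Kelly position size as a fraction of equity (can be negative)."""
    if avg_loss <= 0:
        raise ValueError("avg_loss must be positive")
    b = avg_win / avg_loss                      # payoff ratio
    full_kelly = win_rate - (1 - win_rate) / b
    return full_kelly / 2

# Example: 55% win rate, average win 1.5x the average loss
print(f"Half Kelly: {half_kelly(0.55, 1.5, 1.0):.1%}")   # ~12.5%
```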
🎨 Advanced Display Options
Flexible Interface Design: 🖥️
6 Positioning Options: Top/Bottom/Middle × Left/Right combinations for optimal chart organization
Toggle Functionality: Show/hide metrics table for clean chart presentation during analysis
Color Coordination: Metrics table colors match selected oscillator color scheme
Professional Styling: Clean, readable format with proper spacing and alignment
Visual Hierarchy: 🎭
Oscillator Histogram: Primary focus with gradient intensity showing momentum strength
Threshold Lines: Clear horizontal references for entry/exit levels
Zero Line: Neutral reference point for trend bias determination
Background Bands: Subtle overlay context without chart clutter
🚀 Advanced Signal Generation System
Multi-Layer Signal Logic: ⚡
Primary Signal Generation: Oscillator crossover above Long Threshold (default 83) triggers long entries
Exit Signal Processing: Oscillator crossunder below Short Threshold (default 42) triggers position exits
State Management System: Prevents duplicate signals and ensures clean position transitions
Mode-Specific Logic: Different behavior for Long/Short vs Long/Cash modes
Date Range Filtering: Signals only generated within specified backtesting period
Confirmation Requirements: Bar confirmation prevents false signals from intrabar price spikes
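The following Python sketch illustrates the crossover/crossunder state logic described above with the default thresholds (83/42). It is a simplified illustration of the state-management idea, not the indicator's source code, and it omits bar confirmation and date filtering.

```python
# Simplified state machine: enter long on a crossover above the long threshold,
# exit on a crossunder below the short threshold, with no duplicate signals.

def generate_signals(osc, long_th=83.0, short_th=42.0):
    """Return a list of (index, action) tuples from an oscillator series."""
    signals, in_long = [], False
    for i in range(1, len(osc)):
        crossed_up = osc[i - 1] <= long_th < osc[i]
        crossed_down = osc[i - 1] >= short_th > osc[i]
        if crossed_up and not in_long:          # enter only when flat (no duplicates)
            signals.append((i, "long_entry"))
            in_long = True
        elif crossed_down and in_long:          # exit only while in a position
            signals.append((i, "exit"))
            in_long = False
    return signals

print(generate_signals([40, 60, 85, 90, 70, 41, 30]))
# [(2, 'long_entry'), (5, 'exit')]
```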
Intelligent Position Management: 🧠
Entry Tracking: Precise entry price recording for accurate P&L calculations
Position State Monitoring: Continuous tracking of long/short/cash positions
Automatic Exit Logic: Seamless position closure and new position initiation
Performance Calculation: Real-time P&L tracking with compounding effects
📉📈 Comprehensive Band Interpretation Guide
Dynamic Band Analysis: 🔍
Upper Band Function: Represents dynamic resistance based on recent volatility. Price approaching upper band suggests potential reversal or breakout
Lower Band Function: Represents dynamic support with volatility adjustment. Price near lower band indicates oversold conditions or support testing
Middle Line (Basis): Trend direction indicator. Price above = bullish bias, price below = bearish bias
Band Width Interpretation: Wide bands = high volatility, narrow bands = low volatility/potential breakout setup
Band Slope Analysis: Rising bands = strengthening trend, falling bands = weakening trend
Oscillator Interpretation: 📊
Values Above 50: Price in upper half of recent range, bullish momentum
Values Below 50: Price in lower half of recent range, bearish momentum
Extreme Values (>80 or <20): Overbought/oversold conditions, potential reversal zones
Momentum Divergence: Oscillator direction vs price direction for early reversal signals
Trend Confirmation: Oscillator direction confirming or contradicting price trends
💡 Strategic Trading Applications
Primary Trading Strategies: 🎯
Trend Following: Use threshold crossovers to capture major directional moves. Best in trending markets with clear directional bias
Mean Reversion: Identify extreme oscillator readings for counter-trend opportunities. Effective in range-bound markets
Breakout Trading: Monitor band compressions followed by expansions for breakout signals
Swing Trading: Combine oscillator signals with band interactions for swing position entries/exits
Risk Management: Use metrics dashboard for position sizing and risk assessment
Market Condition Optimization: 🌊
Trending Markets: Increase threshold separation for fewer, stronger signals
Choppy Markets: Decrease threshold separation for more responsive signals
High Volatility: Increase SD multiplier for wider bands
Low Volatility: Decrease SD multiplier for tighter bands and earlier signals
⚙️ Advanced Configuration Tips
Parameter Optimization Guidelines: 🔧
Kijun Length Adjustment: Shorter periods (10-20) for faster signals, longer periods (50-100) for smoother trends
SD Length Tuning: Match to your trading timeframe - shorter for responsive, longer for stability
Threshold Calibration: Backtest different levels to find optimal entry/exit points for your market
Color Scheme Selection: Choose schemes that provide best contrast with your chart background and other indicators
Integration with Other Indicators: 🔗
Volume Indicators: Confirm oscillator signals with volume spikes
Support/Resistance: Use key levels to filter oscillator signals
Momentum Indicators: RSI, MACD confirmation for signal strength
Trend Indicators: Moving averages for overall trend bias confirmation
⚠️ Important Usage Notes & Limitations
Indicator Characteristics: ⚡
Lagging Nature: Based on historical price data - signals occur after moves have begun
Best Practice: Combine with leading indicators and price action analysis
Market Dependency: Performance varies across different market conditions and instruments
Backtesting Essential: Always validate parameters on historical data before live implementation
Optimization Recommendations: 🎯
Parameter Testing: Systematically test different combinations on your preferred instruments
Walk-Forward Analysis: Regularly re-optimize parameters to maintain effectiveness
Market Regime Awareness: Adjust parameters for different market conditions (trending vs ranging)
Risk Controls: Implement maximum drawdown limits and position size controls
🔧 Technical Specifications
Performance Optimization: ⚡
Efficient Algorithms: Optimized calculations for smooth real-time operation
Memory Management: Smart array handling for metrics calculations
Visual Optimization: Balanced detail vs performance for responsive charts
Multi-Symbol Ready: Consistent performance across different assets
---
The Kijun Shifting Band Oscillator represents the evolution of technical analysis, bridging the gap between traditional methods and modern quantitative approaches. This indicator provides traders with a comprehensive toolkit for market analysis, combining the intuitive wisdom of Japanese candlestick analysis with the precision of statistical mathematics.
🎯 Designed for serious traders who demand professional-grade analysis tools with institutional-quality metrics and risk management capabilities. Whether you're a discretionary trader seeking visual confirmation or a systematic trader building quantitative strategies, this indicator provides the foundation for informed trading decisions.
⚠️ IMPORTANT DISCLAIMER
Past Performance Warning: 📉⚠️
PAST PERFORMANCE IS NOT INDICATIVE OF FUTURE RESULTS. Historical backtesting results, while useful for strategy development and parameter optimization, do not guarantee similar performance in live trading conditions. Market conditions change continuously, and what worked in the past may not work in the future.
Remember: Successful trading requires discipline, continuous learning, and adaptation to changing market conditions. No indicator or strategy guarantees profits, and all trading involves substantial risk of loss.
mrD Open Interest
Introduction
"mrD Open Interest" is a technical analysis reference tool that can help investors monitor and analyze Open Interest data from various cryptocurrency exchanges. This indicator provides insights into Open Interest data through patterns, bursts, and money flow based on proprietary algorithms.
Important Note
Trading always involves risk and can lead to capital loss. This indicator should only be used as a supplementary tool in technical analysis and should not be considered as an accurate forecasting tool or the sole basis for trading decisions. Past results do not guarantee future results.
Proprietary Features of the Indicator
"mrD Open Interest" has been developed with several proprietary features, qualifying it for source code protection when published:
- Unique Multi-Source Integration Algorithm: The indicator uses a smart aggregation method to combine OI data from multiple exchanges, creating a holistic view of market pressure that is not dependent on a single exchange. This method employs special weighting and noise filtering to ensure the aggregated data accurately reflects market conditions.
- Proprietary OI-Price Correlation Analysis Algorithm: Unlike traditional OI indicators that simply display OI values, this indicator uses a complex algorithm to analyze the correlation between price movements and OI changes. This algorithm automatically identifies four money flow patterns (Buy Inflow, Sell Inflow, Buy Outflow, Sell Outflow) and ranks them by potential market impact.
- Advanced Burst Detection Technology: The proprietary algorithm identifies "bursts" - sudden changes in OI that can lead to significant market volatility. This system relies not only on absolute change but also analyzes the rate of change, amplitude, and correlation with historical peaks/troughs to determine the significance of a burst.
- Integrated Smart Alert System: The indicator features a smart alert algorithm, only sending notifications when patterns with high statistical significance are detected, reducing "alert noise" and helping users focus on the most potential opportunities.
- Visual Representation Technology: The user interface design uses proprietary visual representation technology, allowing users to easily identify important patterns and signals through a special system of colors, icons, and display formats.
Features That May Assist
1. Reference Data from Multiple Exchanges: The indicator can collect Open Interest information from various exchanges (Binance, BitMEX, Kraken) and different currency pairs (USDT, USD, BUSD), potentially providing investors with more information about the market.
2. Money Flow Pattern Analysis: The indicator suggests 4 patterns that may help identify market conditions:
Buy Inflow: Potential opening of new long positions (price up, OI up)
Buy Outflow: Potential closing of long positions (price down, OI down)
Sell Inflow: Potential opening of new short positions (price down, OI up)
Sell Outflow: Potential closing of short positions (price up, OI down)
Burst Identification: The indicator attempts to detect "bursts" - notable changes in Open Interest that may signal changes in money flow. Bursts are divided into two types: Up Burst and Down Burst.
3. Price-OI Correlation Reference: The tool provides information about the relationship between price movement and OI changes, potentially helping to assess whether current price momentum is supported by new money flow.
4. Diverse Display Modes: The indicator offers 3 display modes (Columns; Candles; Columns and Price Line) that may suit different analytical approaches.
Setup and Usage Guide
1. Basic Setup
Select Data Sources (Exchange Settings):
By default, the indicator uses data from Binance USDT Perpetual.
Depending on the coin pair and exchange you're interested in, you can enable/disable different data sources (Binance USD, BUSD, BitMEX USD, USDT, or Kraken).
Recommendation: For popular coins like BTC or ETH, consider combining data from 2-3 major exchanges for a more comprehensive view.
2. Display Customization (Visuals Settings):
OI Display Type: Choose a display type that suits your analysis style:
"Columns": Column format, making it easy to identify OI changes.
"Candles": Candle format, similar to price charts, helps identify candlestick patterns in OI.
"Columns and Price Line": Combines OI columns and price line, helping directly compare OI with price movements.
Show background: Enable to highlight burst periods with a colored background (recommended when using candle mode).
Show signals: Enable to display burst indicators on the chart (recommended to keep enabled).
Text Color: Customize text color to match your chart background.
3. Alert Settings:
Choose alert types that suit your trading strategy:
"Inflows Only": Only alerts when new money flows into the market.
"Outflows Only": Only alerts when money flows out of the market.
"Bursts Only": Only alerts when there's a strong burst in OI.
"All": Alerts for all the above events.
Effective Usage
Trend Analysis Based on Money Flow Patterns:
Buy Inflow (Green): When the price increases along with OI, it may indicate new buying pressure. Can be considered as a supportive signal for an uptrend.
Sell Inflow (Red): When price decreases along with increasing OI, it may indicate new selling pressure. Can be considered as a supportive signal for a downtrend.
Buy Outflow (Teal): When price decreases but OI also decreases, it may indicate taking profit/cutting loss from long positions. Usually not strong selling pressure and may be ending soon.
Sell Outflow (Dark Red): When the price increases but OI decreases, it may indicate closing of short positions. Usually not strong buying pressure and may be ending soon.
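The four patterns reduce to sign combinations of the price change and the OI change, as in this small Python sketch; the indicator's own weighting, noise filtering, and ranking logic are proprietary and not reproduced here.

```python
# Sign-based classification of the four money-flow patterns listed above.
# Real bar-to-bar price and OI deltas would be supplied by the data feed.

def classify_flow(price_change, oi_change):
    if price_change > 0 and oi_change > 0:
        return "Buy Inflow"    # new longs opening
    if price_change < 0 and oi_change > 0:
        return "Sell Inflow"   # new shorts opening
    if price_change < 0 and oi_change < 0:
        return "Buy Outflow"   # longs closing
    if price_change > 0 and oi_change < 0:
        return "Sell Outflow"  # shorts closing
    return "Neutral"

print(classify_flow(+120.0, +5_000))   # Buy Inflow
print(classify_flow(-80.0, -12_000))   # Buy Outflow
```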
Burst Analysis:
Up Burst: Strong and positive change in OI, most notable when occurring in a Buy Inflow pattern, may signal strong buying money flow into the market.
Down Burst: Strong and negative change in OI, most notable when occurring in a Sell Inflow pattern, may signal strong selling money flow into the market.
Bursts are often signals that deserve special attention and may indicate strong changes in market sentiment.
Using the Information Table:
Monitor "Aggregated OI" to capture the total amount of open contracts.
Pay attention to "OI Change (%)" to assess the degree of change compared to the previous candle.
"Relative OI" provides information about the relative level of OI compared to the average.
"Flow Type" indicates the current money flow pattern.
"Burst Status" displays the burst status if any.
Combining with Other Indicators:
Use in combination with trend indicators (MA, MACD) to confirm trends.
Combine with volume indicators for a more comprehensive view of market activity.
Reference additional momentum indicators to assess trend strength.
Customizing According to Timeframe:
Short timeframes (1m-15m): May show more noise signals.
Medium timeframes (30m-4h): Often provide a good balance between sensitivity and noise filtering.
Long timeframes (D-W): Suitable for monitoring long-term OI trends.
Flux Charts - SFX Screener
💎 GENERAL OVERVIEW
The SFX Screener by Flux Charts is a multi-timeframe market scanner that extracts and visually organizes key conditions detected by the SFX Algo indicator across multiple assets in real-time. It does not perform independent analysis or generate new signals—instead, it pulls data directly from the SFX Algo’s calculations to ensure full alignment across different timeframes and tickers.
The SFX Algo is a multi-factor trading indicator that integrates trend analysis, signal generation, market overlays, and take-profit/stop-loss levels into a single system. It evaluates multiple trend components, including EMA direction, momentum shifts, and volatility cycles, to determine market conditions. Signal generation is based on an Adjusted Weighted Majority Algorithm, filtering out weaker signals by prioritizing the most reliable market indicators. Market overlays, such as Volatility Bands and the Retracement Wave, provide dynamic support, resistance, exit points, and entry points. Its adaptable structure allows traders to customize settings based on strategy preferences, making it effective for scalping, swing trading, and long-term trend analysis.
The SFX Screener’s purpose is to give traders a dashboard view of these SFX Algo signals across multiple tickers and timeframes in real-time.
📌 HOW DOES IT WORK ?
The SFX Algo indicator employs an Adjusted Weighted Majority algorithm to generate "buy" and "sell" signals. It evaluates multiple market indicators ("experts"), including momentum, ATR trends, and EMA trends, and assigns weights based on their recent performance. The "Time Weighting" setting allows users to balance between using more historical data or prioritizing recent trends. Unlike traditional weighted majority methods, SFX also dynamically penalizes larger losses. Signals are confirmed based on the consensus of the most successful indicators within the selected time period, filtering out weaker signals during underperforming phases.
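To make the general idea concrete, here is a hedged Python sketch of a weighted-majority voter with loss-scaled penalties. The SFX Algo's actual expert set, penalty schedule, and time-weighting scheme are proprietary; the expert names and the beta parameter below are illustrative assumptions only.

```python
# Generic weighted-majority voting with a penalty that scales with loss size,
# illustrating the concept described above (not the SFX Algo's internals).

def weighted_majority_vote(predictions, weights):
    """predictions: dict expert -> +1 (buy) or -1 (sell); weights: dict expert -> float."""
    score = sum(weights[e] * p for e, p in predictions.items())
    return "buy" if score > 0 else "sell"

def update_weights(weights, predictions, realized_return, beta=0.5):
    """Penalize experts that were wrong, scaling the penalty by the size of the loss."""
    outcome = 1 if realized_return > 0 else -1
    for expert, pred in predictions.items():
        if pred != outcome:
            penalty = beta * min(1.0, abs(realized_return))  # larger losses -> larger penalty
            weights[expert] *= (1.0 - penalty)
    return weights

weights = {"momentum": 1.0, "atr_trend": 1.0, "ema_trend": 1.0}
preds = {"momentum": +1, "atr_trend": +1, "ema_trend": -1}
print(weighted_majority_vote(preds, weights))     # buy
weights = update_weights(weights, preds, -0.02)   # the move actually went down 2%
print(weights)                                    # ema_trend unchanged, the others penalized
```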
The SFX Screener extracts these calculated outputs and visually organizes them into a real-time dashboard. Each signal, status, and volatility condition displayed in the screener is a direct output from the SFX Algo indicator.
🚩 UNIQUENESS
Unlike traditional screeners that rely on preset filters or static conditions, the SFX Screener dynamically updates its dashboard based on live outputs from the SFX Algo’s adaptive algorithm.
Traditional Screeners → Use predefined filters like “price above EMA” or “RSI overbought.” They do not adjust to market dynamics.
SFX Screener → Displays outputs directly from an adaptive algorithm that continuously evaluates trends, volatility, and momentum changes.
The SFX Screener can show SFX Algo's status on 8 different tickers on different timeframes. Key factors that make it unique include:
✅ Real-time sync with SFX Algo → Displays live conditions, not static filters.
✅ Comprehensive Dashboard – This screener provides a complete and customizable dashboard designed to enhance traders' decision-making by consolidating crucial SFX Algo insights into one user-friendly interface.
✅ Multi-Ticker & Multi-Timeframe Analysis – With support for up to 8 tickers and timeframes, traders can effortlessly analyze the bigger market picture, identifying trends and opportunities across different assets and timeframes.
By combining multiple analytical elements in a single view, this screener empowers traders with the insights needed to navigate the market more effectively.
🎯 SFX SCREENER FEATURES:
SFX Algo Signals : This tool can detect SFX Algo signals across different tickers & timeframes.
Volatility Bands : Detection of Volatility Bands Status & Retests.
Retracement Wave : Detection of Retracement Wave Status & Retests.
Highly Configurable : Offers multiple parameters for fine-tuning detection settings.
Up to 8 Tickers : Allows traders to analyze multiple tickers & timeframes simultaneously for enhanced accuracy.
📊 SFX SCREENER DATA BREAKDOWN
Signal ->
Buy -> The latest signal is a buy signal.
Sell -> The latest signal is a sell signal.
The rating of the signal is shown after the signal type.
Δ⭐ ->
Shows the rating change (delta) after the signal is triggered. Positive values mean that the rating is increased after the signal is given, negative values mean that it's decreased.
Status ->
Displays the amount of time passed after the signal is given.
TP Targets ->
Shows the Take-Profit targets of the signal. If a target was achieved, a ✅ symbol appears next to it and the next target is displayed.
V. Bands ->
The Volatility Bands dynamically adjust to market conditions, expanding during high volatility and contracting during low volatility. When the volatility bands are tight, or the upper and lower bands are close to each other, the market is not volatile. During periods of low volatility, it’s common for price to consolidate or move sideways. An early indication of a large price move can occur when the bands widen or open up after being tight. When the volatility bands are wide, it reflects a period of increased volatility, typically during strong price trends or after a breakout. The volatility bands can also act as support and resistance areas. The upper band acts as resistance while the lower band acts as support. These mark out good areas for potential reversals. Breakouts can also occur when price moves beyond the bands, signaling a potential trend in the breakout direction.
Outside -> The price is currently outside of the Volatility Bands.
Inside | Upper -> The price is currently inside the Upper Volatility Band.
Inside | Lower -> The price is currently inside the Lower Volatility Band.
R. Wave ->
The Retracement Wave is used to identify entry points during pullbacks in trending markets. It can also be used to find exit points for open trades. The wave is bullish when price is above it and bearish when the price is below it. The retracement wave can be used as an area to enter during a pullback in a trending market. The wave can also be helpful for managing risk and closing out positions.
Outside | Bullish -> The Retracement Wave is currently Bullish, and the price is outside of it.
Outside | Bearish -> The Retracement Wave is currently Bearish, and the price is outside of it.
Inside | Bullish -> The Retracement Wave is currently Bullish, and the price is inside of it.
Inside | Bearish -> The Retracement Wave is currently Bearish, and the price is inside of it.
Profit & Loss (P&L) ->
Shows the amount of profit or loss the position is currently in. All values are shown in terms of percentage, and positive values mean the position is in profit while negative values mean that the position is in loss.
⚠ Timeframe Restriction : The selected timeframes for analysis cannot be lower than the chart’s current timeframe to ensure proper data alignment.
⏰ ALERTS
This screener supports alerts, so you never miss a key market move. You can choose to receive alerts when a buy or sell signal is given, helping you spot potential trading opportunities. Additionally, you can enable alerts for take-profit or stop-loss levels, which notify you when the price achieves those levels. The alerts will work for each enabled ticker in the settings. You can also toggle webhook format for alerts, and choose to include ticker metadata in it.
⚙️ SETTINGS
1. Algorithm Settings
Sensitivity: The sensitivity setting is a key parameter that influences the frequency of signals the SFX Algo generates. By adjusting this parameter, you can control the frequency of signals produced by the algorithm. Using a lower sensitivity setting generates more frequent signals that are highly responsive to minor price fluctuations. Using a higher sensitivity setting reduces the frequency of signals, focusing on more significant price movements and filtering out minor fluctuations.
Signal Strength: The Signal Strength setting filters signals based on their quality, allowing traders to focus on the most reliable opportunities. This feature helps traders balance the quantity and reliability of the algorithm’s signals to suit their trading strategy. Using a lower signal strength will display more signals, including those with lower signal ratings, for broader market coverage. Using a higher signal strength will display fewer signals by prioritizing those with higher signal ratings, reducing market noise.
Time Weighting: The Time Weighting setting in the SFX Algo determines how historical market data is analyzed to generate signals.
a) Recent Trends
Focuses on the most recent movements for short-term analysis. This setting is good for scalpers and intraday traders who need to react quickly to market changes.
b) Mixed Trends
Balances recent and historical price movements for a comprehensive market view. This setting is well-suited for swing traders and those who want to capture medium-term opportunities by combining the benefits of short-term responsiveness with the reliability of long-term trends.
c) Long-term Trends
Relies on extended historical market data to identify broader market trends, making it an excellent choice for traders focused on long-term strategies.
Minimum Star Rating : The Minimum Star Rating setting allows you to filter signals based on their strength, showing only those that meet or exceed your chosen threshold. For instance, setting the minimum star rating to 3 ensures you only receive signals with a rating of 3 stars or higher.
2. Take Profit / Stop Loss Methods
Key Levels
The Key Levels method uses pivot points to set take profit and stop-loss levels. The TP and SL levels are shown when a new signal is generated.
Volatility Bands
This TP/SL method uses the Volatility Bands overlay to set dynamic TP and SL levels. These levels are not predetermined so they will not be shown in advance when a signal is generated.
Signal Rating
Sets take profit and stop-loss levels based on changes in a signal's rating strength. These levels are not predetermined so they will not be shown in advance when a signal is generated.
Auto Stop-Loss
The auto method can only be applied to the SL. The auto method allows the algorithm to detect SL automatically when a momentum shift is detected. You can adjust the risk tolerance of the Auto SL by adjusting the ‘Auto Risk Tolerance’ setting. You can choose between Low, Medium, and High. A high-risk tolerance will result in stop losses being triggered less often.
3. Tickers
You can set, then enable or disable up to 8 tickers in this section to get informed about their latest SFX Algo signal.
‼️ Important Notes
TradingView has limitations when running advanced screeners, resulting in the following restrictions:
Computation Errors:
The computation of using MTF features and viewing several tickers is very intensive on TradingView. This can sometimes cause calculation timeouts. When this occurs simply force the recalculation by modifying one indicator’s settings or by removing the indicator and adding it to your chart again.
Inconsistencies:
You may notice inconsistencies when viewing the screener on a chart with a specific symbol because screener tickers originate from different markets. Since the cryptocurrency market operates 24/7, while stock markets have defined opening and closing hours, the screener may return varying information depending on whether you're currently viewing a cryptocurrency, stock, or currency pair.
Goertzel Adaptive JMA T3
Hello Fellas,
The Goertzel Adaptive JMA T3 is a powerful indicator that combines my own Goertzel adaptive length algorithm with Jurik and T3 Moving Averages. The primary intention of the indicator is to demonstrate the new adaptive length algorithm by applying it to bleeding-edge MAs.
It is usable like any moving average, and the new Goertzel adaptive length algorithm can be used to make your own indicators Goertzel adaptive.
Used Adaptive Length Algorithms
Normalized Goertzel Power: This uses the normalized power of the Goertzel algorithm to compute an adaptive length without the special operations, like detrending, Ehlers uses for his DFT adaptive length.
Ehlers Mod: This uses the Goertzel algorithm instead of the DFT, originally used by Ehlers, to compute a modified version of his original approach, which sticks as close as possible to the original approach.
Scoring System
The scoring system determines if bars are red or green and collects them.
Then, it goes through all collected red and green bars and checks how big they are and if they are above or below the selected MA. It is positive when green bars are under MA or when red bars are above MA.
Then, it accumulates the size for all positive green bars and for all positive red bars. The same happens for negative green and red bars.
Finally, it calculates the score by ((positiveGreenBars + positiveRedBars) / (negativeGreenBars + negativeRedBars)) * 100 with the scale 0–100.
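A literal Python transcription of these steps looks like the sketch below. It uses the bar close to decide whether a bar sits above or below the MA, which is a simplifying assumption, and it mirrors the ratio formula exactly as written above.

```python
# Literal transcription of the scoring description: bar colour, bar size,
# position relative to the MA, and the final ratio. The published script's
# exact normalisation may differ; this only mirrors the text.

def score_bars(opens, closes, ma):
    pos, neg = 0.0, 0.0
    for o, c, m in zip(opens, closes, ma):
        size = abs(c - o)
        green = c > o
        above_ma = c > m
        # positive when a green bar closes under the MA or a red bar closes above it
        if (green and not above_ma) or (not green and above_ma):
            pos += size
        else:
            neg += size
    return (pos / neg) * 100 if neg > 0 else float("inf")

print(score_bars([10, 11, 12], [11, 10.5, 12.4], [11.5, 11.0, 12.2]))   # ≈ 111.1
```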
Signals
Is the price above MA? -> bullish market
Is the price below MA? -> bearish market
Usage
Adjust the settings to reach the highest score, and enjoy an outstanding adaptive MA.
It should be useable on all timeframes. It is recommended to use the indicator on the timeframe where you can get the highest score.
Now, follows a bunch of knowledge for people who don't know about the concepts used here.
T3
The T3 moving average, short for "Tim Tillson's Triple Exponential Moving Average," is a technical indicator used in financial markets and technical analysis to smooth out price data over a specific period. It was developed by Tim Tillson, a software project manager at Hewlett-Packard, with expertise in Mathematics and Computer Science.
The T3 moving average is an enhancement of the traditional Exponential Moving Average (EMA) and aims to overcome some of its limitations. The primary goal of the T3 moving average is to provide a smoother representation of price trends while minimizing lag compared to other moving averages like Simple Moving Average (SMA), Weighted Moving Average (WMA), or EMA.
To compute the T3 moving average, it involves a triple smoothing process using exponential moving averages. Here's how it works:
Calculate the first exponential moving average (EMA1) of the price data over a specific period 'n.'
Calculate the second exponential moving average (EMA2) of EMA1 using the same period 'n.'
Calculate the third exponential moving average (EMA3) of EMA2 using the same period 'n.'
The formula for the T3 moving average is as follows:
T3 = 3 * (EMA1) - 3 * (EMA2) + (EMA3)
By applying this triple smoothing process, the T3 moving average is intended to offer reduced noise and improved responsiveness to price trends. It achieves this by incorporating multiple time frames of the exponential moving averages, resulting in a more accurate representation of the underlying price action.
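A minimal Python sketch of this triple smoothing, using the formula exactly as given above, is shown below. Note that Tillson's full T3 also includes a volume factor, which this sketch deliberately omits; it only mirrors the simplified formula in the text.

```python
# Triple-EMA smoothing per the formula above: T3 = 3*EMA1 - 3*EMA2 + EMA3.
import pandas as pd

def ema(series, n):
    return series.ewm(span=n, adjust=False).mean()

def t3_as_described(close, n=10):
    e1 = ema(close, n)
    e2 = ema(e1, n)
    e3 = ema(e2, n)
    return 3 * e1 - 3 * e2 + e3

prices = pd.Series([100, 101, 103, 102, 105, 107, 106, 108, 110, 109, 111])
print(t3_as_described(prices, n=5).round(2).tolist())
```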
JMA
The Jurik Moving Average (JMA) is a technical indicator used in trading to predict price direction. Developed by Mark Jurik, it’s a type of weighted moving average that gives more weight to recent market data rather than past historical data.
JMA is known for its superior noise elimination. It’s a causal, nonlinear, and adaptive filter, meaning it responds to changes in price action without introducing unnecessary lag. This makes JMA a world-class moving average that tracks and smooths price charts or any market-related time series with surprising agility.
In comparison to other moving averages, such as the Exponential Moving Average (EMA), JMA is known to track fast price movement more accurately. This allows traders to apply their strategies to a more accurate picture of price action.
Goertzel Algorithm
The Goertzel algorithm is a technique in digital signal processing (DSP) for efficient evaluation of individual terms of the Discrete Fourier Transform (DFT). It's particularly useful when you need to compute a small number of selected frequency components. Unlike direct DFT calculations, the Goertzel algorithm applies a single real-valued coefficient at each iteration, using real-valued arithmetic for real-valued input sequences. This makes it more numerically efficient when computing a small number of selected frequency components.
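For readers who want to see the recursion, here is a small Python sketch of the Goertzel evaluation of a single DFT bin; it is a generic textbook implementation, not the indicator's Pine Script.

```python
# Goertzel recursion for a single frequency bin k of a window of length N
# (frequency = k / N cycles per sample), returning the squared bin magnitude.
import math

def goertzel_power(samples, k):
    n = len(samples)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # squared magnitude of the k-th DFT bin
    return s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2

# A pure 5-cycle sine over 64 samples concentrates its power in bin k = 5
wave = [math.sin(2 * math.pi * 5 * i / 64) for i in range(64)]
print(round(goertzel_power(wave, 5), 1))   # large (≈ 1024)
print(round(goertzel_power(wave, 9), 6))   # ≈ 0
```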
Discrete Fourier Transform
The Discrete Fourier Transform (DFT) is a mathematical technique used in signal processing to convert a finite sequence of equally-spaced samples of a function into a same-length sequence of equally-spaced samples of the discrete-time Fourier transform (DTFT), which is a complex-valued function of frequency. The DFT provides a frequency domain representation of the original input sequence.
Usage of DFT/Goertzel In Adaptive Length Algorithms
Adaptive length algorithms are automated trading systems that can dynamically adjust their parameters in response to real-time market data. This adaptability enables them to optimize their trading strategies as market conditions fluctuate. Both the Goertzel algorithm and DFT can be used in these algorithms to analyze market data and detect cycles or patterns, which can then be used to adjust the parameters of the trading strategy.
The Goertzel algorithm is more efficient than the DFT when you need to compute a small number of selected frequency components. However, for covering a full spectrum, the Goertzel algorithm has a higher order of complexity than fast Fourier transform (FFT) algorithms.
I hope this can help you somehow.
Thanks for reading, and keep it up.
Best regards,
simwai
---
Credits to:
@ClassicScott
@yatrader2
@cheatcountry
@loxx
All Support and Resistance Levels [PINESCRIPTLABS]
First, we observe the Light Blue Macro Supports and the Pink Macro Resistances. These channels are automatically formed based on market data, identifying pivot points in price history and determining the strength of these levels based on the number of pivot points within these same channels. When the price interacts with the macro Supports, we have a strong reaction that we can take advantage of in two ways:
1. The first and most common, as we can see in the chart, is that these zones elicit a strong reaction, and the price respects the channel. For us, as traders, it signifies a pivot point where we can initiate a trade, either a buy at the macro Support or a sell at the macro Resistance.
2. The second way to use them, for which this algorithm is also prepared, is in case a movement occurs where the price breaks these Macro Supports or Macro Resistances. We have a special alert that will notify us because when these macro channels are broken, they tend to do so violently in a move that we can also capitalize on. Usually, when such a breakout occurs, we will visit the next support or resistance channel, which can bring us significant benefits.
The following complex and highly accurate calculation provided by this indicator allows us to work with price supports and resistances within the internal structure of the macro channels. As we can see in the chart, "boxes" are formed that represent the detected support and resistance areas. The algorithm also detects breakouts when the price crosses below a support "box" or above a resistance "box" and displays labels on the chart, in real time, indicating when the breakout occurred. Something very special happens here as well: on some occasions the breakout occurs but the price returns to the support or resistance "box", and the algorithm detects this too. At that moment, a label appears on the chart indicating a possible confirmation of the breakout. In other words, since the price initially broke out but returned to the "box", the algorithm will notify us with another label and a special alert when the price finally confirms the breakout.
At the same time, we can see in the chart that the algorithm also provides us with a volume profile that allows us to see where the most trading activity has concentrated based on price levels. We can also use it to identify support and resistance levels based on the point of control (POC) and value area levels. As we can see in the chart, there are labels with the exact price where the highest volume was traded. The top label in the chart shows the highest price, and the last label we see is for the lowest price. These displayed labels are within the defined range of retrocession or Lookback Length, which we can configure in our indicator. As we observe, the algorithm shows a strong confluence between the Macro Support channels and the volume profile labels, confirming the strongest areas of the range.
Finally, after calculating supports and resistances from three different perspectives, the algorithm provides us with a macro view of the price in the form of trend lines. In other words, it shows us supports and resistances in the form of diagonal channels where we can see trends in the market and areas where the price has historically encountered difficulties in advancing or retreating, which we can corroborate with the supports and resistances mentioned at the beginning.
As we can see in the chart, the algorithm also shows us labels with the exact price where angular price supports and resistances are located. These calculations are very important as they provide a trend perspective, and we can get an idea of where the price is headed, combining these with the other support and resistance calculations.
Remember that all the previous calculations have their own alerts for when supports or resistances are broken, or in the case of new channels being created, also when there is a breakout of a box or a confirmation of a breakout.
The second type of alert from the indicator is configured to make our indicators work for us without the need to be present on the chart, thanks to special programming within the indicator's code. It will execute automatic buys and sells on our preferred exchange through an alert configured for the 3Commas bot. All you need to do is input your Bot ID, provided by 3Commas, into the alert. All premium indicators come with a configuration explanation that will guide you in detail on where to input your Bot ID.
Normalized, Variety, Fast Fourier Transform Explorer [Loxx]
Normalized, Variety, Fast Fourier Transform Explorer demonstrates Real, Cosine, and Sine Fast Fourier Transform algorithms. This indicator can be used as a rule of thumb but shouldn't be used in trading.
What is the Discrete Fourier Transform?
In mathematics, the discrete Fourier transform (DFT) converts a finite sequence of equally-spaced samples of a function into a same-length sequence of equally-spaced samples of the discrete-time Fourier transform (DTFT), which is a complex-valued function of frequency. The interval at which the DTFT is sampled is the reciprocal of the duration of the input sequence. An inverse DFT is a Fourier series, using the DTFT samples as coefficients of complex sinusoids at the corresponding DTFT frequencies. It has the same sample-values as the original input sequence. The DFT is therefore said to be a frequency domain representation of the original input sequence. If the original sequence spans all the non-zero values of a function, its DTFT is continuous (and periodic), and the DFT provides discrete samples of one cycle. If the original sequence is one cycle of a periodic function, the DFT provides all the non-zero values of one DTFT cycle.
What is the Complex Fast Fourier Transform?
The complex Fast Fourier Transform algorithm transforms N real or complex numbers into another N complex numbers. The complex FFT transforms a real or complex signal x in the time domain into a complex two-sided spectrum X in the frequency domain. You must remember that zero frequency corresponds to n = 0, positive frequencies 0 < f < f_c correspond to values 1 ≤ n ≤ N/2 − 1, while negative frequencies −f_c < f < 0 correspond to N/2 + 1 ≤ n ≤ N − 1. The value n = N/2 corresponds to both f = f_c and f = −f_c. f_c is the critical or Nyquist frequency with f_c = 1/(2*T), or half the sampling frequency. The first harmonic X corresponds to the frequency 1/(N*T).
The complex FFT requires the list of values (resolution, or N) to be a power of 2. If the input size is not a power of 2, the input data will be padded with zeros up to the next power of 2.
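A hedged NumPy sketch of this padding behaviour is shown below; it mirrors the stated behaviour (zero-pad up to the next power of 2, then take a real-input transform) rather than the indicator's Pine Script code.

```python
# Zero-pad a real signal up to the next power of 2, then take its one-sided spectrum.
import numpy as np

def next_pow2(n):
    p = 1
    while p < n:
        p *= 2
    return p

def padded_rfft(x):
    x = np.asarray(x, dtype=float)
    target = next_pow2(len(x))
    padded = np.pad(x, (0, target - len(x)))   # zero-pad on the right
    return np.fft.rfft(padded)                 # one-sided spectrum of real input

spectrum = padded_rfft(np.random.randn(100))   # padded from 100 to 128 samples
print(len(spectrum))                           # 65 bins = 128 // 2 + 1
```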
What is Real-Fast Fourier Transform?
Has conditions similar to the complex Fast Fourier Transform, except that the input data must be purely real. If the time series data has the basic type complex64, only the real parts of the complex numbers are used for the calculation. The imaginary parts are silently discarded.
What is the Real-Fast Fourier Transform?
In many applications, the input data for the DFT are purely real, in which case the outputs satisfy the symmetry
X(N−k) = X(k)*, where * denotes complex conjugation,
and efficient FFT algorithms have been designed for this situation (see e.g. Sorensen, 1987). One approach consists of taking an ordinary algorithm (e.g. Cooley–Tukey) and removing the redundant parts of the computation, saving roughly a factor of two in time and memory. Alternatively, it is possible to express an even-length real-input DFT as a complex DFT of half the length (whose real and imaginary parts are the even/odd elements of the original real data), followed by O(N) post-processing operations.
It was once believed that real-input DFTs could be more efficiently computed by means of the discrete Hartley transform (DHT), but it was subsequently argued that a specialized real-input DFT algorithm (FFT) can typically be found that requires fewer operations than the corresponding DHT algorithm (FHT) for the same number of inputs. Bruun's algorithm (above) is another method that was initially proposed to take advantage of real inputs, but it has not proved popular.
There are further FFT specializations for the cases of real data that have even/odd symmetry, in which case one can gain another factor of roughly two in time and memory and the DFT becomes the discrete cosine/sine transform(s) (DCT/DST). Instead of directly modifying an FFT algorithm for these cases, DCTs/DSTs can also be computed via FFTs of real data combined with O(N) pre- and post-processing.
What is the Discrete Cosine Transform?
A discrete cosine transform (DCT) expresses a finite sequence of data points in terms of a sum of cosine functions oscillating at different frequencies. The DCT, first proposed by Nasir Ahmed in 1972, is a widely used transformation technique in signal processing and data compression. It is used in most digital media, including digital images (such as JPEG and HEIF, where small high-frequency components can be discarded), digital video (such as MPEG and H.26x), digital audio (such as Dolby Digital, MP3 and AAC), digital television (such as SDTV, HDTV and VOD), digital radio (such as AAC+ and DAB+), and speech coding (such as AAC-LD, Siren and Opus). DCTs are also important to numerous other applications in science and engineering, such as digital signal processing, telecommunication devices, reducing network bandwidth usage, and spectral methods for the numerical solution of partial differential equations.
The use of cosine rather than sine functions is critical for compression, since it turns out (as described below) that fewer cosine functions are needed to approximate a typical signal, whereas for differential equations the cosines express a particular choice of boundary conditions. In particular, a DCT is a Fourier-related transform similar to the discrete Fourier transform (DFT), but using only real numbers. The DCTs are generally related to Fourier Series coefficients of a periodically and symmetrically extended sequence whereas DFTs are related to Fourier Series coefficients of only periodically extended sequences. DCTs are equivalent to DFTs of roughly twice the length, operating on real data with even symmetry (since the Fourier transform of a real and even function is real and even), whereas in some variants the input and/or output data are shifted by half a sample. There are eight standard DCT variants, of which four are common.
The most common variant of the discrete cosine transform is the type-II DCT, which is often called simply "the DCT". This was the original DCT as first proposed by Ahmed. Its inverse, the type-III DCT, is correspondingly often called simply "the inverse DCT" or "the IDCT". Two related transforms are the discrete sine transform (DST), which is equivalent to a DFT of real and odd functions, and the modified discrete cosine transform (MDCT), which is based on a DCT of overlapping data. Multidimensional DCTs (MD DCTs) have been developed to extend the concept of the DCT to MD signals, and there are several algorithms to compute the MD DCT. A variety of fast algorithms have been developed to reduce the computational complexity of implementing the DCT. One of these is the integer DCT (IntDCT), an integer approximation of the standard DCT, used in several ISO/IEC and ITU-T international standards.
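To make the definition concrete for Pine users, below is a deliberately naive DCT-II over the last N closes, written as a direct O(N²) translation of the textbook formula; it is an illustration only, not the fast algorithm the related scripts use.
//@version=5
indicator("Naive DCT-II sketch", overlay=false)
winN = input.int(16, "Window (N)")
// Textbook DCT-II: X[k] = sum_{i=0}^{n-1} x[i] * cos(pi/n * (i + 0.5) * k)
dct2(int n) =>
    xs = array.new_float()
    for i = 0 to n - 1
        array.push(xs, close[n - 1 - i])   // oldest value first
    coeffs = array.new_float()
    for k = 0 to n - 1
        s = 0.0
        for i = 0 to n - 1
            s += array.get(xs, i) * math.cos(math.pi / n * (i + 0.5) * k)
        array.push(coeffs, s)
    coeffs
c = dct2(winN)
plot(array.get(c, 1), "First non-DC DCT-II coefficient")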
What is the Discrete Sine Transform?
In mathematics, the discrete sine transform (DST) is a Fourier-related transform similar to the discrete Fourier transform (DFT), but using a purely real matrix. It is equivalent to the imaginary parts of a DFT of roughly twice the length, operating on real data with odd symmetry (since the Fourier transform of a real and odd function is imaginary and odd), where in some variants the input and/or output data are shifted by half a sample.
A family of transforms composed of sine and sine hyperbolic functions exists. These transforms are made based on the natural vibration of thin square plates with different boundary conditions.
The DST is related to the discrete cosine transform (DCT), which is equivalent to a DFT of real and even functions. See the DCT article for a general discussion of how the boundary conditions relate the various DCT and DST types. Generally, the DST is derived from the DCT by replacing the Neumann condition at x=0 with a Dirichlet condition. Both the DCT and the DST were described by Nasir Ahmed, T. Natarajan, and K.R. Rao in 1974. The type-I DST (DST-I) was later described by Anil K. Jain in 1976, and the type-II DST (DST-II) was then described by H.B. Kekra and J.K. Solanka in 1978.
Notable settings
windowper = period for calculation, restricted to powers of 2: "16", "32", "64", "128", "256", "512", "1024", "2048". The reason for this is that the FFT is an algorithm that computes the DFT (Discrete Fourier Transform) in a fast way, generally in O(N·log2(N)) instead of O(N²). To achieve this, the input length has to be a power of 2, although many FFT implementations can handle any input size by zero-padding the data. For our purposes here, we stick to powers of 2 to keep this fast and neat (see the sketch after this list). Read more about this here: Cooley–Tukey FFT algorithm
SS = smoothing count. This smoothing happens after the first regular FCT pass; it zeros out frequencies above the SS count in the previously calculated values. The lower this number, the smoother the output; it works opposite to other smoothing periods.
Fmin1 = zeroes out frequencies that fail the minimum-value test
Fmax1 = zeroes out frequencies that fail the maximum-value test
barsback = moves the window backward
Inverse = whether or not you wish to invert the FFT after first pass calculation
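The sketch below illustrates the SS-style masking described in the settings above: coefficients with index above the SS count are zeroed before the inverse pass, and the Fmin1/Fmax1 tests act as additional masks of the same kind. This is a hypothetical illustration, not the indicator's own code.
//@version=5
indicator("Spectral masking sketch", overlay=false)
// Zero every coefficient whose index is above the SS count; the inverse transform of the
// remaining low-order coefficients then yields a smoother reconstruction.
maskAboveSS(float[] coeffs, int ss) =>
    masked = array.copy(coeffs)
    for i = 0 to array.size(masked) - 1
        if i > ss
            array.set(masked, i, 0.0)
    masked
// Toy example: 8 coefficients, keep only indices 0..2.
var demo = array.from(4.0, 2.5, -1.0, 0.7, 0.3, -0.2, 0.1, 0.05)
kept = maskAboveSS(demo, 2)
plot(array.get(kept, 1), "A kept coefficient")
plot(array.get(kept, 5), "A zeroed coefficient")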
Related indicators
Real-Fast Fourier Transform of Price Oscillator
STD-Stepped Fast Cosine Transform Moving Average
Real-Fast Fourier Transform of Price w/ Linear Regression
Variety RSI of Fast Discrete Cosine Transform
Additional reading
A Fast Computational Algorithm for the Discrete Cosine Transform by Chen et al.
Practical Fast 1-D DCT Algorithms With 11 Multiplications by Loeffler et al.
Cooley–Tukey FFT algorithm
Ahmed, Nasir (January 1991). "How I Came Up With the Discrete Cosine Transform". Digital Signal Processing. 1 (1): 4–5. doi:10.1016/1051-2004(91)90086-Z.
DCT-History - How I Came Up With The Discrete Cosine Transform
Comparative Analysis for Discrete Sine Transform as a suitable method for noise estimation
Helme-Nikias Weighted Burg AR-SE Extra. of Price [Loxx]
Helme-Nikias Weighted Burg AR-SE Extra. of Price is an indicator that uses an autoregressive spectral estimation method called the Weighted Burg Algorithm, but unlike the usual WB algo, this one uses Helme-Nikias weighting. This method is commonly used in speech modeling and speech-prediction engines. It is a linear method of forecasting data. You'll notice that this method uses a different weighting calculation vs. the Weighted Burg method. The new weighting is the following:
w = math.pow(array.get(x, i - 1), 2), the squared lag of the source parameter
and
w += math.pow(array.get(x, i), 2), the sum of the squared source parameter
This takes the place of the rectangular, Hamming, and parabolic weighting used in the Weighted Burg method.
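Taken together, the two lines above form a running weight from the squared previous and current source values. A minimal sketch of that weight loop follows; the surrounding Burg recursion (reflection coefficients and the Levinson-Durbin update) is omitted, and the rolling-array handling here is illustrative rather than the script's own.
//@version=5
indicator("Helme-Nikias weight sketch", overlay=false)
n = input.int(50, "PastBars")
var x = array.new_float()
if array.size(x) >= n
    array.shift(x)
array.push(x, close)
weightAt(float[] src, int i) =>
    w = math.pow(array.get(src, i - 1), 2)   // squared previous source value
    w += math.pow(array.get(src, i), 2)      // plus squared current source value
    w
lastW = float(na)
if array.size(x) > 1
    lastW := weightAt(x, array.size(x) - 1)
plot(lastW, "Weight of the most recent pair")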
Also, this method includes the Levinson–Durbin algorithm, which was already discussed in the following indicator:
Levinson-Durbin Autocorrelation Extrapolation of Price
What is Helme-Nikias Weighted Burg Autoregressive Spectral Estimate Extrapolation of price?
In this paper a new stable modification of the weighted Burg technique for autoregressive (AR) spectral estimation is introduced, based on data-adaptive weights that are proportional to the common power of the forward and backward AR process realizations. It is shown that AR spectra of short-length sinusoidal signals generated by the new approach do not exhibit phase dependence or line-splitting. Further, it is demonstrated that improvements in resolution may thus be obtained relative to other weighted Burg algorithms. The method suggested here is shown to resolve two closely spaced peaks of dynamic range 24 dB, whereas the modified Burg schemes employing rectangular, Hamming or "optimum" parabolic windows fail.
Data inputs
Source Settings: Loxx's Expanded Source Types. You typically use "open", since the open price is already fixed on the current active bar.
LastBar - bar where to start the prediction
PastBars - how many bars back to model
LPOrder - order of linear prediction model; 0 to 1
FutBars - how many bars you want to forward predict
Things to know
Normally, a simple moving average is calculated on the source data. I've expanded this to 38 different averaging methods using Loxx's Moving Averages.
This indicator repaints
Further reading
A high-resolution modified Burg algorithm for spectral estimation
Related Indicators
Levinson-Durbin Autocorrelation Extrapolation of Price
Weighted Burg AR Spectral Estimate Extrapolation of Price
MIDAS VWAP Jayy
This is just a bash together of two MIDAS VWAP scripts, particularly those by AkifTokuz and drshoe.
I added the ability to show more MIDAS curves from the same script.
The algorithm primarily uses the "n" number but the date can be used for the 8th VWAP
I have not converted the script to version 3.
To find the bar number, go into "Chart Properties", select "Background", then select "Indicator Titles" and "Indicator Values". When you place your cursor over a bar, the first number you see adjacent to the script title is the bar number. Put that number in the dialogue box. The midline is the MIDAS VWAP. The resistance is a MIDAS VWAP using bar highs. The support is a MIDAS VWAP using bar lows.
In most cases using N will suffice. However, if you are flipping around charts, inputting a specific date can be handy. In this way, you can compare the same point in time across multiple instruments, e.g. the first trading day of the year or an election date.
Adding dates into the dialogue box is a bit cumbersome so in this version, it is enabled for only one curve. I have called it VWAP and it follows the typical VWAP algorithm. (Does that make a difference? Read below re my opinion on the Difference between MIDAS VWAP and VWAP ).
I have added the ability to start from the bottom or top of the initiating bar.
In theory, in a probable uptrend, pick the low of a bar as a low pivot and start the MIDAS VWAP there using the support curve.
For a downtrend, use the high pivot bar and select resistance. The best way to see this is to play with these values.
Difference between MIDAS VWAP and the regular VWAP
MIDAS itself as described by Levine uses a time anchored On-Balance Volume (OBV) plotted on a graph where the horizontal (abscissa) arm of the graph is cumulative volume not time. He called his VWAP curves Support/Resistance VWAP or S/R curves. These S/R curves are often referred to as "MIDAS curves".
These are the main components of the MIDAS chart. A third algorithm called the Top-Bottom Finder was also described. (Separate script).
Additional tools have been described in "MIDAS_Technical_Analysis"
Midas Technical Analysis: A VWAP Approach to Trading and Investing in Today’s Markets by Andrew Coles, David G. Hawkins
Copyright © 2011 by Andrew Coles and David G. Hawkins.
This denotes the different way in which Levine approached the calculation.
The difference between "MIDAS" VWAP and VWAP is, in my opinion, much ado about nothing. The algorithms generate identical curves albeit the MIDAS algorithm launches the curve one bar later than the VWAP algorithm which can be a pain in the neck. All of the algorithms that I looked at on Tradingview step back one bar in time to initiate the MIDAS curve. As such the plotted curves are identical to traditional VWAP assuming the initiation is from the candle/bar midpoint.
How did Levine intend the curves to be drawn?
On a reversal, he suggested that the Support and Resistance VWAP (S/R curve) be initiated after the reversal.
It is clear in his examples this happens occasionally but in many cases he initiates the so-called MIDAS S/R VWAP right at the reversal point. In any case, the algorithm is problematic if you wish to start a curve on the first bar of an IPO .
You will get nothing. That is a pain. Also, in Levine's writings, he describes simply clicking on the point from which an S/R VWAP is to be drawn. As such, the generally accepted method of initiating the curve at N-1 is a practical and sensible one. The only issue is that you cannot draw the curve from the first bar on any security, as mentioned, without resorting to the typical VWAP algorithm. There is another difference: VWAP is launched from the middle of the bar (as per AlphaTrends), but you can also launch from the top of the bar or the bottom (or anywhere, for that matter). The calculation then proceeds using the top or bottom for each new bar.
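For readers who want to see the mechanics, a bare-bones anchored VWAP of the kind discussed above takes only a few lines; the anchor bar and the choice of bar top/middle/bottom as the source are the only real decisions. This is a generic sketch, not the script being described.
//@version=5
indicator("Anchored VWAP sketch", overlay=true)
anchorBar = input.int(5000, "Anchor bar index (N)")
src       = input.string("Middle", "Launch from", options=["Middle", "Top", "Bottom"])
price = src == "Top" ? high : src == "Bottom" ? low : hl2
// Cumulative price*volume and volume from the anchor bar onward.
var float cumPV = na
var float cumV  = na
if bar_index == anchorBar
    cumPV := price * volume
    cumV  := volume
else if bar_index > anchorBar
    cumPV += price * volume
    cumV  += volume
anchoredVwap = cumV > 0 ? cumPV / cumV : na
plot(anchoredVwap, "Anchored VWAP", color=color.orange, linewidth=2)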
The potential applications are discussed in the MIDAS Technical Analysis book.
TA
█ TA Library
📊 OVERVIEW
TA is a Pine Script technical analysis library. This library provides 25+ moving averages and smoothing filters, from classic SMA/EMA to Kalman Filters and adaptive algorithms, implemented based on academic research.
🎯 Core Features
Academic Based - Algorithms follow original papers and formulas
Performance Optimized - Pre-calculated constants for faster response
Unified Interface - Consistent function design
Research Based - Integrates technical analysis research
🎯 CONCEPTS
Library Design Philosophy
This technical analysis library focuses on providing:
Academic Foundation
Algorithms based on published research papers and academic standards
Implementations that follow original mathematical formulations
Clear documentation with research references
Developer Experience
Unified interface design for consistent usage patterns
Pre-calculated constants for optimal performance
Comprehensive function collection to reduce development time
Single import statement for immediate access to all functions
Each indicator encapsulated as a simple function call - one line of code simplifies complexity
Technical Excellence
25+ carefully implemented moving averages and filters
Support for advanced algorithms like Kalman Filter and MAMA/FAMA
Optimized code structure for maintainability and reliability
Regular updates incorporating latest research developments
🚀 USING THIS LIBRARY
Import Library
//@version=6
import DCAUT/TA/1 as dta
indicator("Advanced Technical Analysis", overlay=true)
Basic Usage Example
// Classic moving average combination
ema20 = ta.ema(close, 20)
kama20 = dta.kama(close, 20)
plot(ema20, "EMA20", color.red, 2)
plot(kama20, "KAMA20", color.green, 2)
Advanced Trading System
// Adaptive moving average system
kama = dta.kama(close, 20, 2, 30)
[mamaValue, famaValue] = dta.mamaFama(close, 0.5, 0.05)
// Trend confirmation and entry signals
bullTrend = kama > kama[1] and mamaValue > famaValue
bearTrend = kama < kama[1] and mamaValue < famaValue
longSignal = ta.crossover(close, kama) and bullTrend
shortSignal = ta.crossunder(close, kama) and bearTrend
plot(kama, "KAMA", color.blue, 3)
plot(mamaValue, "MAMA", color.orange, 2)
plot(famaValue, "FAMA", color.purple, 2)
plotshape(longSignal, "Buy", shape.triangleup, location.belowbar, color.green)
plotshape(shortSignal, "Sell", shape.triangledown, location.abovebar, color.red)
📋 FUNCTIONS REFERENCE
ewma(source, alpha)
Calculates the Exponentially Weighted Moving Average with dynamic alpha parameter.
Parameters:
source (series float) : Series of values to process.
alpha (series float) : The smoothing parameter of the filter.
Returns: (float) The exponentially weighted moving average value.
dema(source, length)
Calculates the Double Exponential Moving Average (DEMA) of a given data series.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
Returns: (float) The calculated Double Exponential Moving Average value.
tema(source, length)
Calculates the Triple Exponential Moving Average (TEMA) of a given data series.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
Returns: (float) The calculated Triple Exponential Moving Average value.
zlema(source, length)
Calculates the Zero-Lag Exponential Moving Average (ZLEMA) of a given data series. This indicator attempts to eliminate the lag inherent in all moving averages.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
Returns: (float) The calculated Zero-Lag Exponential Moving Average value.
tma(source, length)
Calculates the Triangular Moving Average (TMA) of a given data series. TMA is a double-smoothed simple moving average that reduces noise.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
Returns: (float) The calculated Triangular Moving Average value.
frama(source, length)
Calculates the Fractal Adaptive Moving Average (FRAMA) of a given data series. FRAMA adapts its smoothing factor based on fractal geometry to reduce lag. Developed by John Ehlers.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
Returns: (float) The calculated Fractal Adaptive Moving Average value.
kama(source, length, fastLength, slowLength)
Calculates Kaufman's Adaptive Moving Average (KAMA) of a given data series. KAMA adjusts its smoothing based on market efficiency ratio. Developed by Perry J. Kaufman.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the efficiency calculation.
fastLength (simple int) : Fast EMA length for smoothing calculation. Optional. Default is 2.
slowLength (simple int) : Slow EMA length for smoothing calculation. Optional. Default is 30.
Returns: (float) The calculated Kaufman's Adaptive Moving Average value.
t3(source, length, volumeFactor)
Calculates the Tilson Moving Average (T3) of a given data series. T3 is a triple-smoothed exponential moving average with improved lag characteristics. Developed by Tim Tillson.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
volumeFactor (simple float) : Volume factor affecting responsiveness. Optional. Default is 0.7.
Returns: (float) The calculated Tilson Moving Average value.
ultimateSmoother(source, length)
Calculates the Ultimate Smoother of a given data series. Uses advanced filtering techniques to reduce noise while maintaining responsiveness. Based on digital signal processing principles by John Ehlers.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the smoothing calculation.
Returns: (float) The calculated Ultimate Smoother value.
kalmanFilter(source, processNoise, measurementNoise)
Calculates the Kalman Filter of a given data series. Optimal estimation algorithm that estimates true value from noisy observations. Based on the Kalman Filter algorithm developed by Rudolf Kalman (1960).
Parameters:
source (series float) : Series of values to process.
processNoise (simple float) : Process noise variance (Q). Controls adaptation speed. Optional. Default is 0.05.
measurementNoise (simple float) : Measurement noise variance (R). Controls smoothing. Optional. Default is 1.0.
Returns: (float) The calculated Kalman Filter value.
mcginleyDynamic(source, length)
Calculates the McGinley Dynamic of a given data series. McGinley Dynamic is an adaptive moving average that adjusts to market speed changes. Developed by John R. McGinley Jr.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the dynamic calculation.
Returns: (float) The calculated McGinley Dynamic value.
mama(source, fastLimit, slowLimit)
Calculates the Mesa Adaptive Moving Average (MAMA) of a given data series. MAMA uses Hilbert Transform Discriminator to adapt to market cycles dynamically. Developed by John F. Ehlers.
Parameters:
source (series float) : Series of values to process.
fastLimit (simple float) : Maximum alpha (responsiveness). Optional. Default is 0.5.
slowLimit (simple float) : Minimum alpha (smoothing). Optional. Default is 0.05.
Returns: (float) The calculated Mesa Adaptive Moving Average value.
fama(source, fastLimit, slowLimit)
Calculates the Following Adaptive Moving Average (FAMA) of a given data series. FAMA follows MAMA with reduced responsiveness for crossover signals. Developed by John F. Ehlers.
Parameters:
source (series float) : Series of values to process.
fastLimit (simple float) : Maximum alpha (responsiveness). Optional. Default is 0.5.
slowLimit (simple float) : Minimum alpha (smoothing). Optional. Default is 0.05.
Returns: (float) The calculated Following Adaptive Moving Average value.
mamaFama(source, fastLimit, slowLimit)
Calculates Mesa Adaptive Moving Average (MAMA) and Following Adaptive Moving Average (FAMA).
Parameters:
source (series float) : Series of values to process.
fastLimit (simple float) : Maximum alpha (responsiveness). Optional. Default is 0.5.
slowLimit (simple float) : Minimum alpha (smoothing). Optional. Default is 0.05.
Returns: ([float, float]) Tuple containing the MAMA and FAMA values.
laguerreFilter(source, length, gamma, order)
Calculates the standard N-order Laguerre Filter of a given data series. Standard Laguerre Filter uses uniform weighting across all polynomial terms. Developed by John F. Ehlers.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Length for UltimateSmoother preprocessing.
gamma (simple float) : Feedback coefficient (0-1). Lower values reduce lag. Optional. Default is 0.8.
order (simple int) : The order of the Laguerre filter (1-10). Higher order increases lag. Optional. Default is 8.
Returns: (float) The calculated standard Laguerre Filter value.
laguerreBinomialFilter(source, length, gamma)
Calculates the Laguerre Binomial Filter of a given data series. Uses 6-pole feedback with binomial weighting coefficients. Developed by John F. Ehlers.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Length for UltimateSmoother preprocessing.
gamma (simple float) : Feedback coefficient (0-1). Lower values reduce lag. Optional. Default is 0.5.
Returns: (float) The calculated Laguerre Binomial Filter value.
superSmoother(source, length)
Calculates the Super Smoother of a given data series. SuperSmoother is a second-order Butterworth filter from aerospace technology. Developed by John F. Ehlers.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Period for the filter calculation.
Returns: (float) The calculated Super Smoother value.
rangeFilter(source, length, multiplier)
Calculates the Range Filter of a given data series. Range Filter reduces noise by filtering price movements within a dynamic range.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the average range calculation.
multiplier (simple float) : Multiplier for the smooth range. Higher values increase filtering. Optional. Default is 2.618.
Returns: ( ) Tuple containing filtered value, trend direction, upper band, and lower band.
qqe(source, rsiLength, rsiSmooth, qqeFactor)
Calculates the Quantitative Qualitative Estimation (QQE) of a given data series. QQE is an improved RSI that reduces noise and provides smoother signals. Developed by Igor Livshin.
Parameters:
source (series float) : Series of values to process.
rsiLength (simple int) : Number of bars for the RSI calculation. Optional. Default is 14.
rsiSmooth (simple int) : Number of bars for smoothing the RSI. Optional. Default is 5.
qqeFactor (simple float) : QQE factor for volatility band width. Optional. Default is 4.236.
Returns: ([float, float]) Tuple containing the smoothed RSI and the QQE trend line.
sslChannel(source, length)
Calculates the Semaphore Signal Level (SSL) Channel of a given data series. SSL Channel provides clear trend signals using moving averages of high and low prices.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
Returns: ([float, float]) Tuple containing the SSL Up and SSL Down lines.
ma(source, length, maType)
Calculates a Moving Average based on the specified type. Universal interface supporting all moving average algorithms.
Parameters:
source (series float) : Series of values to process.
length (simple int) : Number of bars for the moving average calculation.
maType (simple MaType) : Type of moving average to calculate. Optional. Default is SMA.
Returns: (float) The calculated moving average value based on the specified type.
atr(length, maType)
Calculates the Average True Range (ATR) using the specified moving average type. Developed by J. Welles Wilder Jr.
Parameters:
length (simple int) : Number of bars for the ATR calculation.
maType (simple MaType) : Type of moving average to use for smoothing. Optional. Default is RMA.
Returns: (float) The calculated Average True Range value.
macd(source, fastLength, slowLength, signalLength, maType, signalMaType)
Calculates the Moving Average Convergence Divergence (MACD) with customizable MA types. Developed by Gerald Appel.
Parameters:
source (series float) : Series of values to process.
fastLength (simple int) : Period for the fast moving average.
slowLength (simple int) : Period for the slow moving average.
signalLength (simple int) : Period for the signal line moving average.
maType (simple MaType) : Type of moving average for main MACD calculation. Optional. Default is EMA.
signalMaType (simple MaType) : Type of moving average for signal line calculation. Optional. Default is EMA.
Returns: ([float, float, float]) Tuple containing the MACD line, signal line, and histogram values.
dmao(source, fastLength, slowLength, maType)
Calculates the Dual Moving Average Oscillator (DMAO) of a given data series. Uses the same algorithm as the Percentage Price Oscillator (PPO), but can be applied to any data series.
Parameters:
source (series float) : Series of values to process.
fastLength (simple int) : Period for the fast moving average.
slowLength (simple int) : Period for the slow moving average.
maType (simple MaType) : Type of moving average to use for both calculations. Optional. Default is EMA.
Returns: (float) The calculated Dual Moving Average Oscillator value as a percentage.
continuationIndex(source, length, gamma, order)
Calculates the Continuation Index of a given data series. The index represents the Inverse Fisher Transform of the normalized difference between an UltimateSmoother and an N-order Laguerre filter. Developed by John F. Ehlers, published in TASC 2025.09.
Parameters:
source (series float) : Series of values to process.
length (simple int) : The calculation length.
gamma (simple float) : Controls the phase response of the Laguerre filter. Optional. Default is 0.8.
order (simple int) : The order of the Laguerre filter (1-10). Optional. Default is 8.
Returns: (float) The calculated Continuation Index value.
📚 RELEASE NOTES
v1.0 (2025.09.24)
✅ 25+ technical analysis functions
✅ Complete adaptive moving average series (KAMA, FRAMA, MAMA/FAMA)
✅ Advanced signal processing filters (Kalman, Laguerre, SuperSmoother, UltimateSmoother)
✅ Performance optimized with pre-calculated constants and efficient algorithms
✅ Unified function interface design following TradingView best practices
✅ Comprehensive moving average collection (DEMA, TEMA, ZLEMA, T3, etc.)
✅ Volatility and trend detection tools (QQE, SSL Channel, Range Filter)
✅ Continuation Index - Latest research from TASC 2025.09
✅ MACD and ATR calculations supporting multiple moving average types
✅ Dual Moving Average Oscillator (DMAO) for arbitrary data series analysis
Multi-Timeframe Wolfe Wave Strategy
This invite-only strategy implements an advanced multi-timeframe Wolfe Wave pattern recognition system specifically designed for institutional-grade algorithmic trading environments.
**Core Mathematical Framework:**
The strategy employs sophisticated mathematical calculations across 10 distinct timeframes (377, 233, 144, 89, 55, 34, 21, 13, 8, 5 periods), utilizing Elliott Wave ratio theory combined with proprietary algorithmic enhancements. Unlike standard Wolfe Wave implementations that rely on visual pattern recognition, this system uses quantitative analysis to identify precise entry and exit points.
**Technical Implementation:**
• **Pattern Detection Algorithm:** Calculates price relationships using configurable ratio sets including Fibonacci sequences, Elliott Wave ratios, Golden Ratio, Harmonic Patterns, Pi-based calculations, and custom mathematical progressions
• **Multi-Timeframe Confluence:** Simultaneously analyzes patterns across all timeframes to ensure signal reliability and reduce false positives
• **Dynamic Target Calculation:** Employs advanced mathematical modeling to project optimal profit targets based on historical price behavior and pattern completion theory
• **Risk Management Engine:** Implements position-based stop losses calculated as percentages of target profits, with liquidation price monitoring for leveraged positions
**Originality and Innovation:**
This implementation differs significantly from traditional Wolfe Wave indicators through several key innovations:
1. **Algorithmic Pattern Validation:** Uses mathematical confirmation across multiple timeframes rather than subjective visual analysis
2. **Adaptive Ratio Selection:** Offers 24 different ratio calculation methods, allowing optimization for various market conditions
3. **Institutional Integration:** Features comprehensive webhook messaging for automated execution via external trading systems
4. **Advanced Position Management:** Includes sophisticated position sizing controls with maximum concurrent position limits
**Strategy Logic:**
For bullish conditions, the algorithm identifies when price action meets specific mathematical criteria:
- Point validation through ratio analysis between swing highs/lows
- Confluence confirmation across multiple timeframes
- Minimum profit threshold filtering to ensure trade quality
- Dynamic stop-loss positioning based on pattern geometry
The mathematical approach uses proprietary calculations that extend beyond traditional Fibonacci levels, incorporating elements from chaos theory, fractal geometry, and advanced statistical analysis.
**Risk Management Features:**
• Configurable stop-loss percentages relative to profit targets
• Maximum position limits to control portfolio exposure
• Liquidation price monitoring for margin trading
• Time-based filtering options for market session control
• Minimum profit threshold settings to filter low-quality signals
**Intended Markets and Conditions:**
Optimized for cryptocurrency markets with high volatility and sufficient liquidity. Works effectively in trending and ranging market conditions due to its multi-timeframe approach. Best suited for assets with clear swing structure and adequate price movement.
**Performance Characteristics:**
The strategy is designed for active trading with frequent position entries across multiple timeframes. Position holding periods vary from short-term scalping to medium-term swing trading depending on pattern completion timeframes.
**Technical Requirements:**
Requires understanding of advanced pattern recognition theory, risk management principles, and algorithmic trading concepts. Users should be familiar with Wolfe Wave methodology and Elliott Wave theory fundamentals.
Trend Following Strategy with KNN
### 1. Strategy Features
This strategy combines the K-Nearest Neighbors (KNN) algorithm with a trend-following strategy to predict future price movements by analyzing historical price data. Here are the main features of the strategy:
1. **Dynamic Parameter Adjustment**: Uses the KNN algorithm to dynamically adjust parameters of the trend-following strategy, such as moving average length and channel length, to adapt to market changes.
2. **Trend Following**: Captures market trends using moving averages and price channels to generate buy and sell signals.
3. **Multi-Factor Analysis**: Combines the KNN algorithm with moving averages to comprehensively analyze the impact of multiple factors, improving the accuracy of trading signals.
4. **High Adaptability**: Automatically adjusts parameters using the KNN algorithm, allowing the strategy to adapt to different market environments and asset types.
### 2. Simple Introduction to the KNN Algorithm
The K-Nearest Neighbors (KNN) algorithm is a simple and intuitive machine learning algorithm primarily used for classification and regression problems. Here are the basic concepts of the KNN algorithm:
1. **Non-Parametric Model**: KNN is a non-parametric algorithm, meaning it does not make any assumptions about the data distribution. Instead, it directly uses training data for predictions.
2. **Instance-Based Learning**: KNN is an instance-based learning method that uses training data directly for predictions, rather than generating a model through a training process.
3. **Distance Metrics**: The core of the KNN algorithm is calculating the distance between data points. Common distance metrics include Euclidean distance, Manhattan distance, and Minkowski distance.
4. **Neighbor Selection**: For each test data point, the KNN algorithm finds the K nearest neighbors in the training dataset.
5. **Classification and Regression**: In classification problems, KNN determines the class of a test data point through a voting mechanism. In regression problems, KNN predicts the value of a test data point by calculating the average of the K nearest neighbors.
### 3. Applications of the KNN Algorithm in Quantitative Trading Strategies
The KNN algorithm can be applied to various quantitative trading strategies. Here are some common use cases:
1. **Trend-Following Strategies**: KNN can be used to identify market trends, helping traders capture the beginning and end of trends.
2. **Mean Reversion Strategies**: In mean reversion strategies, KNN can be used to identify price deviations from the mean.
3. **Arbitrage Strategies**: In arbitrage strategies, KNN can be used to identify price discrepancies between different markets or assets.
4. **High-Frequency Trading Strategies**: In high-frequency trading strategies, KNN can be used to quickly identify market anomalies, such as price spikes or volume anomalies.
5. **Event-Driven Strategies**: In event-driven strategies, KNN can be used to identify the impact of market events.
6. **Multi-Factor Strategies**: In multi-factor strategies, KNN can be used to comprehensively analyze the impact of multiple factors.
### 4. Final Considerations
1. **Computational Efficiency**: The KNN algorithm may face computational efficiency issues with large datasets, especially in real-time trading. Optimize the code to reduce access to historical data and improve computational efficiency.
2. **Parameter Selection**: The choice of K value significantly affects the performance of the KNN algorithm. Use cross-validation or other methods to select the optimal K value.
3. **Data Standardization**: KNN is sensitive to data standardization and feature selection. Standardize the data to ensure equal weighting of different features.
4. **Noisy Data**: KNN is sensitive to noisy data, which can lead to overfitting. Preprocess the data to remove noise.
5. **Market Environment**: The effectiveness of the KNN algorithm may be influenced by market conditions. Combine it with other technical indicators and fundamental analysis to enhance the robustness of the strategy.
Ocs Ai Trader
This script performs predictive analytics from a virtual trader's perspective!
It acts as an AI Trade Assistant that helps you decide the optimal times to buy or sell securities, providing you with precise target prices and stop-loss levels to optimise your gains and manage risk effectively.
System Components
The trading system is built on 4 fundamental layers :
Time series Processing layer
Signal Processing layer
Machine Learning
Virtual Trade Emulator
Time series Processing layer
This is the first component, responsible for handling and processing real-time and historical time-series data.
In this layer, signals are extracted from:
averages, such as the volume-price mean and the adaptive moving average
estimates, such as relative strength and stochastic estimates on Supertrend
Signal Processing layer
This second layer processes signals from the previous layer using a sensitivity filter comprising a Probability Distribution Confidence Filter.
The main purpose here is to predict the trend of the underlying by converging price, volume signals, and deltas over a dominant cycle as dimensions, and to generate action signals.
Key terms
Dominant cycle is a time cycle that has a greater influence on the overall behaviour of a system than other cycles.
The system uses Ehlers' method to calculate the Dominant Cycle/Period.
The dominant cycle is used to determine the influencing period for the underlying.
Once the dominant cycle/period is identified, it is treated as a dynamic length for further calculations.
Predictive Adaptive Filter to generate Signals and define Targets and Stops
An adaptive filter is a system with a linear filter that has a transfer function controlled by variable parameters and a means to adjust those parameters according to an optimisation algorithm. Because of the complexity of the optimisation algorithms, almost all adaptive filters are digital filters. This helps us classify our intent as either long side or short side.
The indicator uses an adaptive least mean squares (LMS) algorithm to converge the filtered signals into a category of intent (either buy or sell).
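The least-mean-squares idea itself is compact. A generic one-step LMS predictor, shown below purely as an illustration and not as the author's filter, nudges a set of weights in the direction that reduces the squared prediction error and reads the sign of the prediction as long/short intent; the normalization and tap count are assumptions.
//@version=5
indicator("LMS adaptive predictor sketch", overlay=false)
taps = 4
mu   = input.float(0.05, "Adaptation rate (mu)", step=0.01)
var w = array.new_float(taps, 0.0)
// normalized input so the step size behaves similarly across symbols
x = (close - ta.sma(close, 100)) / (ta.stdev(close, 100) + 1e-10)
// prediction of the current value from the previous `taps` normalized closes
pred = 0.0
for i = 0 to taps - 1
    pred += array.get(w, i) * x[i + 1]
// LMS update: move each weight in the direction that reduces the squared error
err = x - pred
for i = 0 to taps - 1
    array.set(w, i, array.get(w, i) + mu * err * x[i + 1])
plot(pred, "LMS prediction (sign = long/short intent)", pred > 0 ? color.green : color.red)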
Machine Learning
The third layer of the system performs classification using KNN. K-Nearest Neighbour is one of the simplest Machine Learning algorithms, based on the supervised learning technique.
K-NN algorithm assumes the similarity between the new case/data and available cases and put the new case into the category that is most similar to the available categories.
The K-NN algorithm stores all the available data and classifies a new data point based on similarity. This means that when new data appears, it can easily be assigned to a well-suited category using the K-NN algorithm. K-NN can be used for regression as well as classification, but it is mostly used for classification problems.
Virtual Trade Emulator
In this fourth and final layer, a trade assistant is coded using trade-emulation techniques, and the lines and labels for Buy/Sell signals, targets, and stops are forecast!
How to use
The system generates Buy and Sell alerts and plots it on charts
Buy signal
Buy signal constitutes of three targets {namely T1, T2, T3} and one stop level
Sell signal
Sell signal constitutes of three targets {namely T1, T2, T3} and one stop level
What Securities will it work upon ?
Volume information must be present for the security the indicator is applied to.
The indicator works on every liquid security: stocks, futures, forex, crypto, options, commodities.
What TimeFrames To Use ?
You can use any Timeframe, The indicator is Adaptive in Nature,
I personally use timeframes such as 1m, 5m, 10m, 15m, ..., 1D, 1W.
This Script Uses Tradingview Premium features for working on lower timeframes
In case if you are not a Tradingview premium subscriber you should tell the script that after applying on chart, this can be done by going to settings and unchecking "Is your Tradingview Subscription Premium or Above " Option
How To Get Access ?
You will need to privately message me for access mentioning you want access to "Ocs Ai Trader" Use comment box only for constructive comments. Thanks !
Bist Manipulation [Projeadam]
OVERVIEW | GENEL BAKIŞ
ENG: Indicator that detects manipulation candles according to changing market conditions.
TR: Değişen piyasa koşullarına göre manipülasyon mumlarını tespit eden gösterge.
ENG: IMPORTANT NOTE: This indicator works in BIST Market and only in Future Parities.
Example ->> PETKM1! --SASA1!
TR: ÖNEMLİ NOT: Bu indikatör BİST Piyasasında ve sadece Future Paritelerde Çalışır.
Örnek- >> PETKM1! -- SASA1!
ENG: Market makers manipulate the market because most people who trade on the stock exchange act with their emotions and are forced to close the transaction at a loss.
TR: Piyasada market yapıcı oluşumlar manipülasyon yaparlar çünkü borsada işlem alan insanların birçoğu duygularıyla hareket eder ve zararla işlem kapatılmaya zorlanır.
ENG: If we detect manipulation candles in the market, we can control our fragile psychology and close our transactions in profit by trading with market-making formations in these areas.
TR: Marketde manipülasyon mumlarını tespit edersek kırılgan psikolojimizi kontrol edebilir ve bu alanlardan market yapıcı oluşumlarla beraber işlem alarak işlemlerimizi karda kapatabiliriz.
ALGORITHM | ALGORİTMA
ENG: With this indicator, you can detect manipulation candles on the BIST exchange, using an algorithm I created from volumetric data and the wicks created by price.
When there is excessive volatility in the price movement, the algorithm in this indicator notices it and calculates a manipulation value by dividing it by the volatility of past price movements.
TR: Bu indikatör yardımıyla hacimsel veriler ve fiyatın oluşturduğu fitillerden yararlanarak oluşturduğum algoritma yardımıyla siz de BİST borsasında manipülasyon mumlarını tespit edebilirsiniz.
Fiyat hareketinde aşırı derece oynaklık olduğunda bu indikatördeki algoritma bu fiyat oynaklığını fark eder ve geçmiş fiyat hareketlerindeki oylanklık degerine bölerek bize bir manipülasyon degeri hesaplar.
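One possible way to express the idea described above, offered purely as a hypothetical sketch since the published formula is not shown, is to relate each bar's wick-and-volume activity to its own recent average and draw a band over the resulting series.
//@version=5
indicator("Manipulation-value sketch", overlay=false)
len  = input.int(50, "Lookback")
mult = input.float(3.3, "Multiplier")
wick     = (high - low) - math.abs(close - open)        // total wick length of the bar
activity = wick * volume
manip    = activity / (ta.sma(activity, len) + 1e-10)   // relative to its own recent past
band     = ta.sma(manip, len) + mult * ta.stdev(manip, len)
colColor = manip > band ? color.red : manip > ta.sma(manip, len) ? color.yellow : color.green
plot(manip, "Manipulation value", color=colColor, style=plot.style_columns)
plot(band, "Band", color=color.gray)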
How does the indicator work? | Gösterge nasıl çalışır?
ENG: The manipulation candle does not give us information about the direction of the price movement; it is only used as an auxiliary indicator.
TR: Manipülasyon mumu bize fiyat hareketinin yönü hakkında bilgi vermez sadece yardımcı bir gösterge olarak kullanılır.
ENG: We display our manipulation values as columns. We draw a channel over these values, and we understand that there is manipulation in the candles whose values rise above this channel.
TR: Manipülasyon degerlerimiz kolonlar şeklinde gösteriyoruz. Gösterdiğimiz değerlerimizin üzerine bir kanal çizdiriyoruz ve bu kanalın üzerinde kalan değerlerimizdeki mumda manipülasyon yapıldığını anlıyoruz.
ENG: The indicator shows the manipulation value in the form of columns. A manipulation value that goes outside the channel we have defined is colored red, a value within the channel is colored yellow, and a value below the channel is colored green. Red columns indicate manipulation candles.
TR: İndikatör manipülasyon degerini kolonlar şeklinde gösteriyor. Bizim belirlediğimiz kanal dışına çıkan manipülasyon degerimiz kırmızı, kanal içerisinde sarı, kanal altında yeşil olarak renklendiriliyor. Kırmızı kolonlar manipülasyon olan mumları göstermektedir.
Example | Örnek
ENG: In the example above, we see a manipulation candle that clears the price gaps: the market maker clears the orders resting in the price gaps below in order to move the price up.
TR: Yukarıdaki örneğimizde oluşan fiyat boşluklarını temizleyen bir manipülasyon mumu görmekteyiz, alt kısımdaki fiyat boşluklarındaki emirleri temizleyen market maker fiyatı yukarı taşımak için buradaki emirleri temizliyor.
SETTINGS PANEL | AYARLAR PANELİ
ENG: We have only one setting in this indicator.
TR: Bu indikatörde tek ayarımız vardır.
ENG: Our multiplier value determines the width of the band formed above our manipulation value. In the chart above, our multiplier value is 3.3. If we reduce our multiplier value, our manipulation sensitivity will decrease, as many more candles will end up above the band.
TR: Çarpan değerimiz manipülasyon değerimizin üstünde oluşşan band değerinin genişliğini belirlemektedir.Yukarıdaki grafikte çarpan değerimiz 3.3, Eğer çarpan değerimizi azaltırsak band üstünde çok daha fazla mum olacağı için manipülasyon hassasiyetimiz azalacaktır.
ENG: When we set our multiplier value to 2.3, we get a more sensitive manipulation value, and it gives signals on more candles.
TR: Çarpan değerimizi 2.3 yapınca daha hassas manipülasyon derimiz oluyor ve daha fazla mumda sinyal veriyor.
If you have any ideas on what to add to my work, additional sources to include, or ways to make the calculations cooler, suggest them in a DM.
Fusion: Machine Learning Suite
The Fusion: Machine Learning Suite combines multiple technical analysis dimensions and harnesses the predictive power of machine learning, seamlessly integrating a diverse array of classic and novel indicators to deliver precision, adaptability, and innovation.
Features and Capabilities
Multidimensional Analysis: Fusion: MLS integrates various technical analysis dimensions to offer a more comprehensive perspective.
Machine Learning Integration: Utilizing ML algorithms, Fusion: MLS offers adaptability to market changes.
Custom Indicators: Including dimensions like "Moon Lander", "Cap Line", and "Z-Pack", the indicator expands the scope of traditional technical analysis methods.
Tailored Customization: With customization options, Fusion: MLS allows traders to configure the tool to suit their specific strategies and market focus.
In the following sections, we'll explore the features and settings of Fusion: MLS in detail, providing insights into how it can be utilized.
Major Features and Settings
The indicator consists of several core components and settings, each designed to provide specific functionalities and insights. Here's an in-depth look:
Machine Learning Component
Distance Classifier: A Strategic Approach to Market Analysis
In the world of trading and investment, the ability to classify and predict price movements is paramount. Machine learning offers powerful tools for this purpose.
The Fusion: MLS indicator, among others, incorporates an Approximate Nearest Neighbors (ANN)* algorithm, a machine learning classification technique, and allows the selection of various distance functions.
This flexibility sets Fusion: MLS apart from existing solutions. The available distance functions include the following (a brief sketch of several of them appears after the list):
Euclidean: Standard distance metric, commonly used as a default.
Chebyshev: Also known as maximum value distance.
Manhattan: Sum of absolute differences.
Minkowski: Generalized metric that includes Euclidean and Manhattan as special cases.
Mahalanobis: Measures distance between points in a correlated space.
Lorentzian: Known for its robustness to outliers and noise.
*For a deeper understanding of the Approximate Nearest Neighbors (ANN) algorithm, traders are encouraged to refer to the relevant articles that can be found in the public domain.
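For orientation, the point-wise form of several of the distance functions listed above can be written directly in Pine. The Lorentzian form below uses the log-based definition common in the time-series literature; treating that as this script's internal formula is an assumption, since the source is not published.
//@version=5
indicator("Distance function sketches", overlay=false)
// Element-wise distances between two equal-length feature arrays (illustrative helpers).
euclidean(float[] a, float[] b) =>
    s = 0.0
    for i = 0 to array.size(a) - 1
        s += math.pow(array.get(a, i) - array.get(b, i), 2)
    math.sqrt(s)
manhattan(float[] a, float[] b) =>
    s = 0.0
    for i = 0 to array.size(a) - 1
        s += math.abs(array.get(a, i) - array.get(b, i))
    s
chebyshev(float[] a, float[] b) =>
    m = 0.0
    for i = 0 to array.size(a) - 1
        m := math.max(m, math.abs(array.get(a, i) - array.get(b, i)))
    m
// Lorentzian distance as commonly defined for time series: sum of log(1 + |a_i - b_i|).
lorentzian(float[] a, float[] b) =>
    s = 0.0
    for i = 0 to array.size(a) - 1
        s += math.log(1 + math.abs(array.get(a, i) - array.get(b, i)))
    s
p = array.from(1.0, 2.0, 3.0)
q = array.from(2.0, 0.5, 3.5)
plot(euclidean(p, q), "Euclidean")
plot(lorentzian(p, q), "Lorentzian")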
Alternative scoring system
Fusion: MLS also includes a custom scoring alternative based on directional price action.
"Combined: Directional" and "Alpha: Directional" scoring types represent our own directional change algorithm, simple yet effective in displaying trend direction changes early on. They are visualized by color changes when scoring becomes below or above zero.
Changes in scoring quickly reflect shifts in buyer and seller sentiment.
Traders may choose signals by Color Change in the indicator settings to get alerts when the scoring color shifts, rather than waiting until the histogram crosses the zero level.
Application in Trading
Machine learning classification has become an integral part of modern trading, offering innovative ways to analyze and interpret financial data.
Many algorithmic trading systems leverage ML classification to automate trading decisions. By continuously learning from real-time data, these systems can adapt to changing market conditions and execute trades with increased efficiency and accuracy.
ML classification allows for the development of tailored trading strategies as traders can select specific algorithms, dimensions, and filters that align with their trading style, goals, and the particular market they are operating.
We have integrated ML classification with traditional trading tools, such as moving averages and technical indicators. This fusion creates a more robust analysis framework, combining the strengths of classical techniques with the adaptability of machine learning.
Whether used independently or in conjunction with other tools, ML classification represents a significant advancement in trading technology, opening new avenues for exploration, innovation, and success in the financial world.
ML: Weighting System
The Fusion: MLS indicator introduces a unique weighting system that allows traders to customize the influence of various technical indicators in the machine learning process. This feature is not only innovative but also provides a level of control and adaptability that sets it apart from other indicators.
Customizable Weights
The weighting system allows users to assign specific weights to different indicators such as Moon Lander, RSI, MACD, Money Flow, Bollinger Bands, Cap Line, Z-Pack, Squeeze Momentum*, and MA Crossover. These weights can be adjusted manually, providing the ability to emphasize or de-emphasize specific indicators based on the trader's strategy or market conditions.
*Note: we determined via testing that the popular "Squeeze" indicator can actually be well replicated by simply using inputs of 15 & 199 in the bedrock indicator, MACD; while we employed the standard "Squeeze" formula (developed by J. Carter) in Fusion: MLS, traders are hereby made aware of our research findings regarding such.
The weighting system's importance lies in its ability to provide a more nuanced and personalized analysis. By adjusting the weights of different indicators, a trader focusing on momentum strategies might assign higher weights to the Squeeze Momentum and MA Crossover indicators, while a trader looking for volatility might emphasize RSI and Bollinger Bands.
The ability to customize weights adds a layer of complexity and adaptability that is rare in standard machine-learning indicators.
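As a schematic of what such a weighting system amounts to, the sketch below normalizes a few indicator dimensions to a comparable scale and blends them with user-assigned weights; it is a generic illustration, not Fusion: MLS's internal code, and the dimensions and scaling choices are assumptions.
//@version=5
indicator("Weighted dimension blend sketch", overlay=false)
wRsi  = input.float(1.0, "RSI weight")
wMacd = input.float(1.0, "MACD weight")
wBb   = input.float(0.5, "Bollinger weight")
// bring each dimension onto a roughly comparable, zero-centered scale
rsiDim  = (ta.rsi(close, 14) - 50) / 50
macdRaw = ta.ema(close, 12) - ta.ema(close, 26)
macdDim = macdRaw / (ta.stdev(macdRaw, 100) + 1e-10)
basis   = ta.sma(close, 20)
bbDim   = (close - basis) / (2 * ta.stdev(close, 20) + 1e-10)
totalW = wRsi + wMacd + wBb
score  = (wRsi * rsiDim + wMacd * macdDim + wBb * bbDim) / totalW
plot(score, "Weighted score", score > 0 ? color.green : color.red, style=plot.style_columns)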
Custom Indicators: Moon Lander
The "Moon Lander" is not just a catchy name; it's a robust feature inspired by principles from aerospace engineering and offers a unique perspective on trading analysis. Here's a conceptual overview:
Fast EMA and Kalman Matrix
"Moon Lander" incorporates both a Fast Exponential Moving Average (EMA) and a Kalman Matrix in its design. These two elements are combined to create a histogram, providing a specific approach to data analysis.
The Kalman Matrix, or Kalman Filter, is a mathematical concept used for estimating variables that can be measured indirectly and contain noise or uncertainty. It's a standard tool in machine learning and control systems, known for its ability to provide optimal estimates based on observed data.
Kalman Filter: A Navigational Tool
The Kalman filter, an essential part of "Moon Lander," is a mathematical concept known for its applications in the navigation and control systems used by NASA in the Apollo program:
Guidance in Uncertainty: Just as the Kalman filter helped guide complex aerospace missions through uncertain paths, it assists traders in navigating the often unpredictable financial markets.
Filtering Noise: In trading, the Kalman filter serves to filter out market noise, allowing traders to focus on the underlying trends.
Predictive Capabilities: Its ability to predict future states makes it a valuable tool for forecasting market movements and trend directions.
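To ground the description above, here is a minimal one-dimensional Kalman filter over price, in its textbook scalar form; it is offered as an illustration of the concept rather than as the Moon Lander implementation.
//@version=5
indicator("Scalar Kalman filter sketch", overlay=true)
Q = input.float(0.05, "Process noise (Q)")
R = input.float(1.0,  "Measurement noise (R)")
var float est = na   // state estimate
var float p   = 1.0  // estimate variance
if na(est)
    est := close
else
    // predict: variance grows by the process noise
    p += Q
    // update: blend the prediction with the new observation
    k = p / (p + R)              // Kalman gain
    est := est + k * (close - est)
    p := (1 - k) * p
plot(est, "Kalman estimate", color=color.teal, linewidth=2)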
Custom Indicators: Cap Line and Z-Pack
Fusion: MLS integrates our additional proprietary custom indicators that have been published on TradingView earlier:
Cap Line: Delve into the specific functionalities and applications of our proprietary "Cap Line" indicator in the published description on TradingView.
Z-Pack: Explore the analytical perspectives, focused on the z-score methodology, and custom "Z-Pack" indicator by reviewing the published description on TradingView.
Buy/Sell Signal Generation Algorithms
Fusion: MLS offers various options for generating buy/sell signals, tailored to different trading strategies and perspectives:
Fusion: Allows traders to select any number of dimensions to receive buy/sell signals from, offering customized signal generation.
ML: Utilizes the machine learning ANN distance for signal generation.
Color Change: Generates signals when the selected scoring type changes color.
Displayed Dimension, Alpha Dimension: Generate signals based on specific selected dimensions.
These algorithms provide flexibility in determining buy/sell signals, catering to different trading styles and market conditions.
Filters
Filters are used to refine and selectively include or exclude signals based on specific criteria. Rather than generating signals, these filters act as gatekeepers, ensuring that only the signals meeting certain conditions are considered. Here's an overview of the filters used:
Dynamic State Predictor (DSP)
The DSP employs the Kalman Matrix to evaluate existing signals by comparing the fast and slow moving averages, both processed through the Kalman Matrix. Based on the relationship between these averages, the DSP may exclude specific signals, depending on whether they align with upward or downward trends.
Average Directional Index (ADX)
The ADX filter evaluates the strength of existing trends and filters out signals that do not meet the specified ADX threshold and length, focusing on significant market movements.
Feature Engineering: RSI
Applies a filter to the existing signals, clearing out those that do not meet the criteria for RSI overbought or oversold threshold condition.
Feature Engineering: MACD
Assesses existing signals to identify changes in the strength, direction, momentum, and duration of a trend, filtering out those that do not align with MACD trend direction.
The Visual Component
The machine learning component is an internal component. However, the indicator also offers an equally important and useful visual component. It is a graphical representation of the multiple technical analysis dimensions, that can be combined in various ways (where the name "Fusion" comes from), allowing traders to visualize the underlying data and its analysis.
Displayed Dimension: Visualization and Normalization
The Fusion: MLS indicator offers a "Displayed Dimension" feature that visualizes various dimensions as a histogram. These dimensions may include RSI, MAs, BBs, MACD, etc.
RSI Dimension on the image + ML signals
Normalization: Each dimension is normalized. If any dimension has extreme values, a Fisher transformation is applied to bring them within a reasonable range.
Combined Dimension: When selecting the "Combined" option, the normalized values of the selected dimensions are combined using techniques such as standardization, normalization, or winsorization (see the sketch below). This flexibility enables tailored visualization and analysis.
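The sketch below shows generic versions of the three techniques named above (standardization, normalization, winsorization) applied to a single example dimension; it is illustrative only and does not reproduce the indicator's exact scaling.
//@version=5
indicator("Dimension scaling sketch", overlay=false)
len = input.int(100, "Window")
raw = ta.rsi(close, 14)   // any dimension could stand in here
// standardization (z-score)
z = (raw - ta.sma(raw, len)) / (ta.stdev(raw, len) + 1e-10)
// normalization (min-max over the window, rescaled to -1..1)
hi = ta.highest(raw, len)
lo = ta.lowest(raw, len)
mm = 2 * (raw - lo) / (hi - lo + 1e-10) - 1
// winsorization (clamp the z-score at +/- 3 standard deviations)
wz = math.max(math.min(z, 3), -3)
plot(z,  "Standardized", color=color.blue)
plot(mm, "Min-max",      color=color.orange)
plot(wz, "Winsorized",   color=color.purple)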
Alpha Dimension: Enhancing Analysis
The "Alpha Dimension" feature allows traders to select an additional dimension alongside the Displayed Dimension. This facilitates a combined analysis, enhancing the depth of insights.
Theme Selection
Fusion: MLS offers various themes such as "Sailfish", "Iceberg", "Moon", "Perl", "Candy" and "Monochrome" Traders can select a theme that resonates with their preference, enhancing visual appeal. There is also a "Custom" theme available that allows the user to choose the colors of the theme.
Customizing Fusion: MLS for Various Markets and Strategies
Fusion: MLS is designed with customization in mind. Traders can tailor the indicator to suit various markets and trading strategies. Selecting specific dimensions allows it to align with individual trading goals.
Selecting Dimensions: Choose the dimensions that resonate with your trading approach, whether focusing on trend-following, momentum, or other strategies.
Adjusting Parameters: Fine-tune the parameters of each dimension, including custom ones like "Moon Lander," to suit specific market conditions.
Theme Customization: Select a theme that aligns with your visual preferences, enhancing your chart's readability and appeal.
Utilizing Research: Leverage the underlying algorithms and research, such as machine learning classification by ANN and the Kalman filter, to deepen your understanding and application of Fusion: MLS.
Alerts
The indicator includes an alerting system that notifies traders when new buy or sell signals are detected.
Disclaimer
The information provided herein is intended for informational purposes only and should not be construed as investment advice, endorsement, nor a recommendation to buy or sell any financial instruments. Fusion: MLS is a technical analysis tool, and like all tools, it should be used with caution and in conjunction with other forms of analysis.
Traders and investors are encouraged to consult with a licensed financial professional and conduct their own research before making any trading or investment decisions. Past performance of the Fusion: MLS indicator or any trading strategy does not guarantee future results, and all trading involves risk. Users of Fusion: MLS should understand the underlying algorithms and assumptions and consider their individual risk tolerance and investment goals when using this tool.
AI Moving Average (Expo)
█ Overview
The AI Moving Average indicator is a trading tool that uses an AI-based K-nearest neighbors (KNN) algorithm to analyze and interpret patterns in price data. It combines the logic of a traditional moving average with artificial intelligence, creating an adaptive and robust indicator that can identify strong trends and key market levels.
█ How It Works
The algorithm collects data points and applies a KNN-weighted approach to classify price movement as either bullish or bearish. For each data point, the algorithm checks if the price is above or below the calculated moving average. If the price is above the moving average, it's labeled as bullish (1), and if it's below, it's labeled as bearish (0). The K-Nearest Neighbors (KNN) is an instance-based learning algorithm used in classification and regression tasks. It works on a principle of voting, where a new data point is classified based on the majority label of its 'k' nearest neighbors.
The algorithm's use of a KNN-weighted approach adds a layer of intelligence to the traditional moving average analysis. By considering not just the price relative to a moving average but also taking into account the relationships and similarities between different data points, it offers a nuanced and robust classification of price movements.
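Read literally, the description above amounts to a routine like the following compact sketch (written under the stated logic, not taken from Expo's source): label each stored bar by whether price sat above or below its moving average, find the k stored bars closest to the current one, and let the majority label color the average.
//@version=5
indicator("KNN-labeled moving average sketch", overlay=true)
k   = input.int(5,   "k (neighbors)")
n   = input.int(100, "n (data points)")
len = input.int(50,  "MA length")
ma = ta.sma(close, len)
var feats  = array.new_float()   // stored feature values (price here, for simplicity)
var labels = array.new_int()     // 1 = price was above its MA (bullish), 0 = below (bearish)
// vote among the k nearest stored points by absolute feature distance
bullish = false
if array.size(feats) > 0
    dists = array.new_float()
    lCopy = array.copy(labels)
    for i = 0 to array.size(feats) - 1
        array.push(dists, math.abs(close - array.get(feats, i)))
    votes = 0
    kk = math.min(k, array.size(dists))
    for j = 1 to kk
        idx = array.indexof(dists, array.min(dists))   // nearest remaining neighbor
        votes += array.get(lCopy, idx)
        array.remove(dists, idx)
        array.remove(lCopy, idx)
    bullish := votes * 2 > kk
// store the current bar for future classification, keeping only the last n points
array.push(feats, close)
array.push(labels, close > ma ? 1 : 0)
if array.size(feats) > n
    array.shift(feats)
    array.shift(labels)
plot(ma, "KNN-classified MA", bullish ? color.green : color.red, 2)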
This combination of data collection, labeling, and KNN-weighted classification turns the AI Moving Average (Expo) Indicator into a dynamic tool that can adapt to changing market conditions, making it suitable for various trading strategies and market environments.
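For readers who want to see the mechanics, here is a minimal Python sketch of the labeling-and-voting idea described above. It is an illustration only, not the indicator's Pine Script source: the absolute price difference used as the distance measure, the SMA-based labels, and the name knn_ma_signal are all assumptions.

import numpy as np

def knn_ma_signal(closes, k=5, n=100, length=20):
    # Label each historical bar as bullish (1) if its close sat above the SMA,
    # bearish (0) otherwise, then let the k historical bars whose closes are
    # nearest to the current close vote on the current bar's label.
    closes = np.asarray(closes, dtype=float)
    ma = np.convolve(closes, np.ones(length) / length, mode="valid")  # simple moving average
    closes = closes[length - 1:]                                      # align closes with the SMA
    labels = (closes > ma).astype(int)

    current = closes[-1]
    hist_closes, hist_labels = closes[-n - 1:-1], labels[-n - 1:-1]   # the n bars before the current one
    distances = np.abs(hist_closes - current)                         # assumed distance: absolute price difference
    nearest = np.argsort(distances)[:k]                               # indices of the k nearest neighbors
    votes = hist_labels[nearest].sum()
    return 1 if votes > k / 2 else 0                                  # majority vote: 1 = bullish, 0 = bearish

# Usage with random-walk data (illustration only).
prices = 100 + np.cumsum(np.random.randn(500))
print(knn_ma_signal(prices))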
█ How to Use
Dynamic Trend Recognition
The color-coded moving average line helps traders quickly identify market trends: green represents a bullish trend, red a bearish trend, and blue a neutral market.
Trend Strength
By adjusting certain settings within the AI Moving Average (Expo) Indicator, such as using a higher 'k' value and increasing the number of data points, traders can gain real-time insights into strong trends. A higher 'k' value makes the prediction model more resilient to noise, emphasizing pronounced trends, while more data points provide a comprehensive view of the market direction. Together, these adjustments enable the indicator to display only robust trends on the chart, allowing traders to focus exclusively on significant market movements and strong trends.
Key SR Levels
Traders can utilize the indicator to identify key support and resistance levels that are derived from the prevailing trend movement. The derived support and resistance levels are not just based on historical data but are dynamically adjusted with the current trend, making them highly responsive to market changes.
█ Settings
k (Neighbors): Number of neighbors in the KNN algorithm. Increasing 'k' makes predictions more resilient to noise but may decrease sensitivity to local variations.
n (DataPoints): Number of data points considered in AI analysis. This affects how the AI interprets patterns in the price data.
maType (Select MA): Type of moving average applied. Options allow for different smoothing techniques to emphasize or dampen aspects of price movement.
length: Length of the moving average. A greater length creates a smoother curve but might lag recent price changes.
dataToClassify: Source data for classifying price as bullish or bearish. It can be adjusted to consider different aspects of price information.
dataForMovingAverage: Source data for calculating the moving average. Different selections may emphasize different aspects of price movement.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Adaptivity: Measures of Dominant Cycles and Price Trend [Loxx]Adaptivity: Measures of Dominant Cycles and Price Trend is an indicator that outputs adaptive lengths using various methods for dominant cycle and price trend timeframe adaptivity. While the information output from this indicator might be useful for the average trader in one-off circumstances, it is really meant for those who need a quick comparison of dynamic length outputs and wish to fine-tune algorithms and/or create adaptive indicators.
This indicator compares adaptive output lengths of all publicly known adaptive measures. Additional adaptive measures will be added as they are discovered and made public.
The first release of this indicator includes six measures; three more will be added in future updates. Please check back regularly for new measures.
Ehlers:
Autocorrelation Periodogram
Band-pass
Instantaneous Cycle
Hilbert Transformer
Dual Differentiator
Phase Accumulation (future release)
Homodyne (future release)
Jurik:
Composite Fractal Behavior (CFB)
Adam White:
Vertical Horizontal Filter (VHF) (future release)
What is an adaptive cycle, and what is Ehlers' Autocorrelation Periodogram Algorithm?
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 135:
"Adaptive filters can have several different meanings. For example, Perry Kaufman's adaptive moving average (KAMA) and Tushar Chande's variable index dynamic average (VIDYA) adapt to changes in volatility . By definition, these filters are reactive to price changes, and therefore they close the barn door after the horse is gone.The adaptive filters discussed in this chapter are the familiar Stochastic , relative strength index (RSI), commodity channel index (CCI), and band-pass filter.The key parameter in each case is the look-back period used to calculate the indicator. This look-back period is commonly a fixed value. However, since the measured cycle period is changing, it makes sense to adapt these indicators to the measured cycle period. When tradable market cycles are observed, they tend to persist for a short while.Therefore, by tuning the indicators to the measure cycle period they are optimized for current conditions and can even have predictive characteristics.
The dominant cycle period is measured using the Autocorrelation Periodogram Algorithm. That dominant cycle dynamically sets the look-back period for the indicators. I employ my own streamlined computation for the indicators that provide smoother and easier to interpret outputs than traditional methods. Further, the indicator codes have been modified to remove the effects of spectral dilation.This basically creates a whole new set of indicators for your trading arsenal."
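To make the "tune the indicator to the measured cycle period" idea concrete, below is a hedged Python sketch that drives an RSI lookback from an externally supplied dominant cycle. The autocorrelation periodogram itself is not reproduced here (a fixed 20-bar measurement stands in for it), the half-period lookback is just one common choice, and the helper names are assumptions rather than Ehlers' published code.

import numpy as np

def simple_rsi(closes, length):
    # RSI over the trailing `length` closes (simple sums rather than Wilder smoothing).
    diffs = np.diff(closes[-(length + 1):])
    gains = diffs[diffs > 0].sum()
    losses = -diffs[diffs < 0].sum()
    if gains + losses == 0:
        return 50.0
    return 100.0 * gains / (gains + losses)

def adaptive_rsi(closes, dominant_cycle):
    # Adaptive lookback: track the measured dominant cycle (half the measured
    # period is used here as one common choice) instead of a fixed 14 bars.
    lookback = max(2, int(round(dominant_cycle / 2)))
    return simple_rsi(np.asarray(closes, dtype=float), lookback)

# Usage: the dominant cycle would come from a measurement such as the
# autocorrelation periodogram; a fixed 20-bar value stands in for it here.
prices = 100 + np.cumsum(np.random.randn(300))
print(round(adaptive_rsi(prices, dominant_cycle=20), 1))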
What is the Hilbert Transformer?
An analytic signal allows for time-variable parameters and is a generalization of the phasor concept, which is restricted to time-invariant amplitude, phase, and frequency. The analytic representation of a real-valued function or signal facilitates many mathematical manipulations of the signal. For example, computing the phase of a signal or the power in the wave is much simpler using analytic signals.
The Hilbert transformer is the technique to create an analytic signal from a real one. The conventional Hilbert transformer is theoretically an infinite-length FIR filter. Even when the filter length is truncated to a useful but finite length, the induced lag is far too large to make the transformer useful for trading.
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, pages 186-187:
"I want to emphasize that the only reason for including this section is for completeness. Unless you are interested in research, I suggest you skip this section entirely. To further emphasize my point, do not use the code for trading. A vastly superior approach to compute the dominant cycle in the price data is the autocorrelation periodogram. The code is included because the reader may be able to capitalize on the algorithms in a way that I do not see. All the algorithms encapsulated in the code operate reasonably well on theoretical waveforms that have no noise component. My conjecture at this time is that the sample-to-sample noise simply swamps the computation of the rate change of phase, and therefore the resulting calculations to find the dominant cycle are basically worthless.The imaginary component of the Hilbert transformer cannot be smoothed as was done in the Hilbert transformer indicator because the smoothing destroys the orthogonality of the imaginary component."
What is the Dual Differentiator, a subset of Hilbert Transformer?
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 187:
"The first algorithm to compute the dominant cycle is called the dual differentiator. In this case, the phase angle is computed from the analytic signal as the arctangent of the ratio of the imaginary component to the real component. Further, the angular frequency is defined as the rate change of phase. We can use these facts to derive the cycle period."
What is the Phase Accumulation, a subset of Hilbert Transformer?
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 189:
"The next algorithm to compute the dominant cycle is the phase accumulation method. The phase accumulation method of computing the dominant cycle is perhaps the easiest to comprehend. In this technique, we measure the phase at each sample by taking the arctangent of the ratio of the quadrature component to the in-phase component. A delta phase is generated by taking the difference of the phase between successive samples. At each sample we can then look backwards, adding up the delta phases.When the sum of the delta phases reaches 360 degrees, we must have passed through one full cycle, on average.The process is repeated for each new sample.
The phase accumulation method of cycle measurement always uses one full cycle's worth of historical data.This is both an advantage and a disadvantage.The advantage is the lag in obtaining the answer scales directly with the cycle period.That is, the measurement of a short cycle period has less lag than the measurement of a longer cycle period. However, the number of samples used in making the measurement means the averaging period is variable with cycle period. longer averaging reduces the noise level compared to the signal.Therefore, shorter cycle periods necessarily have a higher out- put signal-to-noise ratio."
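A hedged Python sketch of the phase accumulation idea follows: sum the per-bar phase changes backwards until 360 degrees (2*pi radians) has been covered, and take the bar count as the period. The synthetic data, the scipy-based analytic signal, and the 60-bar lookback cap are assumptions for the demo rather than Ehlers' published algorithm.

import numpy as np
from scipy.signal import hilbert

# Synthetic price with a 20-bar cycle (assumption for the demo).
n = 400
t = np.arange(n)
price = np.sin(2 * np.pi * t / 20) + 0.1 * np.random.randn(n)

z = hilbert(price - price.mean())
phase = np.unwrap(np.angle(z))                   # instantaneous phase in radians
delta_phase = np.diff(phase, prepend=phase[0])   # phase advance per bar

def phase_accumulation_period(delta, max_lookback=60):
    # Walk backwards from each bar, summing per-bar phase changes until a full
    # 360 degrees (2*pi radians) has accumulated; the bar count is the period.
    periods = np.full(len(delta), np.nan)
    for bar in range(len(delta)):
        acc = 0.0
        for back in range(min(bar + 1, max_lookback)):
            acc += delta[bar - back]
            if acc >= 2 * np.pi:
                periods[bar] = back + 1
                break
    return periods

dc = phase_accumulation_period(delta_phase)
print(round(np.nanmedian(dc[50:]), 1))   # close to 20 bars for this synthetic input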
What is the Homodyne, a subset of Hilbert Transformer?
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 192:
"The third algorithm for computing the dominant cycle is the homodyne approach. Homodyne means the signal is multiplied by itself. More precisely, we want to multiply the signal of the current bar with the complex value of the signal one bar ago. The complex conjugate is, by definition, a complex number whose sign of the imaginary component has been reversed."
What is the Instantaneous Cycle?
The Instantaneous Cycle Period Measurement was authored by John Ehlers; it is built upon his Hilbert Transform Indicator.
From Cybernetic Analysis for Stocks and Futures: Cutting-Edge DSP Technology to Improve Your Trading by John F. Ehlers, 2004, page 107:
"It is obvious that cycles exist in the market. They can be found on any chart by the most casual observer. What is not so clear is how to identify those cycles in real time and how to take advantage of their existence. When Welles Wilder first introduced the relative strength index (rsi), I was curious as to why he selected 14 bars as the basis of his calculations. I reasoned that if i knew the correct market conditions, then i could make indicators such as the rsi adaptive to those conditions. Cycles were the answer. I knew cycles could be measured. Once i had the cyclic measurement, a host of automatically adaptive indicators could follow.
Measurement of market cycles is not easy. The signal-to-noise ratio is often very low, making measurement difficult even using a good measurement technique. Additionally, the measurements theoretically involve simultaneously solving a triple infinity of parameter values. The parameters required for the general solutions were frequency, amplitude, and phase. Some standard engineering tools, like fast fourier transforms (ffs), are simply not appropriate for measuring market cycles because ffts cannot simultaneously meet the stationarity constraints and produce results with reasonable resolution. Therefore i introduced maximum entropy spectral analysis (mesa) for the measurement of market cycles. This approach, originally developed to interpret seismographic information for oil exploration, produces high-resolution outputs with an exceptionally short amount of information. A short data length improves the probability of having nearly stationary data. Stationary data means that frequency and amplitude are constant over the length of the data. I noticed over the years that the cycles were ephemeral. Their periods would be continuously increasing and decreasing. Their amplitudes also were changing, giving variable signal-to-noise ratio conditions. Although all this is going on with the cyclic components, the enduring characteristic is that generally only one tradable cycle at a time is present for the data set being used. I prefer the term dominant cycle to denote that one component. The assumption that there is only one cycle in the data collapses the difficulty of the measurement process dramatically."
What is the Band-pass Cycle?
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 47:
"Perhaps the least appreciated and most underutilized filter in technical analysis is the band-pass filter. The band-pass filter simultaneously diminishes the amplitude at low frequencies, qualifying it as a detrender, and diminishes the amplitude at high frequencies, qualifying it as a data smoother. It passes only those frequency components from input to output in which the trader is interested. The filtering produced by a band-pass filter is superior because the rejection in the stop bands is related to its bandwidth. The degree of rejection of undesired frequency components is called selectivity. The band-stop filter is the dual of the band-pass filter. It rejects a band of frequency components as a notch at the output and passes all other frequency components virtually unattenuated. Since the bandwidth of the deep rejection in the notch is relatively narrow and since the spectrum of market cycles is relatively broad due to systemic noise, the band-stop filter has little application in trading."
From Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 59:
"The band-pass filter can be used as a relatively simple measurement of the dominant cycle. A cycle is complete when the waveform crosses zero two times from the last zero crossing. Therefore, each successive zero crossing of the indicator marks a half cycle period. We can establish the dominant cycle period as twice the spacing between successive zero crossings."
What is Composite Fractal Behavior (CFB)?
All around you, mechanisms adjust themselves to their environment, from simple thermostats that react to air temperature to computer chips in modern cars that respond to changes in engine temperature, r.p.m., torque, and throttle position. It was only a matter of time before fast desktop computers applied the mathematics of self-adjustment to systems that trade the financial markets.
Unlike basic systems with fixed formulas, an adaptive system adjusts its own equations. For example, start with a basic channel breakout system that uses the highest closing price of the last N bars as a threshold for detecting breakouts on the up side. An adaptive and improved version of this system would adjust N according to market conditions, such as momentum, price volatility or acceleration.
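To make that concrete, here is a toy Python sketch of a channel breakout whose lookback N adapts to recent volatility. The adaptation rule and function names are invented for illustration; they are not Ruggiero's method, and they are not Jurik's CFB.

import numpy as np

def adaptive_breakout_lookback(closes, base_n=20, vol_window=50):
    # Toy adaptation rule: shorten N when recent volatility runs above its
    # long-run level, lengthen it when volatility is quiet.
    closes = np.asarray(closes, dtype=float)
    returns = np.diff(closes) / closes[:-1]
    recent = np.std(returns[-vol_window:])
    longrun = np.std(returns)
    ratio = longrun / (recent + 1e-12)
    return int(np.clip(base_n * ratio, 5, 100))

def breakout_signal(closes, n):
    # Channel breakout: long when the latest close exceeds the highest close
    # of the prior n bars.
    closes = np.asarray(closes, dtype=float)
    return closes[-1] > closes[-n - 1:-1].max()

# Usage with random-walk data (illustration only).
prices = 100 + np.cumsum(np.random.randn(500))
n = adaptive_breakout_lookback(prices)
print(n, breakout_signal(prices, n))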
Since many systems are based directly or indirectly on cycles, another useful measure of market condition is the periodic length of a price chart's dominant cycle (DC), the cycle with the greatest influence on price action.
The utility of this new DC measure was noted by author Murray Ruggiero in the January '96 issue of Futures Magazine, in which Mr. Ruggiero used it to adaptively adjust the value of N in a channel breakout system. He then simulated trading 15 years of D-Mark futures in order to compare its performance to a similar system that had a fixed optimal value of N. The adaptive version produced 20% more profit!
This DC index utilized the popular MESA algorithm (a formulation by John Ehlers adapted from Burg's maximum entropy algorithm, MEM). Unfortunately, the DC approach is problematic when the market has no real dominant cycle momentum, because the mathematics will produce a value whether or not one actually exists! Therefore, we developed a proprietary indicator that does not presuppose the presence of market cycles. It's called CFB (Composite Fractal Behavior) and it works well whether or not the market is cyclic.
CFB examines price action for a particular fractal pattern, categorizes the patterns by size, and then outputs a composite fractal size index. This index is smooth, timely, and accurate.
Essentially, CFB reveals the length of the market's trending action time frame. Long trending activity produces a large CFB index and short choppy action produces a small index value. Investors have found many applications for CFB which involve scaling other existing technical indicators adaptively, on a bar-to-bar basis.
What is VHF Adaptive Cycle?
Vertical Horizontal Filter (VHF) was created by Adam White to identify trending and ranging markets. VHF measures the level of trend activity, similar to ADX/DI. The Vertical Horizontal Filter does not, by itself, generate trading signals; instead, it determines whether signals should be taken from trend or momentum indicators. Using this trend information, one can then derive an average cycle length.
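A minimal Python sketch of the VHF calculation follows (the 28-bar default and the helper name vhf are assumptions for illustration):

import numpy as np

def vhf(closes, length=28):
    # Vertical Horizontal Filter: (highest close - lowest close) over the window,
    # divided by the sum of absolute close-to-close changes over the same window.
    # Readings near 1 suggest trending action; low readings suggest a range.
    closes = np.asarray(closes[-(length + 1):], dtype=float)
    numerator = closes[1:].max() - closes[1:].min()
    denominator = np.abs(np.diff(closes)).sum()
    return numerator / denominator if denominator else 0.0

# Illustration: a steady uptrend scores higher than a choppy range.
trend = np.linspace(100, 120, 60)
chop = 100 + 2 * np.sin(np.arange(60))
print(round(vhf(trend), 2), round(vhf(chop), 2))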