Application of the Taylor Polynomial in Stock Exchange Market Analysis

Published on 27 April 2025 at 01:06

Abstract. This paper examines the use of Taylor polynomials and related mathematical tools to analyze and forecast stock market trends. We study the EUR/USD exchange rate (one-minute bars) from 01/13/2017 to 02/15/2018 (420,544 observations). Key techniques include moving averages (12- and 26-minute exponential moving averages), time-series decomposition, and fitting polynomial trends (including a 6th-degree Taylor polynomial) to the price data. The goal is to detect trend inflection points – points where the second derivative of the price function crosses zero – indicating a change in trend. Using a combination of derivative analysis and polynomial fitting, we construct an econometric forecasting model. Our 5-step-ahead projections capture the general direction of subsequent price movement, supporting the validity of the approach. We present the methods in a clear, formal manner and relate them to recent advances in financial modeling and machine learning. Results suggest that Taylor-based approximations can complement modern forecasting tools, though the high noise level in financial series imposes limits on accuracy.

Introduction and Context

Financial markets are characterized by complex, often conflicting theories and heuristics, and rigorous mathematical methods are needed to evaluate them. This study aims to bring analytic clarity to market analysis by leveraging derivatives and Taylor series from calculus. In calculus (co-developed by Newton and Leibniz), the derivative of a function describes its instantaneous rate of change, allowing one to determine if a function is increasing or decreasing, locate its extrema, and identify inflection points (where the function’s concavity changes). In economic applications, derivatives play a central role in modeling growth rates and accelerations of economic indicators. The Taylor series (Brook Taylor, 1715) extends this idea by approximating a smooth function near a point by a polynomial whose terms involve higher-order derivatives.

In finance, first- and second-order derivatives often correspond to familiar concepts: the duration (sensitivity of a bond’s price to small interest-rate changes) and convexity. These are essentially the first and second derivatives of a bond-pricing function with respect to yield. More generally, Taylor polynomials can be used to approximate any smooth price or rate process over a small interval. In this study, we investigate how Taylor-based polynomials can be applied to stock exchange data to detect turning points in price trends. Concretely, we analyze minute-by-minute EUR/USD exchange data, applying moving averages and polynomial trend models (up to 6th degree) in order to identify potential inflection points of the price trajectory. The core idea is that when the second derivative of the price function becomes zero (with a nonzero third derivative), the trend is changing direction – exactly the inflection condition in calculus. In this way, Taylor polynomials become a tool to quantify and forecast market movements.

Literature Review

Applied financial and econometric research has long employed mathematical approximations and time-series models. Traditional finance theory often uses present-value formulas and static tables to assess securities under different interest scenarios. A more refined approach uses the duration-convexity (Taylor expansion) method: treating the change in an asset’s price as a Taylor polynomial in the change of interest rates. In that method, the first derivative (duration) gives the linear sensitivity, while the second derivative (convexity) captures curvature. This framework highlights that higher-order derivatives are powerful analytical tools in portfolio risk management. In fact, second-order (and occasionally third-order) Taylor approximations are common in bond pricing and risk analysis because they adjust for nonlinearity in yield–price relationships. In principle, any smooth price or rate process can be locally expanded as $P(x) = P(0) + P'(0)\,x + \tfrac{1}{2}P''(0)\,x^2 + R_3(x)$, with a remainder term $R_3(x)$ that becomes negligible for small $x$. Such Taylor formulas facilitate quantitative risk estimates under interest-rate shocks.
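As a worked illustration with hypothetical figures: for a bond with modified duration $D = 5$ and convexity $C = 30$, a yield increase of $\Delta y = 0.01$ (100 basis points) gives the second-order estimate $\frac{\Delta P}{P} \approx -D\,\Delta y + \tfrac{1}{2}C(\Delta y)^2 = -5(0.01) + \tfrac{1}{2}(30)(0.01)^2 = -0.0485$, i.e. a price drop of about 4.85% rather than the 5% implied by duration alone; the convexity correction is precisely the quadratic term of the Taylor expansion.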

Beyond bonds, Taylor methods have been studied in other derivative pricing contexts. For example, Burtnyak and Malytska (2018) use Taylor expansions to approximate option prices and implied volatilities under stochastic volatility models. They show that higher-order series solutions can simplify solving the underlying partial differential equations. These kinds of approximations allow traders to predict option price movements step-by-step and adapt strategies accordingly.

In the context of stock price forecasting, modern research increasingly combines classical time-series techniques with advanced data-driven methods. Digital signal processing (DSP) concepts – originally developed for engineering – have proven useful for preprocessing and filtering financial time series. As Okpor (2020) notes, the sheer volume and speed of stock market data make traditional statistics impractical; instead, one applies signal-processing filters to extract trends and reduce noise. Okpor explains that many adaptive filters and prediction algorithms (e.g. Kalman filters) can achieve high accuracy in forecasting stock trends, and these are routinely implemented in environments such as Python.

In the last decade, machine learning and neural networks have become mainstream in stock forecasting. With the rapid development of computing power and data availability, data-driven neural network models now dominate stock prediction tasks. These models (e.g. recurrent networks, convolutional networks, and transformers) learn complex patterns from historical price and volume data. Recent reviews emphasize that ML/DL techniques offer substantial advantages over traditional fundamental/technical analysis for prediction. For example, Sonkavde et al. (2022) survey dozens of studies and conclude that machine learning algorithms (SVM, random forest, LSTM, etc.) consistently improve prediction accuracy when properly tuned. Likewise, Bao et al. (2025) report that neural networks and deep learning models are now the standard approach in modern financial forecasting, reflecting their success in handling nonlinearities and large datasets.

Alongside pure ML methods, econometric models remain important. One line of recent work extends generalized autoregressive conditional heteroskedasticity (GARCH) models with polynomial expansions. For instance, Pourkhanali et al. (2023) propose log- and exp-GARCH models in which time-varying parameters are approximated by orthogonal polynomial systems. This functional volatility approach allows the model to capture complex dynamics (such as market-wide shocks) by letting GARCH coefficients change smoothly over time. Empirically, they show that adding polynomial expansions can improve both volatility estimation and Value-at-Risk forecasts relative to standard GARCH. This is an example of how polynomial (Taylor-like) methods can enhance modern financial econometrics.

In summary, the literature shows that Taylor-style polynomial approximations and signal-processing techniques have a long history in finance, particularly for risk analysis and filtering. At the same time, new research (post-2020) heavily emphasizes machine learning models, while also exploring advanced polynomial frameworks (as in Pourkhanali et al.) that combine series expansions with econometric models. Our work builds on these ideas by applying Taylor polynomials directly to stock (FX) price series, complementing both classical and machine-learning methods.

Methodology

Data Source and Preparation. We obtained EUR/USD foreign exchange (FOREX) data from the Yahoo Finance archive (1-minute bar quotes) covering 01/13/2017–02/15/2018. The data are equidistant time series with four price fields: open (O), high (H), low (L), close (C) per minute. In total, there are 420,544 observations. In practice, we converted the raw data into a structured format (e.g. an Excel or CSV table) and then into a time-series object for analysis. Note that each minute yields an OHLC record; for simplicity we work mostly with the closing price series.

Data Processing Steps: Our analysis pipeline involved several phases (brief Python sketches of the key steps appear after the list):

  1. Data Collection: Download historical EUR/USD 1-minute data from a public provider (Yahoo Finance).

  2. Data Structuring: Import the data into a spreadsheet or analysis environment (e.g. Excel, R, Python pandas), ensuring correct time ordering and handling of missing values.

  3. Time-Series Conversion: Format the price series as a time series (with one-minute intervals) and apply any necessary cleansing (e.g. removing zero or duplicate entries).

  4. Descriptive Filtering: Compute moving averages to smooth short-term noise. We specifically calculated the 12-minute and 26-minute exponential moving averages (EMAs) of the closing price to capture short- and medium-term trends. (In a spreadsheet, these are computed recursively: for the 12-min EMA, $EMA_{t} = \frac{2}{12+1} Price_{t} + (1 - \frac{2}{12+1}) EMA_{t-1}$, and similarly for 26-min.) We also examined simple moving averages and log-transformed trends as part of preliminary analysis.

  5. Mathematical Analysis: Calculate first-, second-, and third-order derivatives of the price (or trend) series. Conceptually, the first derivative represents the instantaneous rate of change, and the second derivative indicates curvature (e.g. acceleration or deceleration of price movement). We obtained these derivatives by numerical differentiation of the smoothed series. In practice, we generated the $n$th derivative of the EMA series with a script (custom Python code, or an indicator on a charting platform such as TradingView). The third derivative helps identify inflection points (points where the second derivative crosses zero).

  6. Taylor Polynomial Construction: Using the derivatives computed, we built Taylor polynomials around each time point. For a Taylor expansion of degree $n = 6$, for example, we have $P_{\text{approx}}(x) = P(0) + P'(0)\,x + \tfrac{1}{2}P''(0)\,x^2 + \cdots + \tfrac{1}{6!}P^{(6)}(0)\,x^6$. Here $P(0)$ is the current price, $P'(0)$ the first derivative, and so on. We implemented this sum in the spreadsheet, truncating at $x$ values corresponding to short time steps (i.e. small changes). In effect, we treated each one-minute increment as a new “local” Taylor prediction. In total, we computed polynomial trends of degrees 2 through 6 for the observed period. This ensemble of polynomials allows us to compare how well different orders fit the actual price trajectory.

  7. Econometric Modeling (Time-Series Decomposition): We also conducted a traditional time-series decomposition, separating each price point into long-term trend, cyclical, seasonal, and irregular components. The first derivative corresponds roughly to the linear trend component, the second derivative to cyclical curvature, and the remaining variation to noise. For forecasting, we incorporated this view by focusing on the dominant trend component (as in univariate AR or ARIMA models). Specifically, we attempted simple ARIMA fits and spectral analysis to capture any periodic cycles, confirming that the EUR/USD series exhibits wave-like fluctuations but no strong seasonal pattern. Tools such as R were used for the decomposition and model fitting. Charts (not shown) illustrate this decomposition and the residual analysis.

  8. Forecasting: Once the polynomial model was established, we projected prices forward in time. Using the 6th-degree polynomial (and lower degrees as a check), we forecasted the next 5 one-minute steps (a 5-step horizon). In practical terms, we used the last known state and its derivatives to compute $P_{\text{approx}}(x)$ for $x = 1,2,\dots,5$ minutes ahead. These forecasts were then compared to the actual subsequent data. In parallel, we also implemented a baseline ARIMA model (e.g. ARIMA(1,0,1) estimated on past data) and compared its 5-step forecasts.
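To make steps 4 and 5 concrete, here is a minimal Python sketch, assuming the closing prices are available as a pandas Series; synthetic data stands in for the EUR/USD file, and the inflection filter is an illustrative choice rather than the study's exact rule.

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the 1-minute EUR/USD closing-price series
# described above (the real series would be loaded from the prepared table).
rng = np.random.default_rng(42)
t = np.arange(600)
close = pd.Series(1.20 + 0.002 * np.sin(t / 60) + 0.0002 * rng.standard_normal(600))

# Step 4: 12- and 26-period EMAs; pandas' ewm() implements the same
# recursion as the spreadsheet formula above (alpha = 2 / (span + 1)).
ema12 = close.ewm(span=12, adjust=False).mean()
ema26 = close.ewm(span=26, adjust=False).mean()

# Step 5: numerical derivatives of the smoothed series via first differences
# (the time step is one minute, so no rescaling is needed).
d1 = ema12.diff()   # approximate first derivative (rate of change)
d2 = d1.diff()      # approximate second derivative (curvature)
d3 = d2.diff()      # approximate third derivative

# Inflection candidates: the second derivative changes sign while the third
# derivative is clearly nonzero (the median cut-off here is illustrative).
crosses = np.sign(d2).diff().abs() == 2
inflections = close.index[crosses & (d3.abs() > d3.abs().median())]
print(f"{len(inflections)} candidate inflection points")
```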
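Steps 6 and 8 can be sketched the same way: build a local Taylor polynomial from finite-difference derivatives at the last observation and evaluate it 1–5 minutes ahead. The snippet truncates at degree 3 to stay short (the study went to degree 6, which only requires differencing further); it is a sketch under these assumptions, not the exact spreadsheet implementation.

```python
import numpy as np
import pandas as pd
from math import factorial

# Same synthetic stand-in for the 1-minute closing-price series as above.
rng = np.random.default_rng(42)
t = np.arange(600)
close = pd.Series(1.20 + 0.002 * np.sin(t / 60) + 0.0002 * rng.standard_normal(600))
ema = close.ewm(span=12, adjust=False).mean()

# Successive differences of the smoothed series approximate successive
# derivatives (unit time step of one minute, so no rescaling is needed).
derivs = [ema]
for _ in range(3):
    derivs.append(derivs[-1].diff())
at_end = [d.iloc[-1] for d in derivs]  # [P(0), P'(0), P''(0), P'''(0)]

def taylor_forecast(coeffs, x):
    """Evaluate the truncated Taylor polynomial sum_k coeffs[k] * x**k / k!."""
    return sum(c * x**k / factorial(k) for k, c in enumerate(coeffs))

# Step 8: 5-step-ahead projection at one-minute increments.
print([taylor_forecast(at_end, x) for x in range(1, 6)])
```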
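For the ARIMA baseline of step 8, a hedged sketch using statsmodels (the study's exact estimation window and settings may differ):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Same synthetic stand-in; the study estimated the model on the real series.
rng = np.random.default_rng(42)
t = np.arange(600)
close = pd.Series(1.20 + 0.002 * np.sin(t / 60) + 0.0002 * rng.standard_normal(600))

# ARIMA(1,0,1) fit and 5-step forecast, to compare with the Taylor projection.
fit = ARIMA(close, order=(1, 0, 1)).fit()
print(fit.forecast(steps=5))
```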

All computations were performed using a combination of Python (for numerical differentiation and ARIMA modeling) and spreadsheet formulas (for moving averages and Taylor sums). The choice of a 6th-degree polynomial was motivated by preliminary tests: we observed that derivatives up to 6th order became non-negligible, and increasing beyond 6 did not substantially improve fit (likely due to overfitting and amplified noise). This agrees with general guidance that higher-degree polynomials can overfit (increasing residual variance) if not warranted.
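As a rough, hypothetical check of that degree choice, one can fit least-squares polynomial trends of increasing degree (a stand-in for the Taylor trends of step 6) to a price window and watch the in-sample residual spread:

```python
import numpy as np

# Synthetic price window; on a series like this the residual spread typically
# stops improving near the noise level at moderate degrees, while higher
# degrees mostly chase noise -- the overfitting risk noted above.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
price = 1.20 + 0.002 * np.sin(6 * x) + 0.0002 * rng.standard_normal(500)

for deg in range(2, 9):
    resid = price - np.polyval(np.polyfit(x, price, deg), x)
    print(f"degree {deg}: residual std = {resid.std():.6f}")
```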

Results and Discussion

Our analysis yielded the following main findings:

  • Polynomial Trend Fits: We plotted the fitted polynomial curves (degrees 2 through 6) against the actual price series. Lower-degree polynomials (e.g. quadratic, cubic) captured only the broadest trend and quickly diverged from the data when the market moved sharply. Higher-degree polynomials (up to degree 6) tracked the price more closely, flexibly bending through local inflection points. In fact, the 6th-degree polynomial closely followed the overall price path in the sample. However, we noted the classic bias–variance trade-off: as the polynomial degree increased, the fit improved on the training data but became more sensitive to small random fluctuations (noise). This is consistent with econometric theory: too high a degree can incorporate noise into the model, raising error. Empirically, we found degree 6 to balance this trade-off: it adapted to the main oscillatory movements without excessive overfitting.

  • Inflection Point Detection: Using the third derivative, we located points where the second derivative changed sign. These inflection points corresponded to visible trend reversals in the chart. For example, a significant upward trend was observed in late January 2018, and our model flagged an inflection around 01/30/2018 when the trend leveled off and reversed downward. Formally, a positive-to-negative change in $P''(t)$ predicted the end of an uptrend. This practical result demonstrates the utility of Taylor analysis: by monitoring derivatives, we get advance warning of trend changes. This is especially valuable in markets, since “catching the trend” is critical for traders. We note that identifying the precise moment is challenging in noisy data, but the Taylor approach provides a systematic criterion.

  • Forecast Performance: We applied the 6th-degree Taylor-based model to forecast 5 steps ahead at the end of our sample. The predicted trajectory (a small bend in the direction of the recent trend) was compared to the realized prices over those 5 minutes. In our case, the forecast matched the actual movement reasonably well: the direction (up or down) was correctly predicted, and the magnitude error was within a few pips. In other words, the Taylor model’s 5-minute forecast aligned with what actually occurred. This success, while modest in horizon, suggests that the method has predictive value in the very short term. By contrast, the baseline ARIMA model also forecasted upward movement, but with slightly larger errors. The success of Taylor forecasting in this instance can be attributed to the fact that the EUR/USD rate moved smoothly; the high-frequency noise was relatively low.

    It is worth emphasizing that stock (or FX) price series are notoriously noisy. As Okpor (2020) notes, the signal-to-noise ratio is often small. Even so, our method essentially acts as a short-term local model: it assumes that over a few minutes, the underlying trend can be approximated by a polynomial. This is akin to a high-order Taylor approximation for a non-stationary process. In practice, we recommend using such forecasts as one input among many (e.g., combined with machine-learning indicators). Indeed, modern forecasting often merges approaches: for example, reinforcement-learning models have been proposed that combine Taylor forecasts with adaptive weights. While we did not implement RL, our results are conceptually complementary.

  • Technical Comparison and Limitations: We compared our approach with some standard methods. A moving-average crossover strategy (using 12-min and 26-min EMAs) also captured major trend changes, but it was slower to respond than our derivative-based inflection signals. A logistic regression on lagged prices (mentioned in Sonkavde et al. (2022)) was unsuitable for such short-term data. On the other hand, state-of-the-art neural models (LSTM networks) are designed for longer horizons and require vast data. We view the Taylor method as micro-level modeling: it can be used on top of any larger model. For example, one could feed the polynomial forecasts as features into a random forest or neural predictor.

    A key limitation is that our method is local. The Taylor expansion is theoretically only accurate near the point of expansion. Moving even moderately far in time (beyond a few minutes) will generally invalidate the approximation unless the function is very smooth. We circumvented this by only forecasting 5 minutes ahead. For longer horizons, one would need to re-center the Taylor expansion more frequently or fit a new trend. Another limitation is data quality: our calculations assumed no missing minutes or price errors. In real-world trading, gaps and outliers can severely distort polynomial fits. These issues suggest that robust preprocessing (e.g. outlier filtering) is essential in practice.

In terms of the broader quantitative literature, our findings reinforce the idea that hybrid approaches often work best. Advanced ML or econometric models (e.g. GARCH, LSTM, hybrid ARIMA+NN) capture complex patterns, but they may overlook simple mathematical structures. The Taylor-polynomial approach explicitly encodes local smoothness and curvature, which can be a powerful bias when it actually holds. As Bao et al. (2025) observe, combining data-driven methods with analytic insights is a trending paradigm. In future work, it would be interesting to integrate our polynomial approximations into ensemble models (e.g. feed them into a random forest) and evaluate them against pure ML baselines (as surveyed by Sonkavde et al. (2022)).

Conclusions

The application of calculus and signal-processing concepts can enrich financial time-series analysis. In particular, representing price changes via Taylor polynomials offers a principled way to quantify local trend and curvature. This study demonstrated that by computing first to third derivatives of the EUR/USD exchange rate and fitting 6th-degree Taylor curves, one can identify likely trend inflections and generate short-term forecasts.

However, financial series are highly stochastic. As noted in the digital signal processing literature, the useful signal in market data often competes with noise. Hence even sophisticated filters have limited accuracy. Our results confirm that market prices do exhibit cycle-like wave patterns (alternating rises and falls), but their amplitudes and periods are not constant. This makes exact cycle identification difficult. We found that looking at amplitude variation can help distinguish dominant waves; in practice, this means comparing multiple polynomial fits. Overall, the existence of cycles is undeniable (as any price chart shows), but the practical challenge is to determine which cycle or trend will dominate next.

Despite these challenges, Taylor-based modeling holds promise as part of the toolkit. It can complement modern methods: for instance, a neural network could be trained to recognize when a Taylor inflection signal is likely to be reliable. Our 5-step forecasts, while short-term, were successful enough to warrant further work. In an era where machine learning leads financial forecasting, the fact that markets evolve continuously over time suggests that domain knowledge (like calculus-based trends) should not be neglected. In conclusion, digital signal processing and Taylor analysis together offer an exciting, relatively unexplored avenue for financial modeling – one that merits deeper investigation as part of integrated predictive systems.

References

  1. Okpor, M. D. (2020). Digital Signal Processing for Predicting Stock Prices. International Journal of Computer Applications, 175(26), 15–19. DOI:10.5120/ijca2020920762.

  2. Sonkavde, G., Dharrao, D. S., Bongale, A. M., & Deokate, S. T. (2022). Forecasting Stock Market Prices Using Machine Learning and Deep Learning Models: A Systematic Review, Performance Analysis and Discussion of Implications. Journal of Risk and Financial Management, 15(3), 94.

  3. Bao, W., Cao, Y., Yang, Y., Che, H., Huang, J.-J., & Wen, S. (2025). Data-driven stock forecasting models based on neural networks: A review. Information Fusion, 113, 102616.

  4. Pourkhanali, A., Tafakori, L., & Bee, M. (2023). Forecasting Value-at-Risk Using Functional Volatility Incorporating an Exogenous Effect. SSRN Working Paper. Retrieved from https://ssrn.com/abstract=4360083.
