Technical Documentation · v1.0

Global Market Tracker Pro

A complete methodology guide covering data acquisition, logarithmic return computation, missing-data treatment, and Monte Carlo Value at Risk analysis.

Author: Susovon Jana, Ph.D.
Data Source: Yahoo Finance
Methodology: Dlog Returns
VaR Engine: Monte Carlo
§ 01 — Overview

What This System Does

An end-to-end pipeline from raw price data to actionable risk figures — running entirely on the edge.

The Global Market Tracker is a real-time financial analytics platform that fetches daily closing prices for major global equity indices, computes logarithmic returns, handles gaps in the data, and provides a Monte Carlo simulation-based Value at Risk (VaR) engine — all served through a serverless Cloudflare Worker architecture.

[Architecture diagram]
Yahoo Finance (price data) → Cloudflare Worker (edge compute) → KV Store (cache layer) → Dashboard (prices & returns) / VaR Engine (Monte Carlo)

The platform is designed around three analytical modules: Market Trends (visualise prices and returns across any timeframe), Risk / VaR (quantify tail risk for a custom portfolio exposure), and a Data Export facility that lets researchers download raw CSVs directly.

Live Architecture

All data is served from a Cloudflare Worker endpoint at workers.dev. The Worker fetches from Yahoo Finance on a scheduled cron and caches results in Cloudflare KV — meaning the dashboard always loads in milliseconds with no cold-start penalty.
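The scheduled-plus-fetch pattern described above might look like the following sketch (module Worker syntax). The KV binding name `MARKET_KV` and the helpers `buildPriceTable` / `computeLogReturns` are illustrative, not the actual implementation:

```javascript
// Hypothetical sketch of the Worker. MARKET_KV, buildPriceTable and
// computeLogReturns are illustrative names, not the real code.
const worker = {
  // Cron Trigger entry point: refresh the KV cache on a schedule.
  async scheduled(event, env, ctx) {
    const prices = await buildPriceTable();      // fetch & pivot Yahoo data
    const returns = computeLogReturns(prices);   // dlog returns, forward-filled
    await env.MARKET_KV.put("prices", JSON.stringify(prices));
    await env.MARKET_KV.put("returns", JSON.stringify(returns));
  },

  // HTTP entry point: serve cached JSON straight from KV.
  async fetch(request, env) {
    const url = new URL(request.url);
    if (url.pathname === "/api/prices" || url.pathname === "/api/returns") {
      const key = url.pathname.split("/").pop(); // "prices" or "returns"
      const body = await env.MARKET_KV.get(key);
      return new Response(body ?? "[]", {
        headers: { "Content-Type": "application/json" },
      });
    }
    return new Response("Not found", { status: 404 });
  },
};
// In a real module Worker this object would be the default export.
```

Because `fetch` only ever reads from KV, request latency is independent of Yahoo Finance availability.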

§ 02 — Data Collection

Fetching from Yahoo Finance

How raw market data is acquired, normalised, and stored.

Yahoo Finance is used as the primary data provider due to its comprehensive coverage of global equity indices, zero-cost API access, and daily-updated closing prices. Data is fetched programmatically using the yfinance library inside a Cloudflare Worker's scheduled handler.

01. Ticker Resolution
Each market is identified by its Yahoo Finance ticker symbol — for example ^NSEI for the NIFTY 50, ^GSPC for the S&P 500, ^DJI for the Dow Jones, and so on. The worker maintains a curated dictionary mapping human-readable market names to their respective tickers.
02. Scheduled Cron Trigger
Cloudflare Cron Triggers run the data pipeline automatically at a configured interval (typically once per day after market close). This ensures the KV store always contains the most recent available data without requiring any manual intervention.
03. Historical Download via yfinance
The Worker requests historical daily closing prices (the Adj Close column) for all tracked tickers over the full available history. Yahoo Finance returns data in chronological order indexed by trading date.
04. Date Alignment & Pivoting
Since different markets trade on different holiday calendars, the raw data is pivoted into a unified date-indexed table. Every row represents a single calendar date; every column represents one index. Dates where a market was closed naturally produce a null/missing value — these are handled in the returns calculation step described in §04.
05. KV Storage & API Exposure
The processed price array (as JSON) is stored in Cloudflare KV under two keys: one for raw prices and one for pre-computed returns. The Worker exposes these via two REST endpoints — /api/prices and /api/returns — which the dashboard fetches on load.
Endpoint | Response Shape | Purpose
/api/prices | Array of {Date, Market₁, Market₂, …} | Daily closing price for each tracked index
/api/returns | Array of {Date, Market₁, Market₂, …} | Pre-computed log-returns (%) with forward-fill applied
/api/admin/refresh | {status, message} | Force pipeline run (admin-authenticated POST)
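On the client side, loading these two endpoints might look like the sketch below; `loadMarketData` is an illustrative name and the base URL is a placeholder for the actual workers.dev endpoint:

```javascript
// Sketch of the dashboard's initial data load. loadMarketData is an
// illustrative name; the real app.js may differ.
async function loadMarketData(baseUrl) {
  // Both endpoints are cached in KV, so these resolve in milliseconds.
  const [prices, returns] = await Promise.all([
    fetch(`${baseUrl}/api/prices`).then(r => r.json()),
    fetch(`${baseUrl}/api/returns`).then(r => r.json()),
  ]);
  return { prices, returns }; // arrays of {Date, Market₁, Market₂, …}
}
```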
Data Freshness Indicator

The dashboard calculates the difference between today's date and the last available record date. If data is 0–4 days old it shows a green "live" badge; anything older triggers an amber "stale" warning, alerting the user that the cron job may have missed a run.
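The check above fits in a few lines. `freshnessBadge` is a hypothetical name; the 0–4 day threshold is taken from the text:

```javascript
// Sketch of the data-freshness check. freshnessBadge is an illustrative
// name; thresholds (0–4 days = "live") follow the documentation.
function freshnessBadge(lastDateISO, today = new Date()) {
  const last = new Date(lastDateISO);
  const ageDays = Math.floor((today - last) / (1000 * 60 * 60 * 24));
  return ageDays <= 4 ? "live" : "stale"; // green badge vs amber warning
}
```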

§ 03 — Return Computation

Logarithmic (Dlog) Returns

Why log-returns and how they are computed in percentage terms.

Simple arithmetic returns measure the proportional change in price, but they are not time-additive — you cannot sum daily returns to obtain the multi-period return. Logarithmic returns (also called continuously-compounded or dlog returns) solve this: they are additive across time, approximately normally distributed, and symmetric with respect to gains and losses.

Core Formula — Logarithmic Return
r_t = ln(P_t / P_{t−1}) × 100
r_t = daily log-return on day t, expressed in percent
P_t = adjusted closing price on day t
P_{t−1} = adjusted closing price on the previous available trading day
ln(·) = natural logarithm

The multiplication by 100 converts the unitless log-ratio into a percentage figure — the value stored in /api/returns and displayed on the dashboard. When the VaR engine reads these returns it divides by 100 to convert back to decimal form before computing statistics.

Why Not Simple Returns?

Simple returns r = (P_t − P_{t−1}) / P_{t−1} are asymmetric: a 50% gain followed by a 50% loss does not return to the origin. Log-returns have no such asymmetry, making them the standard choice in quantitative finance for volatility modelling and VaR estimation.
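The time-additivity property can be made concrete with a small helper; `dlogReturns` is an illustrative name, implementing the formula above:

```javascript
// Minimal sketch: daily log-returns in percent, per the core formula
// r_t = ln(P_t / P_{t-1}) × 100. dlogReturns is an illustrative name.
function dlogReturns(prices) {
  const out = [];
  for (let t = 1; t < prices.length; t++) {
    out.push(Math.log(prices[t] / prices[t - 1]) * 100);
  }
  return out;
}
// Additivity: summing the daily log-returns recovers the full-period
// log-return ln(P_n / P_0) × 100, which simple returns cannot do.
```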

🔢 Live Return Calculator (interactive widget, dlog_return.js)
Log Return: ln(P_t / P_{t−1}) × 100
Simple Return: (P_t − P_{t−1}) / P_{t−1} × 100
// app.js — how the dashboard reads pre-computed log-returns from the API
const filteredReturns = globalData.returns.filter(r =>
  r.Date >= startDate && r.Date <= endDate
);

// VaR engine: converts percent values back to decimal
let vals = returnsData.map(r => r[market]);
vals = vals.filter(v => typeof v === 'number' && !isNaN(v))
           .map(v => v / 100); // percent → decimal
§ 04 — Missing Data Handling

Forward-Fill in Return Calculation

Markets close on different holidays. Here is how gaps are bridged without distorting the return series.

Because global equity markets observe different national holidays, it is common for one index to have a trading day where another does not. When the unified date-indexed table is constructed (§02), these gaps appear as null or NaN values in the price columns.

The rule applied in this system is: when a price is missing on day t, the most recently available price is carried forward (last-observation-carried-forward, or LOCF). The consequence for the return series is that the computed log-return for that gap day is exactly 0% — the market is treated as if it did not move on its closed day, which is economically the most neutral assumption.

Missing Data Rule
If P_t is NaN → P_t := P_{t−1}
∴ r_t = ln(P_{t−1} / P_{t−1}) × 100 = 0%
The missing price is replaced by the previous available price before the log-return is computed.
The resulting return is identically zero — no phantom gain or loss is introduced.
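The fill step on a single price column can be sketched as follows; `forwardFill` is an illustrative name, and the real pipeline applies the same rule column-by-column across the pivoted table:

```javascript
// Sketch of last-observation-carried-forward (LOCF) on one price column.
// forwardFill is an illustrative name. After filling, a gap day's
// log-return is ln(P/P) = 0%, exactly as the rule above states.
function forwardFill(prices) {
  const filled = [...prices];
  for (let t = 1; t < filled.length; t++) {
    // Catches both null/undefined and NaN entries.
    if (filled[t] == null || Number.isNaN(filled[t])) {
      filled[t] = filled[t - 1];
    }
  }
  return filled;
}
```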

[Diagram: gold bars indicate days where the forward-fill was applied (market closed → return = 0%); teal bars are real trading days.]
Why not interpolation or deletion?

Linear interpolation would invent prices that never existed, distorting volatility. Deleting rows would misalign cross-market comparisons. LOCF is a widely used approach for this problem: it preserves the time-series length, keeps all markets aligned on the same date spine, and introduces a return of exactly 0% — the correct value for a day on which no trading occurred.

In the VaR calculation, after the forward-filled return series is loaded, an additional filter removes any remaining NaN or non-numeric entries before statistics are computed:

vals = vals.filter(v => typeof v === 'number' && !isNaN(v));
// Ensures statistical functions (mean, std-dev) only see valid numbers
§ 05 — Value at Risk

Monte Carlo VaR Engine

From a clean return series to a probabilistic loss estimate — in three mathematical steps.

Value at Risk (VaR) answers a single question: "What is the maximum amount I could lose over a given period, with a specified level of confidence?" The dashboard implements a parametric Monte Carlo approach — estimating drift and volatility from historical data, then simulating thousands of future paths to build an empirical return distribution.

— — —

Step 1 — Estimate Drift & Volatility

From the filtered historical return series, two statistics are computed:

Drift (μ) & Volatility (σ)
μ = mean(r₁, r₂, …, rₙ) × h
σ = stddev(r₁, …, rₙ) × √h
h = time horizon in trading days (1 day, 5 days, 21 days, …)
n = number of historical observations (controlled by lookback window)
The square-root-of-time rule scales daily volatility to the chosen horizon under the assumption of i.i.d. returns.
const getMean = arr => arr.reduce((a, b) => a + b, 0) / arr.length;
const getStdDev = (arr, mu) =>
  Math.sqrt(arr.reduce((a, b) => a + (b - mu) ** 2, 0) / (arr.length - 1));

const mu = getMean(vals) * horizon;
const sigma = getStdDev(vals, getMean(vals)) * Math.sqrt(horizon);
— — —

Step 2 — Simulate Random Paths

Using the estimated μ and σ, the engine draws N random returns from a normal distribution using the Box-Muller transform — a standard method for generating Gaussian random numbers in JavaScript, where Math.random() provides only uniform draws:

Box-Muller Transform
Z = √(−2 ln U₁) · cos(2π U₂) · σ + μ
U₁, U₂ = independent uniform random draws on (0,1)
Z = a single simulated horizon return
Repeated N times (default 10,000; up to 250,000) to build the distribution.
function randomNormal(mu, sigma) {
  let u = 0, v = 0;
  while (u === 0) u = Math.random();
  while (v === 0) v = Math.random();
  return (Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v)) * sigma + mu;
}

const simReturns = Array.from({ length: sims }, () => randomNormal(mu, sigma))
  .sort((a, b) => a - b); // sort ascending for quantile lookup
— — —

Step 3 — Extract the VaR Quantile

The sorted simulation array is indexed at the (1 − confidence level) percentile. The return at that index is the VaR threshold — the worst expected outcome at the chosen confidence:

VaR Quantile & Rupee Amount
VaR% = simReturns[⌊N × (1 − c)⌋]
VaR₹ = Portfolio × |VaR%|
c = confidence level (0.90, 0.95, or 0.99)
N = number of simulated paths
|·| = absolute value — VaR is always expressed as a positive loss figure
The colour of the result turns amber above 5% loss and red above 10%.
const varPct = simReturns[Math.floor(sims * (1 - confidence / 100))];
const varAmount = amount * Math.abs(varPct);

// Risk colour coding
const risk = Math.abs(varPct);
const rColor = risk < 0.05 ? '#06d6a0'   // green — low risk
             : risk < 0.10 ? '#f59e0b'   // amber — moderate
             : '#ef4444';                // red — high risk
🎲 Interactive VaR Simulator (interactive widget)
Adjust parameters and re-run the Monte Carlo — same engine as the live dashboard. Outputs: VaR Amount (maximum expected loss), VaR % (as % of portfolio), and Paths Run (simulated scenarios).
§ 06 — System Architecture

How It All Connects

The full stack from data source to user interface.

Data Layer
Yahoo Finance via yfinance. Adjusted closing prices for global equity indices. Fetched on a cron schedule inside the Cloudflare Worker.
Compute Layer
Cloudflare Worker (Edge). Computes log-returns, applies forward-fill, stores processed JSON in KV. Admin-authenticated force-refresh endpoint.
Cache Layer
Cloudflare KV (key-value store). Holds two JSON blobs: prices and returns. Dashboard reads exclusively from KV — no direct Yahoo Finance calls at runtime.
Presentation Layer
Static HTML + Chart.js. Single-page application. Market Trends view (dual-axis price + return chart) and VaR view (simulation + histogram). Hosted on Cloudflare Pages.
VaR Engine
Runs entirely client-side in JavaScript. Reads pre-computed returns from KV, applies Box-Muller transform, sorts simulation results, and extracts the tail-loss quantile.
Export Facility
CSV generation in-browser. Users select asset(s), date range, and data type (prices or log-returns). A Blob URL is generated and auto-downloaded — no server involved.
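The export path can be sketched as below. `toCsv` and `downloadCsv` are illustrative names, and column selection / date filtering are omitted for brevity:

```javascript
// Sketch of the in-browser CSV export. toCsv and downloadCsv are
// illustrative names, not the actual dashboard functions.
function toCsv(rows, columns) {
  const header = columns.join(",");
  const body = rows
    .map(r => columns.map(c => r[c] ?? "").join(","))
    .join("\n");
  return `${header}\n${body}`;
}

// Browser-only: wraps the CSV in a Blob URL and triggers a download —
// no server involved at any point.
function downloadCsv(rows, columns, filename = "export.csv") {
  const blob = new Blob([toCsv(rows, columns)], { type: "text/csv" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url); // release the Blob URL after the download starts
}
```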
Fully Serverless

Because the compute layer is a Cloudflare Worker and the frontend is hosted on Cloudflare Pages, there is no traditional server to maintain. Cold-start times are sub-millisecond, global edge nodes serve the dashboard from the nearest PoP, and the VaR Monte Carlo runs in the user's own browser — zero backend cost at request time.

Parameter | Default | Range
Lookback Window | 1 Year (252 days) | 1 Month → All History
Confidence Level | 95% | 90%, 95%, 99%
Time Horizon | 1 Day | 1 Day → 1 Year
Simulation Paths | 10,000 | 1,000 → 250,000
Min. Data Points for VaR | 30 | Hard minimum enforced
Histogram Bins | 60 | Fixed