Overview
This guide shows a practical way to detect head-and-shoulders (H&S) chart patterns in Python using only NumPy and pandas. We:
- Extract pivot highs/lows from price series.
- Scan pivots for the H&S sequence (H L H L H) with constraints.
- Validate shoulder similarity, head prominence, neckline consistency, and symmetry.
Works on daily or intraday data. Use highs for peaks and lows for troughs when available; close-only works but is less accurate.
Minimal working example (copy-paste runnable)
import numpy as np
import pandas as pd
# ----------------------
# Pivot detection helpers
# ----------------------
def find_pivots(price: pd.Series, left: int = 3, right: int = 3):
    """Return boolean arrays for pivot highs and lows using strict window maxima/minima.
    A pivot high at i is the maximum in [i-left, i+right]; similarly for a pivot low.
    """
    arr = price.values.astype(float)
    n = len(arr)
    high = np.zeros(n, dtype=bool)
    low = np.zeros(n, dtype=bool)
    for i in range(left, n - right):
        w = arr[i - left : i + right + 1]
        c = arr[i]
        # Strict position match to avoid ties across the window
        if c == w.max() and np.argmax(w) == left:
            high[i] = True
        if c == w.min() and np.argmin(w) == left:
            low[i] = True
    return high, low
# ----------------------
# H&S detection
# ----------------------
def detect_head_and_shoulders(
    price: pd.Series,
    left: int = 3,
    right: int = 3,
    shoulder_tol: float = 0.05,     # shoulders within 5%
    neckline_tol: float = 0.03,     # neckline lows within 3%
    head_min_height: float = 0.02,  # head ≥ 2% above shoulders
    symmetry_tol: float = 0.5,      # left span vs right span ratio bounds
    min_sep: int = 3,               # min bars between successive pivots
    check_break: bool = True,       # confirm break below neckline
    break_lookahead: int = 30       # bars to look for a break
):
    arr = price.values.astype(float)
    n = len(arr)
    hi, lo = find_pivots(price, left, right)

    # Build ordered pivot list
    pivots = []
    for i in range(n):
        if hi[i]:
            pivots.append((i, 'H', arr[i]))
        if lo[i]:
            pivots.append((i, 'L', arr[i]))
    pivots.sort(key=lambda x: x[0])

    patterns = []
    for k in range(len(pivots) - 4):
        p1, p2, p3, p4, p5 = pivots[k:k+5]
        types = [p1[1], p2[1], p3[1], p4[1], p5[1]]
        if types != ['H', 'L', 'H', 'L', 'H']:
            continue
        i1, _, shL = p1
        i2, _, nL1 = p2
        i3, _, head = p3
        i4, _, nL2 = p4
        i5, _, shR = p5
        # separation
        if not (i1 + min_sep <= i2 <= i3 - min_sep and i3 + min_sep <= i4 <= i5 - min_sep):
            continue
        # head prominence
        if not (head > shL * (1 + head_min_height) and head > shR * (1 + head_min_height)):
            continue
        # shoulder similarity
        shoulder_avg = (shL + shR) / 2.0
        if shoulder_avg == 0:
            continue
        if abs(shL - shR) / shoulder_avg > shoulder_tol:
            continue
        # neckline consistency (flat-ish)
        neck_avg = (nL1 + nL2) / 2.0
        if neck_avg == 0:
            continue
        if abs(nL1 - nL2) / neck_avg > neckline_tol:
            continue
        # time symmetry
        left_span = i3 - i1
        right_span = i5 - i3
        if left_span < min_sep or right_span < min_sep:
            continue
        ratio = left_span / max(right_span, 1)
        if not (symmetry_tol <= ratio <= 1 / max(symmetry_tol, 1e-9)):
            continue
        # Optional: neckline break after the right shoulder
        break_idx = None
        if check_break:
            m = (nL2 - nL1) / max((i4 - i2), 1)  # slope per bar
            b = nL1 - m * i2
            for j in range(i5 + 1, min(n, i5 + 1 + break_lookahead)):
                neckline_j = m * j + b
                if arr[j] < neckline_j:
                    break_idx = j
                    break
            if break_idx is None:
                continue
        patterns.append({
            'left_shoulder_idx': i1,
            'head_idx': i3,
            'right_shoulder_idx': i5,
            'neckline_low_idxs': (i2, i4),
            'break_idx': break_idx,
            'metrics': {
                'shoulder_diff_pct': abs(shL - shR) / shoulder_avg,
                'neckline_diff_pct': abs(nL1 - nL2) / neck_avg,
                'head_over_shoulder_pct': (head / shoulder_avg) - 1,
                'symmetry_ratio': ratio,
            }
        })
    return patterns
# ----------------------
# Demo: synthetic series containing an H&S
# ----------------------
np.random.seed(7)
base = np.full(120, 100.0) + np.random.normal(0, 0.2, 120)  # flat base near 100 with small noise
# craft H&S segment
seg = np.array([
    98, 99, 101, 100,         # rise to the left shoulder (101)
    99,                       # neckline low 1
    101, 103, 105, 103, 101,  # head (105)
    100,                      # neckline low 2
    102, 101, 100,            # right shoulder (102), lower than the head
    97, 96, 95                # break below the neckline
], dtype=float)
series = pd.Series(np.r_[base[:30], seg, base[30:]], name='Close').reset_index(drop=True)
patterns = detect_head_and_shoulders(series, left=2, right=2,
                                     shoulder_tol=0.06, neckline_tol=0.04,
                                     head_min_height=0.02, symmetry_tol=0.4,
                                     min_sep=1, check_break=True)
print(f"Found {len(patterns)} pattern(s)")
for p in patterns:
    print(p)
# ----------------------
# Example with your own data (CSV with a Close column)
# ----------------------
# df = pd.read_csv('prices.csv', parse_dates=['Date']).set_index('Date')
# patterns = detect_head_and_shoulders(df['Close'])
# print(f"Found {len(patterns)} pattern(s)")
Quickstart
- Install dependencies:
  - pip install pandas numpy
- Prepare data:
  - Use a pandas Series of prices. Prefer high/low data when available; otherwise use Close.
- Detect patterns:
  - Call detect_head_and_shoulders(series). It returns a list of matches with indices and metrics; see the snippet after this list.
- Act on signals (optional):
  - Confirm with additional filters (trend, volume) before trading.
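For example, each match is a plain dict; assuming patterns holds the result of a call like the one in the demo above, its fields can be read like this:

# Inspect detected patterns (keys match what detect_head_and_shoulders returns)
for p in patterns:
    m = p['metrics']
    print(f"head at bar {p['head_idx']}, "
          f"shoulders at bars {p['left_shoulder_idx']} and {p['right_shoulder_idx']}, "
          f"neckline lows at {p['neckline_low_idxs']}, "
          f"break at {p['break_idx']}, "
          f"shoulder diff {m['shoulder_diff_pct']:.2%}")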
Algorithm steps
- Find pivots:
  - A pivot high/low is a local extremum in a centered window [i-left, i+right]. Larger windows reduce noise but can miss narrow patterns; see the short demo after this list.
- Build the ordered H/L sequence:
  - Merge pivot highs and lows into a single chronological list.
- Scan 5-pivot windows:
  - Look for the sequence H L H L H. The middle H is the head; the outer H’s are the shoulders.
- Validate geometry:
  - Head higher than both shoulders by at least head_min_height.
  - Shoulders roughly equal in height, within shoulder_tol.
  - Neckline lows within neckline_tol of each other (roughly horizontal). The example checks a flat neckline; allow a slope by relaxing the tolerance or testing the slope explicitly.
  - Time symmetry: the left and right spans around the head stay within the symmetry bounds.
- Optional confirmation:
  - Require a close below the neckline after the right shoulder within a lookahead window.
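To make the pivot step concrete, here is a tiny, hand-checkable run of find_pivots on made-up numbers (the indices in the comments are worked out by hand):

# Tiny example of the pivot step on made-up data
demo = pd.Series([1, 2, 3, 2, 1, 2, 3, 4, 3, 2], dtype=float)
hi, lo = find_pivots(demo, left=2, right=2)
print(np.where(hi)[0])  # expected pivot highs at indices 2 and 7
print(np.where(lo)[0])  # expected pivot low at index 4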
Choosing inputs
- If you have OHLC data, use High for peaks and Low for the neckline: build pivot highs from df['High'] and pivot lows from df['Low'] separately (a sketch follows below).
- For close-only data, the method still works but is noisier.
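A minimal sketch of that split, assuming an OHLC DataFrame df with 'High' and 'Low' columns. Note that detect_head_and_shoulders as written builds its pivots from a single series, so using this helper means adapting the detector to accept precomputed pivot arrays:

def find_pivots_hl(df: pd.DataFrame, left: int = 3, right: int = 3):
    # Hypothetical helper: pivot highs from the High column, pivot lows from the Low column.
    # Assumes columns named 'High' and 'Low'.
    high, _ = find_pivots(df['High'], left, right)
    _, low = find_pivots(df['Low'], left, right)
    return high, low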
Tuning guidance
- left/right window: 2–5 for intraday, 3–7 for daily.
- shoulder_tol: 3–8% depending on volatility.
- neckline_tol: 2–5% or permit a small slope.
- head_min_height: 2–10% depending on timeframe.
- break_lookahead: 10–50 bars.
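As a starting point, two calls using values drawn from the ranges above. These are assumptions to adjust against your own data, not recommended settings; close_5m and close_daily stand for your own intraday and daily close Series:

# Example starting points drawn from the ranges above; tune on your own data.
intraday = detect_head_and_shoulders(close_5m, left=3, right=3,
                                     shoulder_tol=0.04, neckline_tol=0.03,
                                     head_min_height=0.03, break_lookahead=20)
daily = detect_head_and_shoulders(close_daily, left=5, right=5,
                                  shoulder_tol=0.06, neckline_tol=0.04,
                                  head_min_height=0.05, break_lookahead=40)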
Pitfalls
- False positives in choppy markets; widen pivot windows or add trend filters.
- Equal prices within a window cause tie ambiguity; the strict position check in find_pivots mitigates this.
- Ignoring volatility: percent tolerances should scale with ATR/volatility (see the sketch after this list).
- Data frequency mismatch: daily vs intraday patterns differ in width; retune the windows.
- Skipping the neckline-break confirmation can admit incomplete patterns.
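One way to act on the volatility point, sketched for close-only data: derive the percent tolerances from recent return volatility instead of hard-coding them. The 14-bar window and the 1% reference volatility are assumptions to tune; with OHLC data a true ATR would be the better input.

def volatility_scaled_tolerances(close: pd.Series, window: int = 14,
                                 base_shoulder_tol: float = 0.05,
                                 base_neckline_tol: float = 0.03,
                                 ref_vol: float = 0.01):
    # Hypothetical helper: scale percent tolerances by recent return volatility
    # (a close-only stand-in for ATR). ref_vol is the volatility regime at which
    # the base tolerances apply unchanged.
    vol = float(close.pct_change().rolling(window).std().iloc[-1])
    scale = vol / ref_vol
    return base_shoulder_tol * scale, base_neckline_tol * scale

sh_tol, nk_tol = volatility_scaled_tolerances(series)
vol_patterns = detect_head_and_shoulders(series, shoulder_tol=sh_tol, neckline_tol=nk_tol)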
Performance notes
- Complexity: roughly O(n · (left + right)) for the pivot loop (effectively linear for small windows) and O(P) 5-pivot windows for the scan, plus up to break_lookahead bars per candidate for confirmation.
- Speed-ups for large datasets:
  - Replace the Python loop in find_pivots with a vectorized, centered rolling max/min (see the sketch after this list).
  - JIT-compile find_pivots and the scan loop with numba.
  - Use scipy.signal.argrelextrema for robust extrema detection if SciPy is available.
  - Downsample for coarse detection, then refine around the candidates.
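A sketch of the rolling max/min idea, assuming a symmetric window (left == right). Ties are handled more loosely than in the loop version, which checks the exact position of the extremum:

def find_pivots_fast(price: pd.Series, left: int = 3, right: int = 3):
    # Vectorized pivot marking with a centered rolling window (sketch).
    # Any bar equal to its window max/min is marked, so ties are treated more
    # loosely than in find_pivots above.
    assert left == right, "centered rolling window assumes a symmetric span"
    w = left + right + 1
    roll_max = price.rolling(w, center=True).max()
    roll_min = price.rolling(w, center=True).min()
    return (price == roll_max).to_numpy(), (price == roll_min).to_numpy()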
Extending to inverse H&S
- Mirror the rules: L H L H L with the middle L as the head (the lowest point). In code, run the detection on -price (or a reflected copy of the series) or flip the comparisons; note that the percentage tolerances are then evaluated on the transformed values.
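A sketch of the flipped-series approach. Reflecting around a reference above the price range keeps the values positive so the percentage checks remain meaningful, but the tolerances then act on the reflected prices, so retune them (or rewrite the comparisons inside the detector) before relying on the results:

def detect_inverse_head_and_shoulders(price: pd.Series, **kwargs):
    # Sketch: reflect the series so troughs become peaks, then reuse the detector.
    # Percentage tolerances are evaluated on the reflected values; retune them
    # or flip the comparisons for a production-grade inverse detector.
    ref = 2.0 * float(price.max())
    flipped = pd.Series(ref - price.to_numpy(), index=price.index, name=price.name)
    return detect_head_and_shoulders(flipped, **kwargs)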
Tiny FAQ
- Q: Should I use Close or High/Low? A: Prefer High for shoulders/head and Low for neckline. Close-only is acceptable but less precise.
- Q: How to avoid too many signals? A: Increase pivot window and tighten tolerances; require neckline break and trend filters.
- Q: Can the neckline slope? A: Yes. Replace the flat neckline check with a slope threshold and use a line-crossing check for confirmation.
- Q: How do I handle multiple overlapping patterns? A: Keep them; or dedupe by selecting the best-scoring pattern within a sliding time window.
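A sketch of the dedupe idea from the last answer, scoring by shoulder similarity (smaller difference wins) and treating heads within min_gap bars as duplicates; both choices are assumptions, not part of the detector above:

def dedupe_patterns(patterns, min_gap: int = 10):
    # Keep the best-scoring match among patterns whose heads are within min_gap bars.
    # Score: smaller shoulder height difference is considered better (an assumption).
    kept = []
    for p in sorted(patterns, key=lambda q: q['metrics']['shoulder_diff_pct']):
        if all(abs(p['head_idx'] - q['head_idx']) >= min_gap for q in kept):
            kept.append(p)
    return sorted(kept, key=lambda q: q['head_idx'])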