
Infrastructure

Measuring Latency in the Browser: A Meta Demo

The RTT badge on this site measures real latency. Here's how, using the Performance API, Cloudflare edge, and the same principles that apply to trading.

5 min read
#latency #monitoring #performance #cloudflare #infrastructure #observability

The floating badge in the corner shows your actual round-trip time to my server.

```text
RTT 24ms · σ ±3 (iad)
```

That's not decoration. It's a demonstration of the same latency awareness I apply to trading systems.


## What the Badge Shows {#what-it-shows}

| Metric | Meaning |
|--------|---------|
| **RTT** | Round-trip time in milliseconds |
| **σ** | Standard deviation (jitter) of last 10 samples |
| **Edge** | Cloudflare edge region serving you |

### Why These Metrics?

**RTT:** The total time for a request to travel from browser → server → browser. Includes:
- DNS resolution (cached after first)
- TCP handshake (kept alive)
- TLS negotiation (session resumed)
- HTTP request/response
- Edge compute time (<1ms)
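
The browser exposes these phases individually through the Resource Timing API. A minimal sketch of splitting them out, given a `PerformanceResourceTiming` entry (illustrative only; the badge itself measures total RTT, not the breakdown):

```javascript
// Split one request's latency into phases from a Resource Timing entry,
// e.g. performance.getEntriesByName(url)[0] after a fetch completes.
function timingPhases(entry) {
  return {
    dns: entry.domainLookupEnd - entry.domainLookupStart,
    tcp: entry.connectEnd - entry.connectStart,       // includes TLS time
    tls: entry.secureConnectionStart > 0
      ? entry.connectEnd - entry.secureConnectionStart
      : 0,                                            // 0 over plain HTTP
    request: entry.responseEnd - entry.requestStart,  // HTTP request/response
  };
}
```

On a warm connection, `dns`, `tcp`, and `tls` all collapse to zero and the RTT is almost entirely the `request` phase plus edge compute.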

**Standard Deviation (σ):** More important than raw RTT. Low jitter means consistent performance. High jitter means unpredictable spikes, the enemy of trading systems.

**Edge Region:** Which datacenter served you. Cloudflare routes to the nearest edge automatically.


## The Ping Endpoint {#ping-endpoint}

```typescript
// Cloudflare Worker: /api/ping
export default {
  async fetch(request: Request): Promise<Response> {
    // Cloudflare adds the CF-Ray header with datacenter code
    const cfRay = request.headers.get('cf-ray') || '';
    const edge = cfRay.split('-')[1] || 'unknown'; // e.g., "iad", "lhr", "sin"

    return new Response(JSON.stringify({
      edge,
      ts: Date.now()
    }), {
      status: 200,
      headers: {
        'Content-Type': 'application/json',
        'Cache-Control': 'no-store, no-cache, must-revalidate',
      }
    });
  }
};
```

**Key design decisions:**

| Decision | Why |
|----------|-----|
| Cloudflare Worker | Runs at edge, no cold starts |
| `no-store` | Every request hits the edge, no CDN cache |
| Minimal payload | Only edge + timestamp |
| JSON response | Easy parsing |
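
The edge extraction is a one-liner worth isolating. CF-Ray values look like `8a1b2c3d4e5f6789-IAD`, with the datacenter code after the dash (the Worker above returns it as-is; the badge displays it lowercase):

```javascript
// Extract the edge datacenter code from a CF-Ray header value,
// e.g. "8a1b2c3d4e5f6789-IAD" -> "iad". Falls back to "unknown"
// when the header is missing or has no dash-separated suffix.
function edgeFromRay(cfRay) {
  const code = (cfRay || '').split('-')[1];
  return code ? code.toLowerCase() : 'unknown';
}
```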


## Client-Side Measurement {#client-side}

```javascript
const samples = [];
const MAX_SAMPLES = 10;

async function measureLatency() {
  // Cache-busting query parameter
  const url = `/api/ping?t=${Date.now()}`;

  const t0 = performance.now();

  let response;
  try {
    response = await fetch(url, {
      method: 'GET',
      cache: 'no-store',
      mode: 'cors'
    });
  } catch {
    return; // network error: skip this sample rather than pollute the buffer
  }

  const t1 = performance.now();
  if (!response.ok) return; // skip failed samples
  const rtt = t1 - t0;

  const data = await response.json();

  // Rolling buffer
  samples.push(rtt);
  if (samples.length > MAX_SAMPLES) {
    samples.shift();
  }

  // Calculate statistics
  const avg = samples.reduce((a, b) => a + b, 0) / samples.length;
  const sigma = calculateStdDev(samples);

  updateDisplay(Math.round(avg), Math.round(sigma), data.edge);
}

function calculateStdDev(arr) {
  const n = arr.length;
  if (n < 2) return 0;

  const mean = arr.reduce((a, b) => a + b, 0) / n;
  const variance = arr.reduce((sum, val) => sum + Math.pow(val - mean, 2), 0) / n;
  return Math.sqrt(variance);
}

// Measure every 10 seconds
setInterval(measureLatency, 10000);
measureLatency();  // Initial measurement
```

### Why `performance.now()`?

| API | Resolution | Use Case |
|-----|------------|----------|
| `Date.now()` | 1ms | General timing |
| `performance.now()` | Sub-millisecond | High precision |
| `performance.timeOrigin` | Absolute time | Cross-origin correlation |

`performance.now()` provides sub-millisecond resolution and is monotonic (won't go backwards). **Citation:** [Performance API spec](https://www.w3.org/TR/hr-time/).
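
The difference shows up in a few lines (a sketch; exact resolution varies by browser and is often deliberately coarsened to mitigate timing attacks):

```javascript
// Date.now(): wall-clock ms since the Unix epoch. It can jump backwards
// if the system clock is adjusted (NTP sync, manual change), so deltas
// between two calls are not safe for measuring durations.
const wall = Date.now();

// performance.now(): ms since performance.timeOrigin, monotonic,
// so two successive readings never decrease.
const a = performance.now();
const b = performance.now();
console.log(b >= a); // always true; Date.now() gives no such guarantee
```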


## Statistics: Why Standard Deviation Matters {#statistics}

### Average Latency is Vanity

Consider two networks:
- **Network A:** 50ms, 50ms, 50ms, 50ms, 50ms → Avg: 50ms, σ: 0ms
- **Network B:** 10ms, 10ms, 10ms, 10ms, 170ms → Avg: 42ms, σ: 64ms

Network B has lower average but one horrible request. For trading:
- Network A: Predictable, can size positions accordingly
- Network B: One spike causes slippage

**Jitter (σ) is the reliability metric.**
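
The Network B numbers are worth recomputing with the same reduce-based math the badge uses (population σ over the five samples):

```javascript
// Recompute the Network B example: the average hides the spike,
// the standard deviation exposes it.
const networkB = [10, 10, 10, 10, 170];

const mean = networkB.reduce((a, b) => a + b, 0) / networkB.length;
const variance = networkB
  .reduce((sum, v) => sum + (v - mean) ** 2, 0) / networkB.length;
const sigma = Math.sqrt(variance);

console.log(mean);  // 42
console.log(sigma); // 64
```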

### Interpretation

| σ | Quality | Implication |
|---|---------|-------------|
| <5ms | Excellent | Very consistent |
| 5-20ms | Good | Normal variance |
| 20-50ms | Degraded | Network congestion |
| >50ms | Bad | Investigate |
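
These bands are easy to encode directly (the thresholds are this article's rules of thumb, not a standard):

```javascript
// Map a jitter value (ms) to the quality bands in the table above.
function jitterQuality(sigma) {
  if (sigma < 5) return 'excellent';
  if (sigma < 20) return 'good';
  if (sigma < 50) return 'degraded';
  return 'bad';
}
```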


## Design Philosophy {#design-philosophy}

### Measure What You Care About

Traditional metrics for websites:
- Page load time
- Time to first byte
- Largest contentful paint

**These aggregates hide latency distribution.** A site can have great average load time but terrible P99.

### The Trading Parallel

| Web Metric | Trading Metric |
|------------|----------------|
| RTT | Fill latency |
| Jitter (σ) | P99 variance |
| Edge region | Exchange connectivity |
| Cache miss | Order rejection |

The same principle applies: measure continuously, track percentiles, minimize variance.
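
Percentile tracking is a small extension of the rolling sample buffer. A nearest-rank sketch (the badge itself does not track percentiles; this is illustrative):

```javascript
// Nearest-rank percentile over a sample buffer.
// percentile(samples, 99) returns the P99 latency.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.min(sorted.length, Math.max(rank, 1)) - 1];
}

console.log(percentile([10, 10, 10, 10, 170], 99)); // 170
console.log(percentile([10, 10, 10, 10, 170], 50)); // 10
```

For the Network B samples above, the average (42ms) sits between the P50 (10ms) and the P99 (170ms), which is exactly why aggregates mislead.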

### Show, Don't Tell

Anyone can claim to care about performance. **The badge proves it:**
- Real measurement, not synthetic
- Updated every 10 seconds
- Visible on every page


## Build Your Own {#build-your-own}

### Minimal Implementation

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    #rtt-badge {
      position: fixed;
      bottom: 20px;
      right: 20px;
      background: #1a1a1a;
      color: #00ff88;
      font-family: monospace;
      font-size: 12px;
      padding: 8px 12px;
      border-radius: 4px;
      z-index: 1000;
    }
  </style>
</head>
<body>
  <div id="rtt-badge">RTT: --ms</div>

  <script>
    const samples = [];

    async function measure() {
      const t0 = performance.now();
      await fetch('/api/ping?t=' + Date.now(), { cache: 'no-store' });
      const rtt = Math.round(performance.now() - t0);

      samples.push(rtt);
      if (samples.length > 10) samples.shift();

      const avg = Math.round(samples.reduce((a, b) => a + b) / samples.length);
      const sigma = Math.round(Math.sqrt(
        samples.reduce((s, v) => s + Math.pow(v - avg, 2), 0) / samples.length
      ));

      document.getElementById('rtt-badge').textContent =
        `RTT: ${avg}ms · σ ±${sigma}`;
    }

    setInterval(measure, 10000);
    measure();
  </script>
</body>
</html>
```

### AWS Alternative

If you're running on AWS rather than Cloudflare:

```hcl
# CloudWatch Synthetics canary
resource "aws_synthetics_canary" "latency" {
  name                 = "latency-monitor"
  artifact_s3_location = "s3://${aws_s3_bucket.canary.bucket}/canary/"
  execution_role_arn   = aws_iam_role.canary.arn
  handler              = "latency.handler"
  runtime_version      = "syn-python-selenium-1.0"

  schedule {
    expression = "rate(1 minute)"
  }
}
```

## Audit Your Infrastructure {#audit-your-infrastructure}

Want to check if your servers are configured for low latency? Try the latency-audit tool — it validates kernel settings, CPU governors, and network configurations in seconds.

## Continue Reading {#continue-reading}

Continue exploring with these related deep dives:

| Topic | Next Post |
|-------|-----------|
| The 5 kernel settings that cost you latency | Linux Defaults That Cost You Latency |
| Measuring without overhead using eBPF | eBPF Profiling: Nanoseconds Without Adding Any |
| Design philosophy & architecture decisions | Trading Infrastructure: First Principles That Scale |
| SLOs, metrics that matter, alerting | Trading Metrics: What SRE Dashboards Miss |