Methodology

Documented pipeline and ecological metrics

The ingestion, normalization, enrichment, validation, and publication logic behind the map, the dataset, and the platform interfaces.

Pipeline

A repeatable workflow designed for transparency, traceability, and metric integrity.

Ingest (source intake) → Normalize (schema + QA) → Enrich (metrics) → Validate (checks) → Publish (map + API)

Step 1 — Ingest: Collect municipal inventories and open datasets, preserving provenance and update timestamps.
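The intake step can be sketched as follows. The record shape and the source slug are assumptions for illustration, not the pipeline's actual schema; the point is that every raw row keeps its source name and fetch timestamp.

```python
# Minimal sketch of source intake: wrap each raw row with its source
# identifier and a fetch timestamp so provenance survives downstream.
# RawRecord and the "toronto-street-trees" slug are illustrative.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class RawRecord:
    source: str       # dataset identifier, e.g. "toronto-street-trees"
    fetched_at: str   # ISO-8601 timestamp of the intake run
    payload: dict     # original row, preserved untouched

def ingest(source: str, rows: list) -> list:
    ts = datetime.now(timezone.utc).isoformat()
    return [RawRecord(source=source, fetched_at=ts, payload=row) for row in rows]

records = ingest("toronto-street-trees", [{"TREE_ID": 1, "SPECIES": "Acer rubrum"}])
```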

Step 2 — Normalize: Standardize schemas, geocode records, and resolve duplicates across sources.
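Duplicate resolution might look like the sketch below: round coordinates to roughly metre precision and key on (lat, lon, species). The key choice and rounding precision are assumptions, not the pipeline's actual rules.

```python
# Hedged sketch of duplicate resolution across sources: records that
# share a ~1 m-rounded coordinate and species name are treated as one.
def resolve_duplicates(rows):
    seen = {}
    for row in rows:
        key = (round(row["lat"], 5), round(row["lon"], 5), row["species"].lower())
        seen.setdefault(key, row)  # keep the first occurrence per key
    return list(seen.values())

rows = [
    {"lat": 45.50170, "lon": -73.56730, "species": "Acer rubrum"},
    {"lat": 45.501702, "lon": -73.567301, "species": "acer rubrum"},  # near-duplicate
]
print(len(resolve_duplicates(rows)))  # → 1
```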

Step 3 — Enrich: Attach canopy metrics, biodiversity indices, and neighborhood metadata.

Step 4 — Validate: Run QA checks for missing species, coordinate drift, and anomalous counts.
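The three QA checks named above can be sketched as simple per-row rules. The field names and the count threshold are assumptions for the sketch, not the pipeline's documented limits.

```python
# Illustrative QA pass covering the three failure modes: missing
# species, out-of-range coordinates ("drift"), and anomalous counts.
def qa_issues(rows):
    issues = []
    for i, row in enumerate(rows):
        if not row.get("species"):
            issues.append((i, "missing species"))
        lat, lon = row.get("lat"), row.get("lon")
        if lat is None or lon is None or not (-90 <= lat <= 90 and -180 <= lon <= 180):
            issues.append((i, "coordinate drift"))
        count = row.get("count", 1)
        if count <= 0 or count > 10_000:  # assumed sanity bound
            issues.append((i, "anomalous count"))
    return issues

bad = [
    {"species": "", "lat": 45.5, "lon": -73.6, "count": 1},
    {"species": "Ulmus americana", "lat": 145.0, "lon": -73.6, "count": 1},
]
issues = qa_issues(bad)
```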

Step 5 — Publish: Serve map layers, metrics, and documentation for research and planning use.

Data Sources

Core tree inventory sources currently represented in the documented trees-table pipeline.

Source                        Update cadence
Montreal legacy tree load     Legacy precomputed CSV
Toronto Street Tree Data      Municipal open data refresh
Vancouver Public Trees        Municipal open data refresh
Edmonton Tree Inventory       Municipal open data refresh
Calgary Tree Inventory        Municipal open data refresh
Winnipeg Tree Inventory       Municipal open data refresh

Metrics

Key indicators used to evaluate canopy health and biodiversity.

Metric visuals (values shown are illustrative placeholders):

Canopy density — percent cover by neighborhood (example: 42%)
Species richness — unique species count (example: 68%)
Diversity index — Shannon + Simpson (example: 55%)
Equity score — coverage vs population (example: 35%)
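The diversity index combines Shannon and Simpson measures. The sketch below uses the standard formulas for both, computed from raw species counts; the platform may use different variants or normalizations.

```python
# Standard Shannon entropy and Simpson (Gini-Simpson) diversity from
# species counts. These are textbook formulas, not necessarily the
# exact variants used in the pipeline.
import math

def shannon(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c)

def simpson(counts):
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

counts = [50, 30, 20]  # hypothetical per-species tallies for one neighborhood
```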


Documentation Backbone

The detailed methodology now comes from a generated backend manifest, not page-local copy. That keeps the frontend aligned with the ETL and load logic.

Documented fields (20): Frontend-visible tree fields with lineage, formulas, null behavior, and implementation references.

City pipelines (6): Source-specific normalization notes, raw column mappings, skip rules, and lineage boundaries.

Validation controls (6): Runtime checks, artifact QC, and manifest-to-code validation checks documented in one contract.

Why this matters

Analysts need to know which fields are directly observed, which are modeled, which thresholds can null a value, and where provenance is still defaulted.

That level of detail now lives in a generated tree documentation manifest that can be rendered on the frontend and checked against backend code.

Access points

Public page: /data#transformations

Validation and audit: /data#validation-ledger

JSON contract: /tree-documentation.json
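A consumer of the JSON contract might cross-check the documented counts like this. The endpoint path comes from the list above, but the top-level keys ("fields", "pipelines", "validations") are assumed manifest structure, not the published schema.

```python
# Hedged sketch of consuming the tree-documentation manifest and
# tallying its sections; key names are assumptions about its shape.
import json
import urllib.request

def load_manifest(base_url):
    with urllib.request.urlopen(f"{base_url}/tree-documentation.json") as resp:
        return json.load(resp)

def check_counts(manifest):
    # Compare these against the counts stated in the documentation page.
    return {
        "fields": len(manifest.get("fields", [])),
        "pipelines": len(manifest.get("pipelines", [])),
        "validations": len(manifest.get("validations", [])),
    }
```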