# Processor: WarmingLevel

**Registry key:** `warming_level` | **Priority:** 10 | **Category:** Temporal Processing

Subsets climate data by global warming level thresholds instead of calendar dates, transforming time-series data into a warming-level-centric layout for climate impact analysis aligned with IPCC warming scenarios.
## Algorithm

```mermaid
flowchart TD
    Init([__init__: validate dict, store params,<br/>eagerly load GWL_1850_1900 + GWL_1981_2010 CSVs]) --> Start([execute: dict of xr.Datasets / DataArrays])
    Start --> Reformat["reformat_member_ids:<br/>split member_id dim into key suffixes"]
    Reformat --> Extend["extend_time_domain:<br/>splice historical onto SSP scenarios"]
    Extend --> CenterYears["get_center_years:<br/>look up center year per (key, wl)"]
    CenterYears --> LoopKeys["For each simulation key"]
    LoopKeys --> LoopWL["For each (year, wl) pair"]
    LoopWL --> CheckNaN{Year is NaN?}
    CheckNaN -->|Yes| WarnSkip["Warn + skip this WL"]
    CheckNaN -->|No| ComputeWindow["start = center - window<br/>end = center + window - 1"]
    ComputeWindow --> CheckComplete{_determine_is_complete_wl?}
    CheckComplete -->|No| WarnSkip
    CheckComplete -->|Yes| SliceTime["da_slice = data.sel(time=slice(start, end))"]
    SliceTime --> DropFeb29["Drop Feb 29"]
    DropFeb29 --> CheckMonths{warming_level_months<br/>specified?}
    CheckMonths -->|Yes| FilterMonths["Filter to specified months"]
    CheckMonths -->|No| SwapDim["Swap time dim → time_delta<br/>(range -L/2 .. L/2)"]
    FilterMonths --> SwapDim
    SwapDim --> ExpandDims["expand_dims warming_level<br/>+ assign simulation coord"]
    ExpandDims --> Append["Append to slices"]
    WarnSkip --> NextWL{More WLs?}
    Append --> NextWL
    NextWL -->|Yes| LoopWL
    NextWL -->|No| Concat["xr.concat(slices, dim='warming_level', join='outer')"]
    Concat --> CheckDummy{add_dummy_time?}
    CheckDummy -->|Yes| AddDummy["add_dummy_time_to_wl"]
    CheckDummy -->|No| StoreCtx["context['_sim_centered_years'] = ..."]
    AddDummy --> StoreCtx
    StoreCtx --> UpdateCtx["update_context (writes new_attrs)"]
    UpdateCtx --> NextKey{More keys?}
    NextKey -->|Yes| LoopKeys
    NextKey -->|No| End([Output: dict of Datasets<br/>with warming_level + time_delta dims])
    click Init "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L73" "__init__ (eager CSV load at L91-94)"
    click Reformat "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L132" "reformat_member_ids call site"
    click Extend "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L135" "extend_time_domain call site"
    click CenterYears "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L150" "get_center_years call site"
    click LoopKeys "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L165" "Outer key loop"
    click LoopWL "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L173" "Inner (year, wl) loop"
    click CheckNaN "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L174" "NaN year guard"
    click ComputeWindow "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L186" "Window math"
    click CheckComplete "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L189" "_determine_is_complete_wl"
    click SliceTime "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L201" "Time slice"
    click DropFeb29 "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L204" "Drop Feb 29"
    click FilterMonths "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L208" "warming_level_months filter"
    click SwapDim "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L215" "Swap time → time_delta"
    click ExpandDims "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L222" "expand_dims + assign_coords"
    click Concat "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L251" "xr.concat"
    click AddDummy "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L256" "add_dummy_time_to_wl"
    click StoreCtx "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L267" "Store sim_centered_years in context"
    click UpdateCtx "https://github.com/cal-adapt/climakitae/blob/main/climakitae/new_core/processors/warming_level.py#L308" "update_context method"
```
## Execution Flow

- **Initialization (lines 73–111):** Validate dict; store `warming_levels`, `warming_level_window` (default 15), `warming_level_months` (default `UNSET`), `add_dummy_time` (default `False`). Eagerly loads two CSV lookup tables (`gwl_1850-1900ref.csv` at L91 and `gwl_1981-2010ref.csv` at L94). Sets `self.name = "warming_level_simple"` (note: this is the metadata key written to context).
- **Defensive Reload (lines 120–129):** If `warming_level_times` is somehow `None`, attempt to reload from CSV; raise `RuntimeError` on failure.
- **Member ID Reformatting (line 132 → method at line 343):** `reformat_member_ids` splits any `member_id` dimension into separate dict keys with a `key.member_id` suffix.
- **Time Domain Extension (line 135):** `extend_time_domain` (helper) splices historical onto SSP scenarios so the data range covers 1980/1850–2100.
- **Center-Year Lookup (line 150 → method at line 376):** `get_center_years` returns `{key: [year_for_wl1, year_for_wl2, ...]}`. Lookup uses a tuple `(activity_id, member_id, source_id)` parsed from each key.
- **Per-Key, Per-WL Slicing (lines 165–249, nested loops):**
  - Skip key if no center years (line 168).
  - Skip individual WLs where `year` is `None`/NaN (lines 174–177).
  - Compute window: `start_year = center - window`, `end_year = center + window - 1` (lines 184–185).
  - Skip incomplete WLs via `_determine_is_complete_wl` (lines 188–198).
  - `data.sel(time=slice(start, end))` (line 201).
  - Drop Feb 29 leap days (lines 204–205).
  - Optional month filter via `warming_level_months` (lines 208–211).
  - Swap the `time` dim for a `time_delta` index (range `-L/2 .. L/2`) (lines 215–219).
  - `expand_dims({"warming_level": [wl]})` (line 222) + assign `simulation` coord (line 225).
  - Append to a local `slices` list.
- **Concatenation (lines 251–253):** `xr.concat(slices, dim="warming_level", join="outer", fill_value=np.nan)`.
- **Optional Dummy Time (line 256):** If `add_dummy_time=True`, `add_dummy_time_to_wl(ret[key])` restores a synthetic `time` axis.
- **Context Bookkeeping (lines 267, 271):** Store the `_sim_centered_years` mapping in context for downstream `concatenate` reconstruction; `update_context` writes a description under `context["new_attrs"]["warming_level_simple"]`.
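The skip-or-slice decision inside the nested loops can be sketched in isolation. This is a simplified illustration, not the processor's actual code; `slice_warming_levels` is a hypothetical helper name, and the real implementation operates on xarray objects rather than returning year bounds.

```python
import math

def slice_warming_levels(center_years, warming_levels, window=15):
    """Yield (wl, start_year, end_year) for each usable warming level,
    skipping levels the simulation never reaches (None/NaN center year)."""
    for center, wl in zip(center_years, warming_levels):
        if center is None or (isinstance(center, float) and math.isnan(center)):
            continue  # mirrors the NaN guard: warn + skip this WL
        start = int(center) - window
        end = int(center) + window - 1  # asymmetric window
        yield wl, start, end

# A simulation reaching 1.5°C in 2034 but never reaching 3.0°C:
print(list(slice_warming_levels([2034.0, float("nan")], [1.5, 3.0])))
```

Only the 1.5°C entry survives; the unreached level is dropped silently here, where the real loop logs a warning.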
## Parameters

| Parameter | Type | Required | Default | Description | Constraints |
|---|---|---|---|---|---|
| `warming_levels` | `list[float]` | ✓ | — | Global warming levels (°C above pre-industrial) | `[0.8, 1.5, 2.0, 2.5, 3.0]` common; 1.5–4.0 typical |
| `warming_level_window` | `int` | | 15 | Years before/after the center year to include | ≥ 1; 15-year window typical (30-year total) |
| `warming_level_months` | `list[int]` | | `UNSET` | Months to keep (1–12) | E.g., `[6, 7, 8]` for JJA; `UNSET` = all months |
| `add_dummy_time` | `bool` | | `False` | Replace the offset-from-center dimension with a dummy time axis | Useful for tools requiring a `time` dimension |
## Code References

| Method | Lines | Purpose |
|---|---|---|
| `__init__` | 73–111 | Validate config, store params, eagerly load GWL CSV lookup tables (L91, L94) |
| `execute` | 113–273 | Reformat member ids, extend time, look up center years, slice + reshape per GWL, concatenate |
| `update_context` | 308–329 | Write `new_attrs["warming_level_simple"]` description |
| `set_data_accessor` | 331–341 | Store catalog reference (currently unused) |
| `reformat_member_ids` | 343–374 | Split data with a `member_id` dim into separate dict entries |
| `get_center_years` | 376–521 | Per-(key, wl) center-year lookup against `warming_level_times` and `warming_level_times_idx` |
## Examples

### Single Warming Level

```python
from climakitae.new_core.user_interface import ClimateData

# Extract data at 1.5°C warming
data = (ClimateData()
    .catalog("cadcat")
    .activity_id("WRF")
    .experiment_id("ssp245")
    .variable("t2max")
    .table_id("day")
    .grid_label("d03")
    .processes({
        "warming_level": {
            "warming_levels": [1.5]
        }
    })
    .get())
```
### Multiple Warming Levels

```python
# Compare 1.5°C, 2.0°C, and 3.0°C warming levels
data = (ClimateData()
    .catalog("cadcat")
    .activity_id("LOCA2")
    .experiment_id("ssp370")
    .variable("tasmax")
    .table_id("day")
    .grid_label("d02")
    .processes({
        "warming_level": {
            "warming_levels": [1.5, 2.0, 3.0]
        }
    })
    .get())

# data.warming_level is now a coordinate with 3 values
# Access with: data.sel(warming_level=1.5)
```
### Custom Window

```python
# Use 20-year windows (instead of the default 15)
data = (ClimateData()
    .catalog("cadcat")
    .activity_id("WRF")
    .experiment_id("ssp585")
    .variable("pr")
    .table_id("mon")
    .grid_label("d03")
    .processes({
        "warming_level": {
            "warming_levels": [2.0, 2.5],
            "warming_level_window": 20
        }
    })
    .get())
```
### Seasonal Filter

```python
# Summer (JJA) only at 2°C warming
data = (ClimateData()
    .catalog("cadcat")
    .activity_id("WRF")
    .experiment_id("ssp245")
    .variable("t2max")
    .table_id("day")
    .grid_label("d03")
    .processes({
        "warming_level": {
            "warming_levels": [2.0],
            "warming_level_months": [6, 7, 8]  # June, July, August
        }
    })
    .get())
```
### Chained with Clipping

```python
# Full workflow: clip + warming level + export
data = (ClimateData()
    .catalog("cadcat")
    .activity_id("WRF")
    .experiment_id("ssp245")
    .variable("t2max")
    .table_id("day")
    .grid_label("d03")
    .processes({
        "clip": "San Francisco Bay",
        "warming_level": {
            "warming_levels": [1.5, 2.0, 3.0],
            "warming_level_window": 15
        },
        "export": {
            "filename": "sf_warming_levels",
            "file_format": "NetCDF"
        }
    })
    .get())
```
## Implementation Details

### Global Warming Level Lookup

GWL timing is pre-computed from climate model simulations and stored in CSV files shipped with climakitae:

- `gwl_1850-1900ref.csv`: Year/timestamp when each `(activity_id, member_id, source_id)` triple reaches each integer warming level (1850–1900 reference period). Used when the requested WL exists as a column in the table.
- `gwl_1981-2010ref.csv` (loaded as `warming_level_times_idx`): Time-indexed table of running warming-level estimates per simulation column. Used as a fallback when the requested WL is not a column in `gwl_1850-1900ref.csv`; the processor finds the first time the simulation column crosses the requested level.

Lookup keys parse the dict key with `key.split(".")` and use `(key_list[2], member_id, key_list[3])`, which corresponds to `(activity_id, member_id, source_id)` in catalog terms. Missing entries log a warning and append `np.nan` for that warming level (the slice is then skipped).
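Both lookup paths can be sketched in plain Python. The key layout below is purely illustrative (only indices 2 and 3 matter for the tuple), and `first_crossing_year` is a hypothetical stand-in for the fallback scan over the time-indexed table:

```python
import math

def parse_lookup_tuple(key: str, member_id: str):
    """Build the (activity_id, member_id, source_id) lookup tuple from a
    dot-separated dataset key, using the same indices as the processor."""
    key_list = key.split(".")
    return (key_list[2], member_id, key_list[3])

def first_crossing_year(years, running_wl, target):
    """Fallback sketch: first year whose running warming-level estimate
    reaches `target`; None if the simulation never crosses it."""
    for year, wl in zip(years, running_wl):
        if not math.isnan(wl) and wl >= target:
            return year
    return None

key = "cadcat.day.WRF.CESM2.ssp245"  # hypothetical key layout
print(parse_lookup_tuple(key, "r11i1p1f1"))
print(first_crossing_year([2030, 2031, 2032], [1.4, 1.5, 1.6], 1.5))
```

A `None` result here corresponds to the `np.nan` entry (and eventual skip) in the real lookup.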
### Time Windows

The processor builds an asymmetric window, `[center - window, center + window - 1]`:

```python
start_year = center_year - self.warming_level_window
end_year = center_year + self.warming_level_window - 1
da_slice = data.sel(time=slice(f"{start_year}", f"{end_year}"))
```

With the default `warming_level_window=15`, this is a 30-year span. Feb 29 is then dropped to keep slice lengths consistent across leap and non-leap years.
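A quick check of the window arithmetic (a sketch with illustrative function names): the asymmetric bounds always span exactly `2 * window` calendar years, and with Feb 29 dropped every window has the same length in days regardless of which leap years it happens to contain.

```python
def window_bounds(center_year, window=15):
    # `window` years before the center year, `window - 1` after it.
    return center_year - window, center_year + window - 1

def noleap_day_count(start_year, end_year):
    # With Feb 29 dropped, every year contributes exactly 365 days.
    return 365 * (end_year - start_year + 1)

start, end = window_bounds(2034)
print(start, end)                    # a 30-calendar-year span
print(noleap_day_count(start, end))  # identical for any center year
```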
### time_delta Reindexing

After slicing, the `time` dimension is replaced with a centered integer offset:

```python
length = da_slice.sizes["time"]
time_delta = range(-length // 2, length // 2)
da_slice = da_slice.swap_dims({"time": "time_delta"}).drop_vars("time")
da_slice = da_slice.assign_coords(time_delta=time_delta)
```

This lets multiple warming levels (with different absolute years but matching window length) share a common dimension before `xr.concat`.
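The offset construction itself is plain Python; a small sketch shows how `range(-length // 2, length // 2)` centers the index so slices from different decades line up on the same coordinate:

```python
# Ten steps map to offsets -5 .. 4: index 0 sits just past the window
# midpoint, consistent with the asymmetric [center-window, center+window-1]
# slice described above.
length = 10
time_delta = list(range(-length // 2, length // 2))
print(time_delta)
```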
### Edge Cases

- **Model doesn't reach the GWL:** `get_center_years` appends `np.nan`; the inner loop skips that WL with a warning.
- **Incomplete window:** `_determine_is_complete_wl` returns `False` when the simulation lacks data on either end; the WL is skipped with a warning.
- **No valid WLs for a key:** The key is removed from the result dict (lines 239–247).
- **Monthly filtering:** Applied after time slicing, so per-WL counts may vary if some months are dropped.
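The completeness guard can be approximated as follows. This is a hypothetical simplification of `_determine_is_complete_wl`, which the source only describes as checking for data on both ends of the window:

```python
def window_is_complete(data_start, data_end, start_year, end_year):
    """A window is usable only when the simulation's time axis fully
    covers [start_year, end_year] on both ends."""
    return data_start <= start_year and end_year <= data_end

# Extended data spans 1980-2100; a window centered late in the century
# can run past 2100 and is therefore skipped.
print(window_is_complete(1980, 2100, 2019, 2048))
print(window_is_complete(1980, 2100, 2090, 2119))
```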
### Dummy Time (Optional)

Some downstream tools require a real time dimension. Setting `add_dummy_time=True` calls `add_dummy_time_to_wl(ds)`, which adds a synthetic, monotonically increasing `time` coordinate back onto the result. Useful for visualization, but not climatologically meaningful.
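For intuition, a dummy axis can be fabricated from the integer offsets like this. This is illustrative only; the real helper is `add_dummy_time_to_wl`, and its anchor date and calendar handling may differ:

```python
from datetime import date, timedelta

def dummy_time_axis(time_delta, anchor=date(2000, 1, 1)):
    """Map integer day offsets onto an arbitrary anchor date to build a
    monotonically increasing synthetic time coordinate (sketch)."""
    first = min(time_delta)
    return [anchor + timedelta(days=d - first) for d in time_delta]

axis = dummy_time_axis(range(-2, 2))
print(axis[0], axis[-1])
```

The resulting dates are arbitrary; only their ordering and spacing carry meaning.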
### Context Side Effects

- `context["_sim_centered_years"]`: dict mapping each key to its list of valid center years (consumed downstream by `concatenate`).
- `context["new_attrs"]["warming_level_simple"]`: description string (note the `_simple` suffix, which comes from `self.name`).
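The two context writes have roughly this shape (keys are from the source; the dataset key, years, and description string are hypothetical):

```python
context = {}

# Written inside execute(): one valid center year per retained WL.
context["_sim_centered_years"] = {
    "WRF.CESM2.r11i1p1f1": [2034, 2047],  # hypothetical key and years
}

# Written by update_context(): note the attrs key is self.name
# ("warming_level_simple"), not the registry key "warming_level".
context.setdefault("new_attrs", {})["warming_level_simple"] = (
    "Data re-centered on the years each simulation reaches its warming levels"
)

print(sorted(context))
```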
## Common Patterns

### Compare Scenarios at Same Warming Level

```python
# Historical, SSP2-4.5, SSP5-8.5 at 2°C warming
scenarios = ["historical", "ssp245", "ssp585"]
results = {}
for scenario in scenarios:
    results[scenario] = (ClimateData()
        .catalog("cadcat")
        .activity_id("WRF")
        .experiment_id(scenario)
        .variable("t2max")
        .table_id("day")
        .grid_label("d03")
        .processes({
            "warming_level": {"warming_levels": [2.0]}
        })
        .get())
```
### Model Uncertainty Across Warming Levels

```python
# Get all 5 WRF models at multiple warming levels
data = (ClimateData()
    .catalog("cadcat")
    .activity_id("WRF")
    .experiment_id("ssp370")
    .variable("pr")
    .table_id("mon")
    .grid_label("d03")
    .processes({
        "warming_level": {
            "warming_levels": [1.5, 2.0, 2.5, 3.0]
        }
    })
    .get())

# data.dims: (warming_level, sim, lat, lon)
# Compute the multi-model mean at each warming level
multi_model_mean = data.mean(dim="sim")
```
## See Also

- Processor Index
- Time Slice Processor: the alternative, calendar-based temporal subsetting
- Architecture → Warming Levels Concept
- How-To Guides → Warming Level Analysis
- Cal-Adapt GWL Resources: IPCC AR6 Warming Levels