Agent Conversation: google/gemini-3.1-pro-preview
Task: ml-subgroup-calibration-shift
Subgroup Calibration Under Distribution Shift
Research Question
Design a post-hoc calibration method that remains reliable when subgroup composition shifts between calibration and test time.
Background
Many calibration methods look good on average but fail on protected or operational subgroups once the test distribution shifts. This task isolates that failure mode. The fixed pipeline trains a tabular classifier, then applies a user-defined calibration mapping on held-out calibration data before evaluation on shifted test data.
Classical baselines include:
- Temperature scaling: one global temperature for all samples
- Isotonic regression: non-parametric monotone calibration
- Beta calibration: a richer parametric mapping on probabilities
- Group-wise temperature scaling: separate temperatures per subgroup
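For reference, the simplest of these baselines, global temperature scaling, can be sketched in a few lines. This is a minimal NumPy/SciPy version; the function names are illustrative and not part of the fixed pipeline:

```python
import numpy as np
from scipy import optimize, special


def fit_temperature(probs, labels, eps=1e-6):
    """Fit one global temperature T by minimizing NLL on calibration data."""
    logits = special.logit(np.clip(probs, eps, 1 - eps))

    def nll(t):
        p = np.clip(special.expit(logits / t), eps, 1 - eps)
        return -np.mean(labels * np.log(p) + (1 - labels) * np.log(1 - p))

    res = optimize.minimize_scalar(nll, bounds=(0.05, 20.0), method="bounded")
    return float(res.x)


def apply_temperature(probs, t, eps=1e-6):
    """Rescale logits by 1/T and map back to probabilities."""
    logits = special.logit(np.clip(probs, eps, 1 - eps))
    return special.expit(logits / t)
```

Temperatures above 1 soften overconfident predictions toward 0.5; the group-wise variant simply repeats this one-parameter fit per subgroup.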
Task
Modify the CalibrationMethod class in custom_subgroup_calibration.py. The fixed code loads data, creates a shifted split, trains the base classifier, and computes metrics. Your method only controls the post-hoc calibration mapping.
class CalibrationMethod:
    def fit(self, probs, labels, groups=None):
        ...

    def predict_proba(self, probs, groups=None):
        ...
Inputs are positive-class probabilities from the base classifier. groups contains subgroup IDs when available and may be ignored by group-agnostic methods.
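A minimal group-agnostic implementation that satisfies this interface, shown only to illustrate the contract (the identity mapping is a placeholder, not a recommended method):

```python
import numpy as np


class CalibrationMethod:
    """No-op calibrator: returns the base classifier's probabilities unchanged."""

    def fit(self, probs, labels, groups=None):
        # A real method would estimate a mapping from (probs, labels) here;
        # `groups` may be used by group-aware variants and ignored otherwise.
        return self

    def predict_proba(self, probs, groups=None):
        # Identity mapping on the positive-class probabilities.
        return np.asarray(probs, dtype=float).reshape(-1)
```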
Evaluation
This benchmark uses three lightweight tabular proxies that are already available in the current scikit-learn package setup. We would normally prefer Adult, ACSIncome, COMPAS, and Law School Admissions, but those require package-level data changes that are outside this task directory. To keep the benchmark runnable offline, we use cached scikit-learn datasets with similar calibration and subgroup-shift behavior:
- breast_cancer: binary classification on the scikit-learn breast cancer dataset
- california_housing: binary high-value/low-value decision built from California housing
- diabetes: binary high-risk/low-risk decision built from the diabetes target
For each dataset, the split is intentionally shifted:
- a domain score determines the held-out test tail
- subgroup labels are quartiles of a separate proxy feature
- calibration is fit on the source region and evaluated on the shifted region
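The split construction above can be sketched as follows. The column indices are illustrative assumptions; the actual domain score and proxy feature are fixed by the pipeline, not chosen here:

```python
import numpy as np


def shifted_split(X, domain_col=0, group_col=1, test_frac=0.3):
    """Sketch of the shifted split: test tail by domain score, quartile subgroups."""
    score = X[:, domain_col]
    # Held-out test tail: the top `test_frac` of the domain score.
    cutoff = np.quantile(score, 1.0 - test_frac)
    test_mask = score >= cutoff
    # Subgroups: quartiles of a separate proxy feature, computed on all rows.
    edges = np.quantile(X[:, group_col], [0.25, 0.5, 0.75])
    groups = np.searchsorted(edges, X[:, group_col], side="right")
    return np.where(~test_mask)[0], np.where(test_mask)[0], groups
```

Because the calibration region and the test region are separated by a threshold on the domain score, any calibrator fit on the source region is evaluated strictly out of its training range.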
Metrics
Lower is better for:
- worst_group_ece
- brier
- max_subgroup_gap
Higher is better for:
- subgroup_auroc
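The headline metric can be sketched as follows; the harness's exact binning scheme is not specified here, so equal-width bins are an assumption:

```python
import numpy as np


def ece(probs, labels, n_bins=10):
    """Expected calibration error with equal-width bins on [0, 1]."""
    bins = np.clip((probs * n_bins).astype(int), 0, n_bins - 1)
    err = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            # Bin weight times |mean confidence - empirical accuracy|.
            err += mask.mean() * abs(probs[mask].mean() - labels[mask].mean())
    return err


def worst_group_ece(probs, labels, groups, n_bins=10):
    """Largest per-subgroup ECE: the quantity a shift-robust method must control."""
    return max(ece(probs[groups == g], labels[groups == g], n_bins)
               for g in np.unique(groups))
```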
Notes
- The task is deliberately low compute and should run with a small tabular classifier.
- If you need the exact Adult/ACSIncome/COMPAS/Law School datasets, they should be added through a package-level data change, not inside this task directory.
scikit-learn/custom_subgroup_calibration.py [EDITABLE — lines 72–91 only]
1: """Subgroup calibration under distribution shift.
2:
3: The benchmark is intentionally offline and low compute. It uses
Tool Input
{
  "after_line": 72,
  "start_line": 73,
  "filename": "custom_subgroup_calibration.py",
  "end_line": 91,
  "content": " \"\"\"Hierarchical Platt scaling.\"\"\"\n\n def __init__(self):\n self.eps = 1e-6\n self.global_coef_ = (1.0, 0.0)\n self.group_coefs_ = {}\n\n def fit(self, probs, labels, groups=None):\n probs = np.asarray(probs).reshape(-1)\n labels = np.asarray(labels).reshape(-1).astype(int)\n logits = special.logit(np.clip(probs, self.eps, 1.0 - self.eps))\n\n def loss_fn(a, b, m_logits, m_labels):\n p = special.expit(a * m_logits + b)\n p = np.clip(p, self.eps, 1.0 - self.eps)\n return float(-np.mean(m_labels * np.log(p) + (1 - m_labels) * np.log(1 - p)))\n\n def obj_global(x):\n return loss_fn(float(x[0]), float(x[1]), logits, labels)\n\n res_global = optimize.minimize(obj_global, x0=np.array([1.0, 0.0]), method=\"L-BFGS-B\", bounds=[(0.01, 10.0), (-10.0, 10.0)])\n self.global_coef_ = tuple(res_global.x) if res_global.success else (1.0, 0.0)\n\n if groups is not None:\n groups = np.asarray(groups).reshape(-1)\n for g in np.unique(groups):\n mask = groups == g\n if mask.sum() < 5 or np.unique(labels[mask]).size < 2:\n self.group_coefs_[int(g)] = self.global_coef_\n continue\n\n def obj_group(x):\n l = loss_fn(float(x[0]), float(x[1]), logits[mask], labels[mask])\n reg = 0.1 * ((float(x[0]) - self.global_coef_[0])**2 + (float(x[1]) - self.global_coef_[1])**2)\n return l + reg\n\n res_g = optimize.minimize(obj_group, x0=np.array(self.global_coef_), method=\"L-BFGS-B\", bounds=[(0.01, 10.0), (-10.0, 10.0)])\n self.group_coefs_[int(g)] = tuple(res_g.x) if res_g.success else self.global_coef_\n\n return self\n\n def predict_proba(self, probs, groups=None):\n probs = np.asarray(probs).reshape(-1)\n logits = special.logit(np.clip(probs, self.eps, 1.0 - self.eps))\n\n if groups is None:\n a, b = self.global_coef_\n out = special.expit(a * logits + b)\n else:\n groups = np.asarray(groups).reshape(-1)\n out = np.empty_like(probs)\n for g in np.unique(groups):\n mask = groups == g\n a, b = self.group_coefs_.get(int(g), self.global_coef_)\n out[mask] = special.expit(a * logits[mask] + b)\n\n return np.clip(out, self.eps, 1.0 - self.eps)",
  "op": "replace"
}
ERROR: Package 'custom_subgroup_calibration.py' is not in allowed packages
Tool Input
{
  "after_line": 72,
  "end_line": 91,
  "content": " \"\"\"Hierarchical Platt scaling.\"\"\"\n\n def __init__(self):\n self.eps = 1e-6\n self.global_coef_ = (1.0, 0.0)\n self.group_coefs_ = {}\n\n def fit(self, probs, labels, groups=None):\n probs = np.asarray(probs).reshape(-1)\n labels = np.asarray(labels).reshape(-1).astype(int)\n logits = special.logit(np.clip(probs, self.eps, 1.0 - self.eps))\n\n def loss_fn(a, b, m_logits, m_labels):\n p = special.expit(a * m_logits + b)\n p = np.clip(p, self.eps, 1.0 - self.eps)\n return float(-np.mean(m_labels * np.log(p) + (1 - m_labels) * np.log(1 - p)))\n\n def obj_global(x):\n return loss_fn(float(x[0]), float(x[1]), logits, labels)\n\n res_global = optimize.minimize(obj_global, x0=np.array([1.0, 0.0]), method=\"L-BFGS-B\", bounds=[(0.01, 10.0), (-10.0, 10.0)])\n self.global_coef_ = tuple(res_global.x) if res_global.success else (1.0, 0.0)\n self.group_coefs_ = {}\n\n if groups is not None:\n groups = np.asarray(groups).reshape(-1)\n for g in np.unique(groups):\n mask = groups == g\n if mask.sum() < 5 or np.unique(labels[mask]).size < 2:\n self.group_coefs_[int(g)] = self.global_coef_\n continue\n\n def obj_group(x):\n l = loss_fn(float(x[0]), float(x[1]), logits[mask], labels[mask])\n reg = 0.5 * ((float(x[0]) - self.global_coef_[0])**2 + (float(x[1]) - self.global_coef_[1])**2)\n return l + reg\n\n res_g = optimize.minimize(obj_group, x0=np.array(self.global_coef_), method=\"L-BFGS-B\", bounds=[(0.01, 10.0), (-10.0, 10.0)])\n self.group_coefs_[int(g)] = tuple(res_g.x) if res_g.success else self.global_coef_\n\n return self\n\n def predict_proba(self, probs, groups=None):\n probs = np.asarray(probs).reshape(-1)\n logits = special.logit(np.clip(probs, self.eps, 1.0 - self.eps))\n\n if groups is None:\n a, b = self.global_coef_\n out = special.expit(a * logits + b)\n else:\n groups = np.asarray(groups).reshape(-1)\n out = np.empty_like(probs)\n for g in np.unique(groups):\n mask = groups == g\n a, b = self.group_coefs_.get(int(g), self.global_coef_)\n out[mask] = special.expit(a * logits[mask] + b)\n\n return np.clip(out, self.eps, 1.0 - self.eps)",
  "filename": "scikit-learn/custom_subgroup_calibration.py",
  "start_line": 73,
  "op": "replace"
}
OK: Replaced lines 73..91 with 58 line(s) in scikit-learn/custom_subgroup_calibration.py. Editable range: 72–130.
[Current file: scikit-learn/custom_subgroup_calibration.py | editable: 72–130 | total: 271 lines]
72 class CalibrationMethod:
73 """Hierarchical Platt scaling."""
74
... (53 more lines) ...
128 out[mask] = special.expit(a * logits[mask] + b)
129
130 return np.clip(out, self.eps, 1.0 - self.eps)
[Test #1] (2 tests remaining; call submit(n=N) to choose which test result to submit as final)
### breast_cancer (scripts/breast_cancer.sh)
Training progress:
TRAIN_METRICS: dataset=breast_cancer cal_ece_before=0.014909 cal_ece_after=0.000007 cal_brier_before=0.001191 cal_brier_after=0.000000
Final metrics (breast_cancer): worst_group_ece_breast_cancer=0.320289, brier_breast_cancer=0.155423, subgroup_auroc_breast_cancer=0.988741, max_subgroup_gap_breast_cancer=0.313956
### diabetes (scripts/diabetes.sh)
Training progress:
TRAIN_METRICS: dataset=diabetes cal_ece_before=0.141277 cal_ece_after=0.108762 cal_brier_before=0.154074 cal_brier_after=0.151141
Final metrics (diabetes): worst_group_ece_diabetes=0.113402, brier_diabetes=0.160231, subgroup_auroc_diabetes=0.765085, max_subgroup_gap_diabetes=0.044396
[Leaderboard] Results saved: {'worst_group_ece_breast_cancer': 0.320289, 'brier_breast_cancer': 0.155423, 'subgroup_auroc_breast_cancer': 0.988741, 'max_subgroup_gap_breast_cancer': 0.
Tool Input
{
  "filename": "scikit-learn/custom_subgroup_calibration.py",
  "after_line": 82,
  "start_line": 83,
  "op": "replace",
  "content": " def loss_fn(a, b, m_logits, m_labels):\n p = special.expit(a * m_logits + b)\n p = np.clip(p, self.eps, 1.0 - self.eps)\n # Optimize Brier score instead of NLL for robustness to outliers / extrapolation\n return float(np.mean((m_labels - p)**2))\n\n def obj_global(x):\n return loss_fn(float(x[0]), float(x[1]), logits, labels)\n\n res_global = optimize.minimize(obj_global, x0=np.array([1.0, 0.0]), method=\"L-BFGS-B\", bounds=[(0.01, 5.0), (-5.0, 5.0)])\n self.global_coef_ = tuple(res_global.x) if res_global.success else (1.0, 0.0)\n self.group_coefs_ = {}\n\n if groups is not None:\n groups = np.asarray(groups).reshape(-1)\n for g in np.unique(groups):\n mask = groups == g\n if mask.sum() < 5 or np.unique(labels[mask]).size < 2:\n self.group_coefs_[int(g)] = self.global_coef_\n continue\n\n def obj_group(x):\n l = loss_fn(float(x[0]), float(x[1]), logits[mask], labels[mask])\n reg = 0.01 * ((float(x[0]) - self.global_coef_[0])**2 + (float(x[1]) - self.global_coef_[1])**2)\n return l + reg\n\n res_g = optimize.minimize(obj_group, x0=np.array(self.global_coef_), method=\"L-BFGS-B\", bounds=[(0.01, 5.0), (-5.0, 5.0)])\n self.group_coefs_[int(g)] = tuple(res_g.x) if res_g.success else self.global_coef_",
  "end_line": 108
}
OK: Replaced lines 83..108 with 28 line(s) in scikit-learn/custom_subgroup_calibration.py. Editable range: 72–132.
[Current file: scikit-learn/custom_subgroup_calibration.py | editable: 72–132 | total: 273 lines]
72 class CalibrationMethod:
73 """Hierarchical Platt scaling."""
74
... (55 more lines) ...
130 out[mask] = special.expit(a * logits[mask] + b)
131
132 return np.clip(out, self.eps, 1.0 - self.eps)
[Test #2] (1 test remaining; call submit(n=N) to choose which test result to submit as final)
### breast_cancer (scripts/breast_cancer.sh)
le_gemini-3.1-pro-preview_20260414_110948/scikit-learn/custom_subgroup_calibration.py", line 273, in <module>
main()
File "/srv/home/bohanlyu/MLS-Bench/vendor/workspace/ml-subgroup-calibration-shift/google_gemini-3.1-pro-preview_20260414_110948/scikit-learn/custom_subgroup_calibration.py", line 250, in main
method = CalibrationMethod().fit(cal_probs, y[calib_idx], groups=groups[calib_idx])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/srv/home/bohanlyu/MLS-Bench/vendor/workspace/ml-subgroup-calibration-shift/google_gemini-3.1-pro-preview_20260414_110948/scikit-learn/custom_subgroup_calibration.py", line 92, in fit
res_global = optimize.minimize(obj_global, x0=np.array([1.0, 0.0]), method="L-BFGS-B", bounds=[(0.01, 5.0), (-5.0, 5.0)])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Undo complete: Restored: scikit-learn/custom_subgroup_calibration.py
Tool Input
{
  "op": "replace",
  "end_line": 108,
  "after_line": 82,
  "content": " def loss_fn(a, b, m_logits, m_labels):\n p = special.expit(a * m_logits + b)\n p = np.clip(p, self.eps, 1.0 - self.eps)\n # Optimize Brier score instead of NLL for robustness to outliers / extrapolation\n return float(np.mean((m_labels - p)**2))\n\n def obj_global(x):\n return loss_fn(float(x[0]), float(x[1]), logits, labels)\n\n res_global = optimize.minimize(obj_global, x0=np.array([1.0, 0.0]), method=\"L-BFGS-B\", bounds=[(0.01, 5.0), (-5.0, 5.0)])\n self.global_coef_ = tuple(res_global.x) if res_global.success else (1.0, 0.0)\n self.group_coefs_ = {}\n\n if groups is not None:\n groups = np.asarray(groups).reshape(-1)\n for g in np.unique(groups):\n mask = groups == g\n if mask.sum() < 5 or np.unique(labels[mask]).size < 2:\n self.group_coefs_[int(g)] = self.global_coef_\n continue\n\n def obj_group(x):\n l = loss_fn(float(x[0]), float(x[1]), logits[mask], labels[mask])\n reg = 0.01 * ((float(x[0]) - self.global_coef_[0])**2 + (float(x[1]) - self.global_coef_[1])**2)\n return l + reg\n\n res_g = optimize.minimize(obj_group, x0=np.array(self.global_coef_), method=\"L-BFGS-B\", bounds=[(0.01, 5.0), (-5.0, 5.0)])\n self.group_coefs_[int(g)] = tuple(res_g.x) if res_g.success else self.global_coef_",
  "start_line": 83,
  "filename": "scikit-learn/custom_subgroup_calibration.py"
}
OK: Replaced lines 83..108 with 28 line(s) in scikit-learn/custom_subgroup_calibration.py. Editable range: 72–132.
[Current file: scikit-learn/custom_subgroup_calibration.py | editable: 72–132 | total: 273 lines]
72 class CalibrationMethod:
73 """Hierarchical Platt scaling."""
74
... (55 more lines) ...
130 out[mask] = special.expit(a * logits[mask] + b)
131
132 return np.clip(out, self.eps, 1.0 - self.eps)
[Test #3] (0 tests remaining; call submit(n=N) to choose which test result to submit as final)
[NOTE] This was your last test. You MUST now call submit(n=X) to choose which test result to submit as your final answer.
### breast_cancer (scripts/breast_cancer.sh)
le_gemini-3.1-pro-preview_20260414_110948/scikit-learn/custom_subgroup_calibration.py", line 273, in <module>
main()
File "/srv/home/bohanlyu/MLS-Bench/vendor/workspace/ml-subgroup-calibration-shift/google_gemini-3.1-pro-preview_20260414_110948/scikit-learn/custom_subgroup_calibration.py", line 250, in main
method = CalibrationMethod().fit(cal_probs, y[calib_idx], groups=groups[calib_idx])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/srv/home/bohanlyu/MLS-Bench/vendor/workspace/ml-subgroup-calibration-shift/google_gemini-3.1-pro-preview_20260414_110948/scikit-learn/custom_subgroup_calibration.py", line 92, in fit
res_global = optimize.minimize(obj_global, x0=np.
Undo complete: Restored: scikit-learn/custom_subgroup_calibration.py