arXiv:2604.25110v1 Announce Type: cross
Abstract: This position paper argues that knowledge distillation must account for what it loses: student models should be judged not only by retained task scores, but by whether they preserve the teacher capabilities that make those scores reliable. This matters because distillation is increasingly used to turn large, often frontier models into deployable systems, yet headline metrics can hide losses in uncertainty, boundary behavior, process reliability, on-policy stability, grounding, privacy, safety, and diversity. We identify the retention assumption behind current evaluation and reframe distillation as a lossy projection of teacher behavior rather than a faithful copy. We then synthesize existing evidence into a taxonomy of off-metric distillation losses, showing that these losses are concrete, recurring, and measurable. To make the position actionable, we propose scenario-specific preservation targets and a Distillation Loss Statement that reports what was preserved, what was lost, and why the remaining losses are acceptable. The goal is not lossless distillation, but accountable distillation.
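One of the off-metric losses the abstract names is uncertainty: a student can match the teacher's accuracy while becoming miscalibrated. A minimal sketch of how such a loss could be surfaced, using Expected Calibration Error (ECE) on synthetic predictions (the data, names, and the overconfident-student setup here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin predictions by confidence and average the
    per-bin gap between confidence and empirical accuracy."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    ece = 0.0
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece

# Synthetic binary task: the "student" matches the "teacher" on accuracy
# but reports near-certain confidence, so its headline metric hides a
# calibration loss.  All numbers below are illustrative.
rng = np.random.default_rng(0)
n = 2000
labels = rng.integers(0, 2, size=n)

def make_probs(p_correct, confidence):
    is_correct = rng.random(n) < p_correct
    chosen = np.where(is_correct, labels, 1 - labels)
    probs = np.empty((n, 2))
    probs[np.arange(n), chosen] = confidence
    probs[np.arange(n), 1 - chosen] = 1.0 - confidence
    return probs

teacher = make_probs(p_correct=0.80, confidence=0.80)  # well calibrated
student = make_probs(p_correct=0.80, confidence=0.99)  # same accuracy, overconfident

acc_t = (teacher.argmax(1) == labels).mean()
acc_s = (student.argmax(1) == labels).mean()
print(f"accuracy  teacher={acc_t:.3f}  student={acc_s:.3f}")
print(f"ECE       teacher={expected_calibration_error(teacher, labels):.3f}  "
      f"student={expected_calibration_error(student, labels):.3f}")
```

On this toy data the two models are indistinguishable on accuracy while the student's ECE is roughly an order of magnitude worse, which is the kind of measurement a Distillation Loss Statement as proposed above could report alongside task scores.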
On a Keller-Segel type equation to model Brain Microvascular Endothelial Cells growth’s patterns
arXiv:2604.25180v1 Announce Type: cross
Abstract: This article presents a partial differential equation (PDE) of Keller-Segel (KS) type that reproduces patterns commonly observed during the growth
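For context, the classical minimal Keller-Segel chemotaxis system, of which the article's model is presumably a variant (the exact form used in the paper is not given in this excerpt), reads:

$$
\partial_t u = \nabla \cdot \left( D \nabla u - \chi\, u \nabla v \right), \qquad
\partial_t v = \Delta v + u - v,
$$

where $u$ is the cell density, $v$ the chemoattractant concentration, $D$ the cell diffusivity, and $\chi$ the chemotactic sensitivity; the first equation couples diffusion with drift up the gradient of $v$, and the second describes diffusion, production by cells, and decay of the signal.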


