arXiv:2601.20231v1 Announce Type: cross
Abstract: We study black-box optimization of Lipschitz functions under noisy evaluations. Existing adaptive discretization methods implicitly avoid suboptimal regions but do not provide explicit certificates of optimality or measurable progress guarantees. We introduce \textbf{Certificate-Guided Pruning (CGP)}, which maintains an explicit \emph{active set} $A_t$ of potentially optimal points via confidence-adjusted Lipschitz envelopes. Any point outside $A_t$ is certifiably suboptimal with high probability, and under a margin condition with near-optimality dimension $\alpha$, we prove that $\mathrm{Vol}(A_t)$ shrinks at a controlled rate, yielding sample complexity $\tilde{O}(\varepsilon^{-(2+\alpha)})$. We develop three extensions: CGP-Adaptive learns $L$ online with $O(\log T)$ overhead; CGP-TR scales to $d > 50$ via trust regions with local certificates; and CGP-Hybrid switches to GP refinement when local smoothness is detected. Experiments on 12 benchmarks ($d \in [2, 100]$) show CGP variants match or exceed strong baselines while providing principled stopping criteria via certificate volume.
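
The following is a minimal sketch of the pruning rule the abstract describes, not the paper's implementation: with a known Lipschitz constant $L$ (CGP-Adaptive would learn it online), a point can be certified suboptimal once its Lipschitz-envelope upper confidence bound falls below the best pessimistic value observed so far. The function name, the maximization convention, and the Hoeffding-style confidence radii are all assumptions for illustration.

```python
import numpy as np

def prune_active_set(candidates, queried_x, means, radii, L):
    """One illustrative CGP-style pruning pass (sketch, not the paper's code).

    candidates : (m, d) array, points in the current active set A_t
    queried_x  : (n, d) array, points already evaluated
    means      : (n,) empirical mean values at queried_x
    radii      : (n,) confidence radii (e.g., Hoeffding-style; assumed here)
    L          : Lipschitz constant (assumed known in this sketch)
    Returns the subset of candidates that remain potentially optimal.
    """
    # Certified lower bound on the optimum: best pessimistic value seen.
    best_lb = np.max(means - radii)
    # Confidence-adjusted Lipschitz envelope: an upper bound on f at each
    # candidate, tightened over all queried points via f(x) <= f(x_i) + L*d(x, x_i).
    dists = np.linalg.norm(candidates[:, None, :] - queried_x[None, :, :], axis=-1)
    ucb = np.min(means[None, :] + radii[None, :] + L * dists, axis=1)
    # Points whose envelope falls below the certificate are certifiably
    # suboptimal (with high probability) and are removed from A_t.
    return candidates[ucb >= best_lb]
```

Under this rule, the total envelope volume of the surviving set serves as the kind of principled stopping criterion the abstract refers to: once it is small enough, every remaining point is near-optimal with high probability.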
