EVINCE: Optimizing Adversarial LLM Dialogues via Conditional Statistics and Information Theory

Edward Y. Chang
Main: 8 pages · Appendix: 14 pages · Bibliography: 2 pages · 10 figures · 10 tables
Abstract

This paper introduces EVINCE (Entropy and Variation IN Conditional Exchanges), a framework that optimizes multi-LLM dialogues using conditional statistics and information theory. EVINCE employs dual entropy optimization to balance perspective diversity against prior knowledge, providing quantitative measures for modulating LLM interactions. Through information-theoretic metrics and mutual information optimization, the framework demonstrates consistent improvement over single-LLM performance in applications ranging from disease diagnosis to news debiasing. We present theoretical foundations and empirical validation for this structured approach to LLM collaboration.
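The information-theoretic quantities named in the abstract can be illustrated with a minimal sketch. The code below computes Shannon entropy for each agent's predictive distribution and mutual information from a joint distribution; the example distributions, variable names, and the diagnosis scenario are illustrative assumptions, not values or APIs from the paper.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """Mutual information (bits) from a joint distribution over two
    agents' predictions, given as a list of rows summing to 1."""
    px = [sum(row) for row in joint]                # marginal of agent A
    py = [sum(col) for col in zip(*joint)]          # marginal of agent B
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Hypothetical predictive distributions from two debating LLMs over
# three candidate diagnoses (values invented for illustration).
p_a = [0.7, 0.2, 0.1]    # confident agent: low entropy
p_b = [0.4, 0.35, 0.25]  # exploratory agent: high entropy

print(round(entropy(p_a), 3))  # → 1.157
print(round(entropy(p_b), 3))  # → 1.559
```

In this reading, "dual entropy optimization" would steer the dialogue between the low-entropy (exploitative) and high-entropy (exploratory) regimes, while mutual information measures how much the agents' outputs inform each other.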
