
EVINCE: Optimizing Multi-LLM Dialogues Using Conditional Statistics and Information Theory

Edward Y. Chang
Main: 8 pages · Bibliography: 2 pages · Appendix: 14 pages · 10 figures · 10 tables
Abstract

EVINCE (Entropy and Variation IN Conditional Exchanges) is a novel framework for optimizing multi-LLM dialogues using conditional statistics and information theory. It addresses limitations in multi-agent system (MAS) debate frameworks, where multiple LLMs "chat" without behavior modulation or assessment of mutual information quality. Using dual entropy optimization to balance perspective diversity and prior knowledge, EVINCE provides quantitative tools to dynamically regulate LLM linguistic behaviors. When mutual information is low and both cross-entropy and Wasserstein distance are high, EVINCE promotes contentious dialogues to expose diverse perspectives and uncover inconsistencies. Conversely, as cross-entropy decreases and mutual information stabilizes, it transitions discussions into a conciliatory phase, encouraging compromise and acknowledgment of valid points. By applying information-theoretic metrics and optimizing mutual information, EVINCE emerges as a structured and highly effective framework for multi-LLM collaboration.
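The phase-switching rule described in the abstract can be illustrated with toy discrete distributions over a shared label set. The metric functions below (entropy, cross-entropy, 1-D Wasserstein distance, mutual information) are standard definitions; the `dialogue_phase` thresholds and decision rule are illustrative assumptions for exposition, not the paper's actual algorithm.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log2(p[mask]))

def cross_entropy(p, q):
    """Cross-entropy H(p, q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log2(q[mask]))

def wasserstein_1d(p, q):
    """1-D Wasserstein distance for distributions on a unit-spaced support:
    the L1 distance between the two CDFs."""
    return float(np.sum(np.abs(np.cumsum(np.asarray(p) - np.asarray(q)))))

def mutual_information(joint):
    """I(X; Y) in bits from a joint distribution over two agents' labels."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    indep = px * py
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / indep[mask])))

def dialogue_phase(p, q, joint, mi_lo=0.1, ce_hi=1.5, wd_hi=0.5):
    """Toy phase rule (thresholds are assumptions, not from the paper):
    low mutual information plus high cross-entropy and Wasserstein
    distance -> keep the debate contentious; otherwise reconcile."""
    mi = mutual_information(joint)
    ce = cross_entropy(p, q)
    wd = wasserstein_1d(p, q)
    if mi < mi_lo and ce > ce_hi and wd > wd_hi:
        return "contentious"
    return "conciliatory"
```

For example, two agents whose label distributions diverge sharply (e.g. `p = [0.9, 0.1, 0.0]` vs. `q = [0.05, 0.05, 0.9]`) with a near-independent joint would stay in the contentious phase, while matching distributions with a strongly correlated joint would trigger the conciliatory phase.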
