arXiv:1905.08565

Silent MST approximation for tiny memory

21 May 2019
Lélia Blin, Swan Dubois, Laurent Feuilloley
Abstract

In this paper we show that approximation can help reduce the space used for self-stabilization. In the classic \emph{state model}, where the nodes of a network communicate by reading the states of their neighbors, an important measure of efficiency is the space: the number of bits used at each node to encode its state. In this model, a classic requirement is that the algorithm be \emph{silent}, that is, after stabilization the states no longer change. We design a silent self-stabilizing algorithm for the minimum spanning tree problem that exhibits a trade-off between the quality of the solution and the space needed to compute it.
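To make the setting concrete, here is a minimal sketch of the state model and the silence condition described above. It assumes a synchronous scheduler and a toy local rule; the names (`run_until_silent`, `step`) and the example rule are hypothetical illustrations, not the paper's MST algorithm.

```python
# Sketch (not the paper's algorithm): a generic synchronous scheduler for the
# state model. Each node holds a local state, reads only its neighbors' states,
# and applies a local rule `step`. The configuration is "silent" once no node
# changes its state anymore.

def run_until_silent(graph, states, step, max_rounds=1000):
    """graph: dict node -> list of neighbors; states: dict node -> state;
    step(node, own_state, neighbor_states) -> new state."""
    for _ in range(max_rounds):
        new_states = {
            v: step(v, states[v], [states[u] for u in graph[v]])
            for v in graph
        }
        if new_states == states:   # no rule enabled: a silent configuration
            return states
        states = new_states
    raise RuntimeError("did not stabilize within max_rounds")


# Toy rule just to exercise the scheduler (unrelated to MST): every node
# converges to the smallest identifier it has seen.
if __name__ == "__main__":
    g = {1: [2], 2: [1, 3], 3: [2]}
    init = {1: 1, 2: 2, 3: 3}
    print(run_until_silent(g, init, lambda v, s, ns: min([s] + ns)))
```

The space measure discussed in the abstract corresponds to the number of bits needed to encode each node's entry in `states`; the trade-off studied in the paper is between that per-node space and how close the computed spanning tree is to a minimum one.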
