Infinite balanced allocation via finite capacities
Conference contribution posted on 2021-08-03, 10:58. Authored by Petra Berenbrink, Tom Friedetzky, Christopher Hahn, Lukas Hintze, Dominik Kaaser, Peter Kling, Lars Nagel.
We analyze the following infinite load balancing process, modeled as a classical balls-into-bins game: there are n bins (servers), each with a limited buffer capacity c = c(n) ∈ ℕ. Given a fixed arrival rate λ = λ(n) ∈ (0,1), in every round λn new balls (requests) are generated. Together with possible leftovers from previous rounds, these balls compete to be allocated to the bins. To this end, every ball samples a bin independently and uniformly at random and tries to allocate itself to that bin. Each bin accepts as many balls as possible until its buffer is full, preferring balls of higher age. At the end of the round, every bin deletes the ball it allocated first.

We study how the buffer size c affects the performance of this process. To this end, we analyze both the number of balls competing in each round (including the leftovers from previous rounds) and the worst-case waiting time of individual balls. We show that (i) at any (even exponentially large) time, the number of competing balls is with high probability bounded by 4⋅c⁻¹⋅ln(1/(1−λ))⋅n + O(c⋅n), and that (ii) the waiting time of a given ball is with high probability at most (4⋅ln(1/(1−λ)))/(c⋅(1−1/e)) + log log n + O(c). These results indicate a sweet spot for the choice of c around c = Θ(√(log(1/(1−λ)))). Compared to a related process with infinite capacity [Berenbrink et al., PODC'16], for constant λ the waiting time is reduced from O(log n) to O(log log n). Even for large λ ≈ 1 − 1/n we reduce the waiting time from O(log n) to O(√(log n)).
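The round-based process described above can be sketched as a short simulation. This is a minimal illustration, not the authors' analysis code; the function name `simulate` and all parameters are hypothetical, and per-bin age preference is approximated by allocating globally in age order, which yields the same per-bin outcome since each ball samples exactly one bin per round.

```python
import random
from collections import deque

def simulate(n, c, lam, rounds, seed=0):
    """Sketch of the finite-capacity balls-into-bins process.

    Each round, int(lam * n) new balls arrive; together with leftovers
    from previous rounds, each ball samples a bin uniformly at random.
    A bin accepts balls while its buffer (size c) has room, preferring
    older balls; at the end of the round every non-empty bin deletes
    the ball it allocated first (its oldest buffered ball).
    Returns the number of competing balls observed in each round.
    """
    rng = random.Random(seed)
    bins = [deque() for _ in range(n)]   # FIFO buffers, each of length <= c
    leftovers = []                        # birth rounds of unallocated balls
    history = []
    for t in range(rounds):
        balls = leftovers + [t] * int(lam * n)
        history.append(len(balls))        # balls competing this round
        balls.sort()                      # older balls (smaller birth round) first
        leftovers = []
        for birth in balls:
            b = rng.randrange(n)          # sample a bin uniformly at random
            if len(bins[b]) < c:
                bins[b].append(birth)     # accepted into the buffer
            else:
                leftovers.append(birth)   # competes again next round
        for q in bins:                    # each bin serves its oldest ball
            if q:
                q.popleft()
    return history
```

For example, `simulate(100, 3, 0.5, 200)` tracks the competing-ball count over 200 rounds with n = 100 bins, buffer size c = 3, and arrival rate λ = 0.5; in the first round exactly 50 balls compete, since there are no leftovers yet.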