Global Virtual Time (GVT) is used in distributed simulations to reclaim memory, commit output, detect termination, and handle errors. It is a global function that is computed many times during the course of a simulation. A small GVT latency (the delay between its occurrence and its detection) allows for more efficient use of resources. We present an algorithm that minimizes the latency, and we prove its correctness. The algorithm does not require messages to be acknowledged, which significantly reduces the message overhead of the simulation. One possible application is interactive simulators, where regular and timely updates would produce output that is up to date and appears smooth.
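
To make the role of GVT concrete, the sketch below illustrates the standard characterization of GVT as the minimum over all processes' local virtual times and the timestamps of messages still in transit; state older than this bound can safely be reclaimed. The class and function names are hypothetical illustrations, not the algorithm presented in this paper.

```python
# Illustrative sketch only: GVT as the minimum of all local virtual times (LVTs)
# and the timestamps of messages that have been sent but not yet received.
# LogicalProcess and gvt_estimate are hypothetical names for illustration.

from dataclasses import dataclass, field
from typing import List


@dataclass
class LogicalProcess:
    local_virtual_time: float                       # current LVT of this simulation process
    in_transit_timestamps: List[float] = field(default_factory=list)  # messages sent, not yet received


def gvt_estimate(processes: List[LogicalProcess]) -> float:
    """Lower bound on the timestamp of any future event: no process can roll back
    earlier than this value, so memory for older state may be reclaimed."""
    times = [p.local_virtual_time for p in processes]
    times += [t for p in processes for t in p.in_transit_timestamps]
    return min(times)


if __name__ == "__main__":
    lps = [LogicalProcess(12.0, [9.5]), LogicalProcess(15.0), LogicalProcess(10.0)]
    print(gvt_estimate(lps))  # 9.5 -- the bound is set by the message still in transit
```

The latency discussed in the abstract is the gap between the moment such a bound actually holds and the moment the processes learn of it; the shorter that gap, the sooner memory can be reclaimed and output committed.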