RIFE Overview: Real-time Intermediate Flow Estimation Algorithm
RIFE, or Real-time Intermediate Flow Estimation, is an intermediate flow estimation algorithm used in video frame interpolation. Its goal is to estimate intermediate frames between two input frames faster and more accurately than prior flow-based methods.
Background
Recent flow-based video frame interpolation methods estimate bi-directional optical flows and use them to approximate intermediate flows. However, this approximation can produce artifacts on motion boundaries. RIFE instead uses a neural network called IFNet to estimate the intermediate flows directly, which is both faster and more accurate.
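To make the contrast concrete, the approximation step that RIFE avoids can be sketched as follows. This is a minimal illustration of the common linear-motion assumption used by bidirectional methods (the function name and flow conventions here are illustrative, not from the RIFE codebase): the flow from time $t$ back to each input frame is taken as a scaled copy of the estimated frame-to-frame flow.

```python
import numpy as np

def approx_intermediate_flows(f01, f10, t=0.5):
    """Linear-motion approximation used by many bidirectional methods.

    f01: optical flow from frame 0 to frame 1, shape (H, W, 2).
    f10: optical flow from frame 1 to frame 0, shape (H, W, 2).
    Returns approximations of F_{t->0} and F_{t->1}. The approximation
    breaks down near motion boundaries, which is where artifacts appear.
    """
    f_t0 = t * f10          # flow from time t back to frame 0
    f_t1 = (1.0 - t) * f01  # flow from time t back to frame 1
    return f_t0, f_t1
```

IFNet skips this step entirely and regresses $F_{t \rightarrow 0}$ and $F_{t \rightarrow 1}$ directly from the two input frames, so no linear-motion assumption is needed.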
How RIFE Works
In RIFE, two input frames, $I_{0}$ and $I_{1}$, are fed directly into IFNet to approximate the intermediate flows ($F_{t \rightarrow 0}$ and $F_{t \rightarrow 1}$) and a fusion map ($M$). During training, a privileged teacher that also has access to the ground-truth intermediate frame $I_{t}$ refines the student's results to produce $F_{t \rightarrow 0}^{Tea}$, $F_{t \rightarrow 1}^{Tea}$, and $M^{Tea}$.
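Once the flows and fusion map are available, the intermediate frame is synthesized by backward-warping each input frame and blending the results with $M$. The sketch below uses a simple nearest-neighbour warp in numpy for clarity (the real model uses differentiable bilinear warping); the function names are illustrative.

```python
import numpy as np

def backward_warp(img, flow):
    """Nearest-neighbour backward warp: sample img at (x, y) + flow.

    img:  (H, W) or (H, W, C) image.
    flow: (H, W, 2) flow field, flow[..., 0] = dx, flow[..., 1] = dy.
    """
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    return img[src_y, src_x]

def synthesize(i0, i1, f_t0, f_t1, m):
    """Blend the two warped frames with the fusion map M in [0, 1]:
    I_t ~= M * warp(I_0, F_{t->0}) + (1 - M) * warp(I_1, F_{t->1})."""
    return m * backward_warp(i0, f_t0) + (1.0 - m) * backward_warp(i1, f_t1)
```

The fusion map lets the model prefer whichever source frame is better visible at each pixel, e.g. around occlusions.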
The student and teacher models are jointly trained from scratch using a reconstruction loss. Because the teacher's approximations are more accurate, they guide the student to learn better intermediate flows.
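The joint objective can be sketched as a reconstruction loss on both models' outputs plus a distillation term that pulls the student's flows toward the teacher's. This is a simplified illustration: the function name, the use of an L1 penalty, and the weight `lam` are assumptions, and in practice gradients would not flow through the teacher's flow in the distillation term.

```python
import numpy as np

def rife_training_loss(pred_student, pred_teacher, gt,
                       flow_student, flow_teacher, lam=0.01):
    """Sketch of a joint student/teacher objective (illustrative, not the
    exact RIFE loss): reconstruct the ground-truth frame with both models,
    and distill the teacher's more accurate flows into the student."""
    l_rec_student = np.abs(pred_student - gt).mean()   # student reconstruction
    l_rec_teacher = np.abs(pred_teacher - gt).mean()   # teacher reconstruction
    l_distill = np.abs(flow_student - flow_teacher).mean()  # flow distillation
    return l_rec_student + l_rec_teacher + lam * l_distill
```

Because the teacher sees the ground-truth frame $I_{t}$, its flows are a stronger supervision signal for the student than the reconstruction loss alone.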
Benefits of RIFE
RIFE has several benefits over other video frame interpolation methods. Directly estimating intermediate flows with IFNet avoids the motion-boundary artifacts that arise when bi-directional optical flows are estimated first and then approximated. RIFE is also considerably faster and more accurate than other flow-based methods.
Another advantage of RIFE is its privileged distillation scheme for training the intermediate flow model: the teacher's access to the ground-truth frame provides direct supervision for the flows, which leads to a large performance improvement.
In summary, RIFE is an intermediate flow estimation algorithm for video frame interpolation that uses a neural network, IFNet, to estimate intermediate flows directly, together with a privileged distillation scheme for training. It is faster and more accurate than other flow-based methods and avoids artifacts on motion boundaries.