An investigation of camera movements and capture techniques on optical flow for real-time rendering and presentation

Document Type

Article

Publication Title

Journal of Real-Time Image Processing

Abstract

New and interesting uses for portable devices include the creation and viewing of 3D models and 360-degree photos of real landscapes. To provide a 3D model and a 360-degree view of a scene, these apps require real-time rendering and presentation. This study examines how movements in the camera view affect real-time image processing and the application of optical flow algorithms. Optical flow is affected by the motion of the objects involved, and the relative motion depends on how the image is recorded and on the camera movements. As a consequence, optical flow algorithms are sensitive to camera motion. To record an image, the camera may pan around, move along a straight path, or move randomly. We captured video datasets in different contexts to better replicate real-world scenarios. Each dataset was recorded under a variety of illumination conditions, camera movements, and indoor and outdoor recording sites. To determine the most effective optical flow algorithms for near real-time applications, such as augmented reality and virtual reality, we compare results based on quality and processing delay for each video frame. We also conducted a comparison study to better understand how random camera motion affects real-time video processing. These methods can be applied to a variety of real-world problems, such as object tracking, video segmentation, structure from motion, and gesture tracking.
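As a minimal sketch of the per-frame comparison described in the abstract, the snippet below times one representative dense optical flow estimator (OpenCV's Farneback method) on a captured video; the video filename, function name, and parameter values are illustrative assumptions, not taken from the study.

```python
import time
import cv2
import numpy as np

def benchmark_dense_flow(video_path):
    """Measure per-frame processing delay of Farneback dense optical flow."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"Cannot read {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    delays = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        t0 = time.perf_counter()
        # Dense flow field between consecutive frames (Farneback algorithm)
        cv2.calcOpticalFlowFarneback(
            prev_gray, gray, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        delays.append(time.perf_counter() - t0)
        prev_gray = gray
    cap.release()
    return np.mean(delays), np.std(delays)

if __name__ == "__main__":
    # Hypothetical clip name standing in for one of the captured datasets
    mean_s, std_s = benchmark_dense_flow("handheld_pan_indoor.mp4")
    print(f"Mean per-frame delay: {mean_s*1000:.1f} ms (std {std_s*1000:.1f} ms)")
```

Repeating the same loop with other estimators (for example, sparse Lucas-Kanade via cv2.calcOpticalFlowPyrLK) on clips recorded under different lighting and camera-motion conditions would yield the kind of quality-versus-delay comparison the abstract describes.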

DOI

10.1007/s11554-023-01322-7

Publication Date

8-1-2023

