How To Make an Oculus Rift Control a Panning Camera
Head Tracking for Wireless 3D First Person Vision
A remote controlled vehicle with stereo cameras on a head-tracked pan-tilt unit and wireless 3D first person vision via an Oculus Rift.
Story
What is it about?
In remote vehicle operation, video feedback provides the operator with real-time spatial and environmental data. Stereoscopic video feedback can improve upon standard single camera systems by providing the operator with details about scale, depth and motion in a rapid and intuitive form. This supplementary information is particularly important when operating in unfamiliar or hazardous environments.
Why did I make it?
This project was for a university mechatronic engineering assignment.
The scope of the design was to test the effects on vehicle control when using a 3D visual telepresence system to navigate an environment. An effective design would deliver heightened interactivity, control, navigation and spatial awareness to the driver, whilst minimizing the negative effects of motion sickness.
How does it work?
A 3D first person vision system has been developed, which captures stereoscopic perspectives with a pair of small cameras and displays them to a user's respective eyes via a head mounted display. The head mounted display simultaneously captures positional and rotational values of the user's head, using this data to manipulate a pan-tilt unit on which the cameras are mounted. The result is delivery of a minimal latency 3D perspective of the remote controlled vehicle's surroundings, where the field of view can be mechanically adjusted by rotating the head in the desired direction (see first attached video).
There are three main modules that make up the project:
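One way to picture the head-tracker-to-PTU link is as a tiny serial packet carrying the head's pan and tilt angles. The packet layout below (sync byte, pan, tilt, checksum) is a hypothetical sketch, not the project's actual protocol, but it illustrates the kind of framing such a link needs so the receiver can resynchronize after dropped bytes:

```cpp
#include <algorithm>
#include <array>
#include <cassert>
#include <cstdint>

// Hypothetical 4-byte packet: sync byte, pan angle, tilt angle, checksum.
// Angles are clamped to 0-180 degrees to suit typical hobby-servo ranges.
std::array<uint8_t, 4> encodeHeadPose(int panDeg, int tiltDeg) {
    uint8_t pan  = static_cast<uint8_t>(std::clamp(panDeg,  0, 180));
    uint8_t tilt = static_cast<uint8_t>(std::clamp(tiltDeg, 0, 180));
    // Simple additive checksum lets the receiver reject corrupted frames.
    uint8_t checksum = static_cast<uint8_t>((pan + tilt) & 0xFF);
    return {0xFF, pan, tilt, checksum};  // 0xFF marks the packet start
}
```

On the receiving side, the microcontroller would wait for the 0xFF sync byte, read three more bytes, and verify the checksum before commanding the servos.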
- Video Capture and Communication (blue)
- Headtracker Data Communication (green)
- Pan-Tilt unit (PTU) with Controller (yellow)
Breakdown of the three primary modules - by colour
An Arduino Uno served as a remote (vehicle mounted) platform for manipulating the PTU, via two digital servos embedded within a 3D printed enclosure. At a minimum the design required two Pulse Width Modulated signal outputs, an RS232 UART/USART Serial Controller, and spare I/O pins for additional sensors/actuators (such as dynamic stability). The Arduino was selected over a conventional microcontroller on the basis of:
- High level language with pre-compiled libraries,
- On-board power regulator, bootloader, etc.
- Ease of interface to external devices, such as XBee wireless serial devices
- Processing speed and memory management not critical for minimizing latency, given the already substantial video latency.
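The core of the PTU controller is just a linear mapping from a head angle to a servo pulse width. The sketch below shows that mapping as plain C++; the 1000-2000 microsecond endpoints are an assumption (a common digital-servo range) and would in practice be calibrated to the PTU's mechanical limits before being fed to something like the Arduino Servo library's writeMicroseconds():

```cpp
#include <cassert>

// Linearly map a servo angle (0-180 deg) to a pulse width in microseconds.
// 1000-2000 us is a common digital-servo range; the exact endpoints are an
// assumption and depend on the servos and PTU geometry used.
int angleToPulseUs(int angleDeg) {
    if (angleDeg < 0)   angleDeg = 0;    // clamp to the servo's travel
    if (angleDeg > 180) angleDeg = 180;
    return 1000 + (angleDeg * 1000) / 180;
}
```

Integer arithmetic keeps this cheap enough to run every control cycle on an 8-bit microcontroller.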
Results
Various hypotheses related to motion sickness, control, cognitive memory retention and depth perception were tested with the prototype design, but focusing on hardware performance, the Arduino Uno was a simple and reliable platform.
The main target criteria for a successful design were anticipated as:
- A wide enough field of view (>100 degrees)
- Adequate video resolution
- Accurate and responsive tracking
- A high refresh rate
- Low latency
From these, the chief issue observed was minimizing video latency, that is, the time from moving your head to receiving the corresponding image at your eyes. The cameras, wireless video communication and display methods created a latency of up to 133 milliseconds when stationary, with an additional 26 ms generated via the mechanical response of the head-tracking, serial comms and mechanical adjustment of the PTU. This 26 ms of data processing/signal generation is insignificant when compared to that of the video latency.
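Restating those reported figures (133 ms video path, 26 ms tracking and PTU response) as a simple motion-to-photon budget makes the imbalance explicit, since the video path accounts for roughly 84% of the total:

```cpp
#include <cassert>

// Motion-to-photon latency budget using the figures reported above.
struct LatencyBudget {
    int videoMs;     // cameras + wireless video link + display
    int trackingMs;  // head tracking, serial comms, PTU response
    int totalMs() const { return videoMs + trackingMs; }
    // Integer percentage of the total spent in the video path (truncated).
    int videoSharePct() const { return (videoMs * 100) / totalMs(); }
};
```

This is why the write-up concludes that optimizing the Arduino side further would yield little: the video chain dominates the budget.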
In replicating this project, there are many issues regarding video equipment I would reconsider to increase image quality whilst minimizing latency. However, regarding the elements relating to the Arduino, I would consider a smaller packaged board (such as the Arduino Pro Mini) and faster operating servo motors, as the only limitation experienced was in the rate of PTU adjustment, not in processing speed or I/O capacity.
Thank you, Arduino, for meeting my needs in what would otherwise be a daunting project.
Credits
Source: https://www.hackster.io/twhi2525/head-tracking-for-wireless-3d-first-person-vision-71bc33