
HYPERVISION LTD

Enables enjoyment of VR:
- Sunglasses-sized device
- Complete human Field of View
Empowering XR/Reality:

- Eye & facial-expression tracking

- Foveated Pass-through 240°/360°

Hypervision is a deep-tech startup developing optical, software & hardware IP that empowers Virtual & Extended Reality.

The main challenge in the VR industry these days is making the VR headset comfortable for every person. Therefore, we are developing low-weight, sunglasses-size VR/XR optics with image sharpness and perceived immersion equivalent to looking at a real scene with complete situational awareness. The compactness comes from our patented pancake lenses, and the 240° horizontal field of view from our patented VRDom architecture.

Reference design for HyperOcular 140 (HO140) pancake lens 

- Edge-to-edge visual clarity, 18 ppd

- FoV: 112°x95° (diagonal 13)

- Supports up to 40 ppd (see the pixel-budget sketch below)

Reference design for VRDom architecture 

Central & peripheral HO140-based visual modules

- FoV: 240°x95°

- SDK
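
As a rough illustration of what these angular-resolution figures imply, the sketch below converts field of view and pixels-per-degree into the horizontal pixel count a visual module must deliver. The FoV and ppd numbers come from the reference designs above; the simple FoV × ppd product is our own back-of-the-envelope assumption and ignores lens distortion and non-uniform angular sampling.

```python
# Back-of-the-envelope pixel budget: horizontal pixels ~= FoV(deg) * ppd.
# The linear mapping is an assumption; real pancake lenses map display
# pixels to view angles non-uniformly (distortion, pupil swim).

def horizontal_pixels(fov_deg: float, ppd: float) -> int:
    """Approximate horizontal pixel count needed for a given FoV and ppd."""
    return round(fov_deg * ppd)

if __name__ == "__main__":
    # HO140 reference design: 112 deg horizontal, 18 ppd (up to 40 ppd).
    print("HO140 @ 18 ppd:", horizontal_pixels(112, 18), "px")  # ~2016 px
    print("HO140 @ 40 ppd:", horizontal_pixels(112, 40), "px")  # ~4480 px
    # VRDom central + peripheral modules: 240 deg horizontal at 18 ppd.
    print("VRDom @ 18 ppd:", horizontal_pixels(240, 18), "px")  # ~4320 px
```

Whatever the real optical mapping, the take-away is that pushing toward 40 ppd or a 240° field quickly multiplies the pixel count the display pipeline must supply.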

The VRDom (VR240.GEN2) architecture enables VR use cases with peripheral visual stimulation and/or awareness that were previously possible only in dome-projection rooms available to a very limited audience. The following new use-case categories become affordable for everyone: (1) driving/flight simulators; (2) remote operation/supervision of autonomous vehicles and robots; (3) crowd/immersive enhanced experience, study & decision making (for social and commercial occasions, e.g. retail, real estate, traveling); (4) visual aid through augmented intensification of visual stimuli for people with diabetic retinopathy, glaucoma and neurological diseases.

Enabling peripheral vision in VR also drives demand for advanced XR, in which AR pass-through imaging should expand from the standard ~100° of reality toward 240° (supported by VRDom) or even 360° (to alert the user to events happening outside human visual perception). Expanding the AR pass-through FoV demands very high pixel-stream bandwidth; gaze tracking enables a new type of pass-through bandwidth regulation called "foveated imaging". Gaze tracking together with facial-muscle tracking allows an avatar to reproduce real facial expressions and emotions and enables "Real Reality". Below left, click to learn more about VRDom-enabled use cases; below right, click to explore how the XR/RR 240.GEN2 headset can be configured to your application demands (including pre-ordering the non-consumer version of the VR240.GEN2 glasses).
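
To give a feel for why wide pass-through calls for foveated regulation, here is a back-of-the-envelope comparison of uniform versus gaze-contingent sampling for a 240° × 95° pass-through stream. The two-zone model (full resolution inside a foveal window around the tracked gaze, reduced resolution elsewhere), the window size, peripheral ppd, frame rate and bit depth are all illustrative assumptions, not Hypervision specifications.

```python
# Illustrative two-zone foveated pass-through bandwidth estimate.
# Every parameter below is an assumption chosen for the example.

FOV_H, FOV_V = 240.0, 95.0      # pass-through field of view, degrees
FOVEA_H, FOVEA_V = 30.0, 30.0   # gaze-centered full-resolution window, degrees
PPD_FULL, PPD_PERIPH = 18.0, 4.0
FPS, BITS_PER_PIXEL = 72, 24

def pixels(fov_h: float, fov_v: float, ppd: float) -> float:
    """Pixel count for a rectangular angular region at uniform ppd."""
    return (fov_h * ppd) * (fov_v * ppd)

uniform = pixels(FOV_H, FOV_V, PPD_FULL)
foveated = (pixels(FOVEA_H, FOVEA_V, PPD_FULL)
            + pixels(FOV_H, FOV_V, PPD_PERIPH)
            - pixels(FOVEA_H, FOVEA_V, PPD_PERIPH))

to_gbps = lambda px: px * FPS * BITS_PER_PIXEL / 1e9
print(f"uniform 240x95 stream : {to_gbps(uniform):6.1f} Gbit/s")
print(f"foveated (gaze-driven): {to_gbps(foveated):6.1f} Gbit/s")
print(f"bandwidth reduction   : {uniform / foveated:4.1f}x")
```

Under these assumed numbers the gaze-driven stream is roughly an order of magnitude lighter than the uniform one, which is the motivation for coupling gaze tracking to pass-through bandwidth regulation.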

Peripheral-vision stimulation makes possible new VR applications demanding situational awareness

Future headsets based on the VRDom architecture will range from basic VR with 3DoF to the most advanced RR with 6DoF

The new solutions bring new challenges and opportunities. Pancake optics challenge the VR display industry, since the pancake approach wastes about 75% of the display's light energy compared with a classical VR lens, creating power-consumption/heating and brightness trade-offs. Hypervision participates in the OPTIMAL EU consortium developing advanced laser-based micro-fabrication and is responsible for the VR use case: improving display light efficiency with free-form micro-optical arrays that concentrate and direct each display pixel's energy into the user's eyebox. The same approach is useful for any personal display, from smartphone to PC monitor, providing privacy along with more brightness and less power waste. Click right for more information.
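
The ~75% figure quoted above translates directly into a brightness/power penalty; the short sketch below works through that arithmetic. The target luminance at the eye and the idealized classical-lens baseline are placeholder assumptions used only to show the ratio, not measured values.

```python
# Why pancake light loss hurts: a simple relative efficiency/power comparison.
# Target luminance and the idealized baseline are placeholder assumptions.

PANCAKE_EFFICIENCY = 0.25   # ~25% of display light reaches the eye (75% wasted)
CLASSIC_EFFICIENCY = 1.00   # classical VR lens taken as the relative baseline

TARGET_NITS = 100.0         # desired brightness at the eye (assumed)

for name, eff in [("classic lens", CLASSIC_EFFICIENCY),
                  ("pancake lens", PANCAKE_EFFICIENCY)]:
    panel_nits = TARGET_NITS / eff
    print(f"{name}: panel must emit ~{panel_nits:.0f} nits "
          f"for {TARGET_NITS:.0f} nits at the eye "
          f"({1 / eff:.0f}x drive level -> more power draw and heat)")
```

Raising the fraction of each pixel's light that actually lands in the eyebox, which is what the free-form micro-optical arrays target, shrinks that drive-level multiplier directly.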

OPTIMAL EU Laser microfabrication consortium:

1. Boosting light efficiency for VR/AR displays

2. Privacy solution for personal monitors

Proprietary Motorized Camera Testbed (MCT) emulates human eye physiology for VR/XR development & demo 

The wide-FoV VRDom optical architecture (see the app note for details) demanded a special software layer that stitches the central and peripheral visual modules. The stitching should be "seamless" across a wide eyebox, as a function of IPD and eye-relief (ER) deviation and eye rotations. To debug and demonstrate the VRDom architecture and software, we developed our own Motorized Camera Testbed (MCT) with multiple camera modules (fish-eye and zoomed, with pupil-aperture regulation) and Z, X and rotational motorized movements to emulate IPD, ER and gaze. The MCT can also be adapted to any VR/XR headset to characterize visual fidelity, FoV and ghost-to-signal ratio (G2S) for pancake lenses, and can serve as a platform for comparative analysis of different VR headsets. Click left for more.
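
As a purely illustrative sketch of what "seamless" stitching can mean in software, the snippet below cross-fades central and peripheral module images over an assumed angular overlap band whose position is shifted with the tracked gaze, so the seam stays away from the fovea. The overlap bounds, the linear ramp, the gaze-shift model and all function names are hypothetical and are not the VRDom SDK.

```python
import numpy as np

# Hypothetical central/peripheral stitching: a linear cross-fade over an
# angular overlap band, shifted by the tracked gaze angle. Overlap bounds
# and the gaze-shift model are assumptions, not the actual VRDom SDK.

OVERLAP_START_DEG = 45.0   # assumed start of central/peripheral overlap
OVERLAP_END_DEG = 56.0     # assumed end of overlap (edge of central module)

def blend_weight(view_angle_deg: np.ndarray, gaze_deg: float) -> np.ndarray:
    """Weight of the peripheral module (0 = central only, 1 = peripheral only)."""
    start = OVERLAP_START_DEG + gaze_deg
    end = OVERLAP_END_DEG + gaze_deg
    return np.clip((view_angle_deg - start) / (end - start), 0.0, 1.0)

def stitch(central: np.ndarray, peripheral: np.ndarray,
           angles_deg: np.ndarray, gaze_deg: float) -> np.ndarray:
    """Cross-fade two aligned H x W x 3 images column-wise by view angle."""
    w = blend_weight(angles_deg, gaze_deg)[None, :, None]  # broadcast over rows/RGB
    return (1.0 - w) * central + w * peripheral

if __name__ == "__main__":
    h, w = 4, 8
    angles = np.linspace(30.0, 70.0, w)            # view angle of each column
    central = np.ones((h, w, 3)); peripheral = np.zeros((h, w, 3))
    print(stitch(central, peripheral, angles, gaze_deg=5.0)[0, :, 0])
```

In practice the seam position also depends on IPD and eye relief, which is exactly what the MCT's motorized movements are built to sweep while validating the stitching.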
