Technology & Research
The first technology developed by Hypervision was the VRDom architecture, which stitches central and peripheral visual modules for each eye. VRDom was initiated in response to a market request to overcome the roughly 90° horizontal FoV limitation of consumer VR headsets by extending it to 270°, the horizontal field of view humans are accustomed to. After receiving feedback that "horizontal FoV extension is great, but what about vertical FoV?", we developed our GEN1 Hybrid Aspheric Fresnel lens, whose concave, dual-sided Fresnel section completely fills the human vertical FoV (up to 170°). The GEN1-based VRDom was exhibited at AWEXR in November 2021 and received positive feedback from VR developers, while feedback from VR headset companies was that a compact, sunglasses-sized VR headset form factor is expected. Therefore, during 2022 we developed a GEN2 pancake lens that is also compatible with the VRDom technology, and since Q2 2023 we have been introducing the GEN2 technology to the market.
While GEN2 answers current VR market demands, in order to have solutions ready for future demands we are conducting R&D to resolve additional optical challenges:
Freeform micro-optics to boost the light efficiency of VR displays and to improve the privacy and energy efficiency of personal monitors. Beyond the R&D itself, there is a prototyping and manufacturing challenge, which will be addressed through the EU OPTIMAL consortium, which is developing advanced laser manufacturing know-how for micro-structures.
GEN3 holographic optics to make the VR display-lens module almost flat.
To further empower the GEN2-based, sunglasses-form-factor, full-human-FoV VR, there is demand for AR pass-through imaging. The AR pass-through FoV should be expanded from the standard 100° of reality toward 240° (supported by VRDom), or even to 360° (to alert the user to events happening outside human visual perception). Therefore, we have started R&D on full-human-FoV AR pass-through imaging with visual fidelity equivalent to human super-vision (40/20 on the visual acuity chart).
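To get a feel for what these targets imply, a back-of-the-envelope calculation is useful. The sketch below assumes super-vision resolves about 0.5 arcminute per pixel (twice the 1-arcminute benchmark of normal 20/20 vision); the figures are illustrative estimates, not Hypervision specifications.

```python
# Pixels needed to match "super-vision" acuity across a wide
# pass-through FoV. Assumption: ~0.5 arcmin per pixel, i.e. twice
# the 1-arcmin angular resolution of normal 20/20 vision.

ARCMIN_PER_DEG = 60

def pixels_for_fov(fov_deg: float, arcmin_per_pixel: float = 0.5) -> int:
    """Horizontal pixel count to cover fov_deg at the given angular resolution."""
    return round(fov_deg * ARCMIN_PER_DEG / arcmin_per_pixel)

for fov in (100, 240, 360):
    print(f"{fov:3d}° FoV -> {pixels_for_fov(fov):,} px across")
```

Under this assumption, going from 100° to 240° of pass-through more than doubles the horizontal pixel count required, which is what drives the bandwidth problem discussed next.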
Expanding the AR pass-through FoV demands very high pixel-stream bandwidth; here, gaze tracking helps define a new type of AR pass-through bandwidth regulation called "foveated imaging". Gaze tracking combined with facial muscle tracking also allows an avatar to reproduce real facial expressions and emotions, enabling "Real Reality", and we are developing gaze and mimic tracking based on micro-cameras and image-processing algorithms.
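As a minimal sketch of how foveated bandwidth regulation could work (an illustrative assumption, not Hypervision's actual pipeline), each image tile can be downsampled according to its angular distance from the gaze point, concentrating pixel bandwidth where the eye is actually looking. The tile size, eccentricity thresholds, and scale factors below are hypothetical.

```python
# Foveated imaging sketch: tile resolution falls off with eccentricity
# (angular distance from the tracked gaze direction, in degrees).
# Thresholds and scales are illustrative assumptions.

def tile_scale(tile_ecc_deg: float) -> float:
    """Return a linear downsampling factor (1.0 = full resolution)."""
    if tile_ecc_deg <= 5:      # fovea: keep full resolution
        return 1.0
    elif tile_ecc_deg <= 30:   # mid-periphery: half resolution per axis
        return 0.5
    else:                      # far periphery: quarter resolution per axis
        return 0.25

def frame_bits(tiles_ecc, full_res_px_per_tile=128 * 128, bits_per_px=24):
    """Total bits per frame after foveated downsampling of all tiles."""
    return sum(full_res_px_per_tile * tile_scale(e) ** 2 * bits_per_px
               for e in tiles_ecc)
```

With this falloff, a far-periphery tile carries only 1/16 of the pixels of a foveal tile, so most of a wide-FoV frame can be transmitted at a small fraction of the uniform-resolution bandwidth.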
The VRDom architecture works with 4x MIPI DSI displays, while XR and RR imaging work with more than 10x high-resolution, high-frame-rate MIPI CSI cameras, and AR pass-through demands near-"zero" latency to transfer the pass-through video to the displays. Since no standard SoC provides the required number of interfaces, we are developing the XRR270 co-processor in FPGA, which could also be converted into an ASIC. The XRR270 co-processor will work alongside VR SoCs and will also have a PC Thunderbolt connection.
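A rough aggregate-bandwidth estimate shows why a dedicated co-processor is needed: more than ten high-resolution, high-frame-rate cameras quickly exceed what a single standard SoC can ingest. The sensor parameters below are illustrative assumptions, not XRR270 specifications.

```python
# Raw pixel-stream rate for an array of MIPI-CSI cameras.
# Sensor resolution, frame rate, and bit depth are hypothetical
# examples, not actual XRR270 camera specs.

def camera_gbps(width: int, height: int, fps: int, bits_per_px: int = 10) -> float:
    """Raw pixel-stream rate of one camera in Gbit/s."""
    return width * height * fps * bits_per_px / 1e9

n_cameras = 10
per_cam = camera_gbps(2048, 1536, 90)   # hypothetical 3 MP @ 90 fps sensor
total = n_cameras * per_cam
print(f"{per_cam:.2f} Gbit/s per camera, {total:.1f} Gbit/s aggregate")
```

Even at these modest example settings the aggregate raw stream reaches tens of Gbit/s before display output is counted, which motivates concentrating the interfaces and routing in an FPGA (and eventually an ASIC) rather than a general-purpose SoC.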