Virtalis Unveils Visionary Render V1.2

Dec 23, 2015 -- Virtalis, one of the world’s leading Virtual Reality (VR) companies, is previewing Version 1.2 of its Visionary Render software.

Technical Director, Andrew Connell, said: “I am exceptionally proud of what the Virtalis R&D Team have managed to pack into Version 1.2, which has been completed only a few months after the well-received Version 1.1. We made improvements right across VisRen’s usability and functionality, from increasing the number and diversity of controllers to adding gesture recognition and post-processing effects. In further developing our ground-breaking VR software, which now has enhanced support, Virtalis is grateful to its beta testers: Raytheon, AMRC, Rolls-Royce and BAE.”

Visionary Render allows users to access and experience a real-time, interactive and immersive virtual reality environment created from huge 3D datasets. Users can work alone, in small groups, or collaborate with distant colleagues in a common virtual environment to perform detailed design reviews, rehearse in-depth training tasks, validate maintenance procedures or verify assembly and manufacturing processes.

“The changes made to VisRen V1.2 respond to the demands of industry”, commented Rab Scott, head of VR and Simulation at AMRC and Nuclear AMRC. “The ability to mark up a virtual model for later review and discussion is a positive and powerful step forward. V1.2 features new tracking capabilities and runs both faster and smoother with noticeably more artistic rendering, giving a rendering capability unusual in the sphere of engineering. I’ll go so far as to say that the combination of VisRen V1.2’s rendering capability with its high polygon count handling places it in a unique space within VR software.”

“With a more intuitive GUI and restructured interfaces”, Rab continued, “VisRen V1.2 acts as a combination of an MCAD package with VR software. The capacity that VisRen V1.2 has to connect the virtual scene to the Internet of Things and Big Data sources has the potential to open up a whole new world for us.”

Headline Advances

Ability to add text/image notes to Assemblies in the 3D scene
New “Review” Tools tab for editing and interacting with annotations
Annotations stick to their target when it is moved or animated
Annotations can be added immersively or via Lua scripts

Physically Based Shading (PBS), bringing materials closer to their real-life representation
Atmospheric Scattering
Screen-space reflections

Finger Tracking
Full finger tracking support for both hands
Pinch-based object manipulation
Two-handed avatar

New Immersive Drivers
LeapMotion, CyberGlove, ART Finger Tracking, 5DT glove, CyberTouch, VRPN and Oculus SDK 0.7

Gesture Recognition
Ability to record tracked gestures and poses with any tracked device
Gestures record wrist movement
Preset Undo/Redo gestures

Post Processing Effects
New library of pre-configured pixel shaders (Camera Motion Blur, FXAA, Depth of Field and Edge Detect) that can be applied using a View Filter object in the Scene
An editor that allows users to program their own shaders in GLSL
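As a rough illustration of the kind of pixel shader such an editor accepts, here is a minimal GLSL fragment shader implementing a Sobel-style edge detect. This is a generic sketch, not Virtalis’s actual shader code; the uniform, input and output names are assumptions for the example, and a real VisRen View Filter shader would use whatever interface the editor exposes.

```glsl
#version 330 core
// Hypothetical sketch of a Sobel edge-detect post-processing shader.
// Names below (sceneTex, texelSize, uv) are illustrative assumptions.
uniform sampler2D sceneTex;  // the rendered scene, supplied as a texture
uniform vec2 texelSize;      // 1.0 / screen resolution, in texture coords
in vec2 uv;                  // interpolated screen-space texture coordinate
out vec4 fragColor;

// Sample the scene at an offset (in texels) and return its luminance.
float luma(vec2 offset) {
    vec3 c = texture(sceneTex, uv + offset * texelSize).rgb;
    return dot(c, vec3(0.299, 0.587, 0.114)); // perceptual luminance weights
}

void main() {
    // 3x3 Sobel kernels approximate the horizontal and vertical gradients.
    float gx = luma(vec2(-1.0, -1.0)) + 2.0 * luma(vec2(-1.0, 0.0)) + luma(vec2(-1.0, 1.0))
             - luma(vec2( 1.0, -1.0)) - 2.0 * luma(vec2( 1.0, 0.0)) - luma(vec2( 1.0, 1.0));
    float gy = luma(vec2(-1.0, -1.0)) + 2.0 * luma(vec2(0.0, -1.0)) + luma(vec2(1.0, -1.0))
             - luma(vec2(-1.0,  1.0)) - 2.0 * luma(vec2(0.0,  1.0)) - luma(vec2(1.0,  1.0));
    // Gradient magnitude: bright where the image changes sharply (edges).
    float edge = clamp(length(vec2(gx, gy)), 0.0, 1.0);
    fragColor = vec4(vec3(edge), 1.0);
}
```

Edge detect of this kind runs as a full-screen pass over the already-rendered frame, which is why it can be toggled per view without touching the underlying model data.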