
User:PeterK/GSoC2020/Proposal

Name

Peter Klimenko

Contact

Email: [email protected]

Github: github.com/PeterKlimk

IRC Nickname: pkzmbk

Synopsis

The aim of this project is to implement support for VR controllers as input devices in Blender, so that artists can use VR as part of their workflow.

I propose to add support for the OpenXR action system to Blender, as well as ray-cast based selection of objects (the idea page mentions GL_SELECT, but this is deprecated and does not work well with VR). I also aim to add 3D input support for some tools, such as the Grease Pencil.

Benefits

VR is becoming an increasingly popular medium, and it is important that artists who create content for VR can make VR part of their workflow, so they can work with, see, and experience the content as the end user would. Making use of the controllers is vital for this. If Blender aims to be a comprehensive and fully-featured open-source replacement for proprietary workflows, and even to lead the field, VR controller support would be a key feature. This is recognized in Blender's 2nd VR milestone.

Deliverables

By the end of the project, Blender would support:

● VR input through OpenXR

● Python query support for controller state

● Ray-casting for object selection using the controller.

● (Potentially) 3D controller-based controls for select 3D tools. I aim to have at least one completed to demonstrate the usefulness of VR in a 3D workflow. Any additions would be well documented.

Project Details

The first step would be to add support for the OpenXR action system for haptics and control, building upon the work already done to get OpenXR working in Blender. This involves creating XrActionSets at the start of the OpenXR XrSession, along with input XrActions representing all the possible user interactions and output XrActions for haptic feedback. Bindings are then created for these actions, using XrPath objects with paths such as

"/user/hand/{left|right}/output/haptic"

It is worthwhile to create a comprehensive set of unique actions, even if several share the same binding by default, as this allows users to bind whichever VR controls they like to whichever action.
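As a rough sketch of this setup using the OpenXR C API (the action set name, the haptic action, and the chosen interaction profile are placeholders, and `instance`/`session` are assumed to be the handles Blender already creates; error checking omitted):

#include <openxr/openxr.h>
#include <string.h>

/* Create an action set to hold all of Blender's VR actions. */
XrActionSetCreateInfo set_info = {XR_TYPE_ACTION_SET_CREATE_INFO};
strcpy(set_info.actionSetName, "blender_vr");
strcpy(set_info.localizedActionSetName, "Blender VR");
XrActionSet action_set;
xrCreateActionSet(instance, &set_info, &action_set);

/* Create an output action for haptic feedback. Input actions are created the
 * same way, with e.g. XR_ACTION_TYPE_BOOLEAN_INPUT or XR_ACTION_TYPE_POSE_INPUT. */
XrActionCreateInfo action_info = {XR_TYPE_ACTION_CREATE_INFO};
action_info.actionType = XR_ACTION_TYPE_VIBRATION_OUTPUT;
strcpy(action_info.actionName, "haptic_feedback");
strcpy(action_info.localizedActionName, "Haptic Feedback");
XrAction haptic_action;
xrCreateAction(action_set, &action_info, &haptic_action);

/* Suggest a binding for the action on a given interaction profile. */
XrPath haptic_path, profile_path;
xrStringToPath(instance, "/user/hand/right/output/haptic", &haptic_path);
xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller", &profile_path);

XrActionSuggestedBinding binding = {haptic_action, haptic_path};
XrInteractionProfileSuggestedBinding suggested = {XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDINGS};
suggested.interactionProfile = profile_path;
suggested.countSuggestedBindings = 1;
suggested.suggestedBindings = &binding;
xrSuggestInteractionProfileBindings(instance, &suggested);

/* The action set must be attached to the session before the first xrSyncActions. */
XrSessionActionSetsAttachInfo attach_info = {XR_TYPE_SESSION_ACTION_SETS_ATTACH_INFO};
attach_info.countActionSets = 1;
attach_info.actionSets = &action_set;
xrAttachSessionActionSets(session, &attach_info);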

In Blender, the appropriate “pose” for hand controls would probably be the “aim” pose, accessed using

"/user/hand/{left|right}/input/aim/pose"

This gives us a ray representing where the user is pointing with their motion controller (or tracked hand). It needs to be queried every drawn frame while in VR for a smooth experience.
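A minimal sketch of that per-frame query, assuming an aim_pose_action of type XR_ACTION_TYPE_POSE_INPUT bound to the path above, an attached action_set, and an existing reference_space and frame_state from the frame loop:

/* Done once: create an XrSpace that tracks the aim pose action. */
XrActionSpaceCreateInfo space_info = {XR_TYPE_ACTION_SPACE_CREATE_INFO};
space_info.action = aim_pose_action;
space_info.poseInActionSpace.orientation.w = 1.0f; /* Identity offset. */
XrSpace aim_space;
xrCreateActionSpace(session, &space_info, &aim_space);

/* Done every drawn frame: sync action data, then locate the aim space. */
XrActiveActionSet active_set = {action_set, XR_NULL_PATH};
XrActionsSyncInfo sync_info = {XR_TYPE_ACTIONS_SYNC_INFO};
sync_info.countActiveActionSets = 1;
sync_info.activeActionSets = &active_set;
xrSyncActions(session, &sync_info);

XrSpaceLocation location = {XR_TYPE_SPACE_LOCATION};
xrLocateSpace(aim_space, reference_space, frame_state.predictedDisplayTime, &location);
if (location.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT) {
  /* location.pose gives the origin and orientation of the "aim" ray. */
}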

Python querying of VR state in Blender scripts would be implemented by extending the Blender Python API. I would add operators to access information such as controller state, position, and aim ray. This could be implemented globally and/or within the context of a VIEW_3D area, to get information such as the object a user is "aiming" at.
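The exact shape of the API would be worked out with the mentors. As a purely hypothetical sketch using the CPython C API (the function names below and the vr_controller_ray_get() accessor are placeholders, not an existing Blender API):

#include <Python.h>
#include <stdbool.h>

/* Hypothetical accessor into the VR session state (not an existing function). */
extern bool vr_controller_ray_get(int hand, float r_origin[3], float r_direction[3]);

static PyObject *py_controller_ray(PyObject *self, PyObject *args)
{
  int hand;
  float o[3], d[3];
  if (!PyArg_ParseTuple(args, "i", &hand)) {
    return NULL;
  }
  if (!vr_controller_ray_get(hand, o, d)) {
    Py_RETURN_NONE; /* No running VR session, or controller not tracked. */
  }
  /* Return ((origin), (direction)) as nested tuples. */
  return Py_BuildValue("((fff)(fff))", o[0], o[1], o[2], d[0], d[1], d[2]);
}

static PyMethodDef vr_methods[] = {
    {"controller_ray", py_controller_ray, METH_VARARGS,
     "Return the (origin, direction) of the aim ray for the given hand."},
    {NULL, NULL, 0, NULL},
};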

Implementing object selection would make use of the ray given by the "aim" pose and ray-casting, as GL_SELECT is deprecated and finicky in VR. There is already an existing implementation of ray-casting for scenes in Blender, so this should be relatively straightforward. It is probably necessary to keep track of the currently "pointed-at" object each frame and draw an outline around it, as this is relatively cheap and makes the experience more usable. When the user pulls the trigger (activating the "select" action), the selection would change to the "hovered" object.
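For instance, the per-frame "hover and select" logic could look roughly like the following sketch, where scene_raycast_first_hit() and set_active_object() stand in for Blender's existing scene ray-casting and selection code (hypothetical wrappers, not actual function names):

/* Hypothetical wrapper around Blender's existing scene ray-casting. */
extern struct Object *scene_raycast_first_hit(const float origin[3], const float direction[3]);

/* Query the boolean "select" action (bound to the controller trigger). */
XrActionStateGetInfo get_info = {XR_TYPE_ACTION_STATE_GET_INFO};
get_info.action = select_action;
XrActionStateBoolean select_state = {XR_TYPE_ACTION_STATE_BOOLEAN};
xrGetActionStateBoolean(session, &get_info, &select_state);

/* ray_origin / ray_direction are derived from the located aim pose above. */
struct Object *hovered = scene_raycast_first_hit(ray_origin, ray_direction);
/* Draw an outline around `hovered` here, so the user sees what they are pointing at. */

if (select_state.isActive && select_state.changedSinceLastSync && select_state.currentState) {
  /* Trigger was just pressed: make the hovered object the active selection. */
  set_active_object(hovered); /* Hypothetical; would use Blender's selection API. */
}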

Lastly, if time allows, support for using tools such as the Grease Pencil in VR as a 3D input method would be attempted. This would essentially provide the same functionality as the Grease Pencil used with a 2D mouse, except that instead of querying the mouse cursor and transforming it into a 3D position based on the camera, we get a 3D position directly from the "aim" pose action. This functionality is comparable to Google's "Tilt Brush".
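A small sketch of the sampling step, assuming the aim pose location from the per-frame loop above and a hypothetical StrokeSample buffer that would later be fed into Blender's existing grease pencil stroke creation:

typedef struct StrokeSample {
  float co[3];    /* 3D point in world space. */
  float pressure; /* Stroke thickness/opacity input. */
} StrokeSample;

static void vr_stroke_sample_add(const XrSpaceLocation *aim, float trigger_value,
                                 StrokeSample *samples, int *sample_count)
{
  if (!(aim->locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT)) {
    return;
  }
  StrokeSample *sample = &samples[(*sample_count)++];
  /* Use the controller position directly as the 3D stroke point, instead of
   * unprojecting a 2D cursor position through the view/camera. */
  sample->co[0] = aim->pose.position.x;
  sample->co[1] = aim->pose.position.y;
  sample->co[2] = aim->pose.position.z;
  /* The analog trigger value can stand in for pen pressure. */
  sample->pressure = trigger_value;
}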

Schedule

Bonding period: Necessary research, including into OpenXR.

June 1 - June 29 (4 weeks): Implementing support for the OpenXR action system.

June 29 - July 13 (2 weeks): Adding Python queries.

July 13 - July 27 (2 weeks): Implementing raycast-based object selection.

July 27 - Aug 3 (1 week): Further profiling, testing, and feedback, as well as notes.

Aug 3 - Aug 31 (4 weeks): If time is left, implementing 3D controls for Blender tools; otherwise, finishing the project and adding any notes and documentation not yet written during the previous weeks.

I had to move suddenly because of the coronavirus situation in my country, which delayed my application. On the other hand, as a result of the situation all of my university coursework has moved online and become pass/fail, and I am no longer working. Practically, this means I am locked down at home, doing little except coding.

I am therefore free to give 100% to this project, working an average of 5-6 hours or more every day if need be.

Biography

I am a 21-year-old student studying Math and Computer Science at UNSW in Sydney. I'm an avid user of open-source software and libraries and have submitted bug reports, but I have yet to submit PRs for serious projects. I feel as though GSoC is my opportunity to give back to the open-source community. Relevant skills:

● I own a VR headset (Rift S) and am experienced with VR and VR controls.

● Have used Blender in a hobbyist capacity for years and am familiar with the basic controls.

● Am also doing a math degree, which helps with the math required for computer graphics.

● Proficient in C, C++ and Python.

● Have actually compiled Blender from source and investigated the codebase (also tested basic VR with blender_oculus).

I have a number of private GitHub projects, some of which I have just made public. These include:

● A multithreaded Rust program for finding particular kinds of Collatz sequences

● A tokenizer and AST-based interpreter for a (terrible) toy programming language

● Basic Minecraft-style block rendering (with water support) written in Rust using Vulkano-rs. Turns chunks of blocks into meshes and occludes the hidden geometry.

● A Delaunay triangulator implementing SweepHull with various optimizations, making use of AVX intrinsics.

I have used Blender for a while and have been continuously impressed with its improvements, from Cycles to the newest UI update. Hobbyist rendering of such things as bouncing slimes and collections of falling balls was a big source of entertainment for a much younger me and cemented my love of computer graphics, influencing my choices at university. I would love to give back to the project, and I am particularly a big fan of the open-source ethos of collaboration.