Interactive sub-d modeling
flyingshapes held a webinar to show how its VR-based CAD software works. Here are my notes from it.
We are used to downloading demo software and minutes later working with it. 'Tis not the case with software like flyingshapes, because first you need VR goggles, naturally. Here are the ones supported:
- Oculus Rift and Rift S
- HTC Vive and Vive Pro
- Windows Mixed Reality
- Pimax
- Varjo HMD
Next you need a Windows 10 computer with at least an Intel Core i5 CPU and one of the supported graphics boards:
- At least NVIDIA GeForce GTX 960 through to RTX 2060
- Or at least AMD Radeon R9 380 through to RX 5600 XT
For weaker systems, some special effects, like shadows, can be turned off.
With the hardware in place, go to flyingshapes.com/download to register and then download the software. I don't have the hardware, so I didn't do that.
The UX
The user experience is nothing like what you know from mouse-and-keyboard-based CAD. It is hyper-interactive: just moving your head moves the viewpoint, kind of like 3D orbit mode.
A controller in each hand lets you grab objects and execute commands by pressing or holding buttons. Here is an overview of the interface:
The red-green-blue lines represent the x,y,z axes. You can see planar grids, but the software also offers a 3D grid with evenly-spaced crosses throughout 3D space.
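I don't know how flyingshapes builds that grid internally, but the idea of evenly spaced crosses filling 3D space is easy to picture. Here is a minimal Python sketch of the layout; the spacing and extent values are my own placeholders, not the program's settings.

```python
from itertools import product

def grid_cross_positions(extent=1.0, spacing=0.25):
    """Return (x, y, z) positions of evenly spaced crosses filling a cube
    that runs from -extent to +extent along each axis."""
    steps = int(2 * extent / spacing) + 1
    ticks = [-extent + i * spacing for i in range(steps)]
    return list(product(ticks, ticks, ticks))

# Example: a 9 x 9 x 9 lattice of cross markers
positions = grid_cross_positions()
print(len(positions))  # 729
```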
The two controllers you hold in your hands are shown in front of you. To the side is a panel reporting the current x,y,z coordinates. You can zoom in and out, or scale the model interactively.
Between the controllers is the gizmo, which lets you move and scale in each of the three directions.
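Under the hood, a gizmo like this boils down to applying per-axis translations and scales to whatever you have grabbed. A rough Python sketch of that math (my own simplification, not flyingshapes' code):

```python
def translate(points, dx=0.0, dy=0.0, dz=0.0):
    """Move every (x, y, z) point by the given per-axis offsets."""
    return [(x + dx, y + dy, z + dz) for x, y, z in points]

def scale(points, sx=1.0, sy=1.0, sz=1.0, about=(0.0, 0.0, 0.0)):
    """Scale every point about a pivot, independently per axis."""
    px, py, pz = about
    return [(px + (x - px) * sx, py + (y - py) * sy, pz + (z - pz) * sz)
            for x, y, z in points]

# Dragging one gizmo handle might translate; pulling another might scale.
box = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
box = translate(box, dy=0.5)
box = scale(box, sx=2.0, about=(0, 0, 0))
```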
The figure below shows the popup menu through which you access tools and apply materials. Another popup screen shows videos of how to use the tools. Next to each controller is a flyout indicating the active tool.
The two primary drawing tools appear to be freehand sketch and spline. Splines have control points that are manipulated interactively. Surfaces are stretched between boundaries. You can apply G1 continuity (matching tangent directions) between splines and surfaces.
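For readers who don't live in surfacing tools: G1 means the tangent directions match where two pieces meet, even if the tangent lengths differ. Here is a minimal Python check of the condition for two cubic Bezier segments; the control points are made-up example values, not anything from the webinar.

```python
def sub(a, b):
    return tuple(ai - bi for ai, bi in zip(a, b))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def is_g1(bezier_a, bezier_b, tol=1e-9):
    """True if cubic Bezier A flows into cubic Bezier B with G1 continuity:
    shared endpoint, and end/start tangents parallel, pointing the same way."""
    # End tangent of A is along P3 - P2; start tangent of B is along Q1 - Q0.
    if bezier_a[3] != bezier_b[0]:          # G0: the curves must actually join
        return False
    ta = sub(bezier_a[3], bezier_a[2])
    tb = sub(bezier_b[1], bezier_b[0])
    parallel = all(abs(c) < tol for c in cross(ta, tb))
    same_direction = dot(ta, tb) > 0
    return parallel and same_direction

# Two segments meeting at (1, 0, 0) with collinear tangents -> G1
a = [(0, 0, 0), (0.3, 0.2, 0), (0.7, 0.2, 0), (1, 0, 0)]
b = [(1, 0, 0), (1.6, -0.4, 0), (2, 0, 0), (3, 0.5, 0)]
print(is_g1(a, b))  # True
```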
As you press buttons on the hand controllers, a text tip appears reporting the action, such as Trigger Released.
The idea of flyingshapes is to do CAD modeling, but in a sub-d manner (interactive editing of mesh-like entities). They plan to move towards more CAD-like editing, which I think means a 3D-solids kind of direct editing.
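To make "sub-d" concrete: you edit a coarse control cage and the software displays the smooth, subdivided result. The curve analogue below (Chaikin corner-cutting, in Python) is my own illustration of the principle, not flyingshapes' algorithm; moving one coarse point reshapes the whole smooth outline.

```python
def chaikin(points, iterations=3):
    """Corner-cutting subdivision of an open polyline: each pass replaces
    every edge with two points at 1/4 and 3/4, smoothing the shape."""
    pts = list(points)
    for _ in range(iterations):
        refined = [pts[0]]                       # keep the first endpoint
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            refined.append((0.75*x0 + 0.25*x1, 0.75*y0 + 0.25*y1))
            refined.append((0.25*x0 + 0.75*x1, 0.25*y0 + 0.75*y1))
        refined.append(pts[-1])                  # keep the last endpoint
        pts = refined
    return pts

# Dragging one control point changes the entire smoothed curve.
cage = [(0, 0), (1, 2), (2, 0), (3, 2)]
smooth = chaikin(cage)
print(len(smooth))  # 32 points after three passes on a 4-point cage
```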
Why VR?
I gotta say that what the folks at flyingshapes (from Germany) have accomplished appears impressive.
But is there a market? My argument has always been that humans are adapted to navigating in 2D, in spite of living in a 3D world. 3D CAD models are not just "one extra dimension"; they add five more sides to plan views.
There are 3D mice, which have not been overwhelmingly popular. In my estimation, there have been only three advances in 3D modeling over the decades:
Multiple windows -- so that we can see more than one side of the 3D model at a time, with the drawback that the more windows we open, the smaller the model becomes in each.
Realtime shading -- so that we no longer look at puzzling wireframe models, trying to figure out what is outside, inside, front, or back.
Dynamic UCS -- so that the software anticipates the plane on which we want to draw (see the sketch below).
The last one exposes the myth of 3D modeling: even in "3D", we still work in a 2D micro-environment.
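For those who haven't used dynamic UCS: the usual trick is to take the face under the cursor and build a temporary working plane from it, so whatever you draw next lands on that face. A rough Python sketch of the idea (my own simplification, not any particular CAD system's code):

```python
import math

def _cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def _normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def plane_from_face(origin, normal):
    """Build a temporary working plane (an origin plus two in-plane axes)
    from the origin point and normal of the face under the cursor."""
    n = _normalize(normal)
    # Choose a helper vector not parallel to the normal, then orthogonalize.
    helper = (0.0, 0.0, 1.0) if abs(n[2]) < 0.9 else (1.0, 0.0, 0.0)
    x_axis = _normalize(_cross(helper, n))
    y_axis = _cross(n, x_axis)
    return {"origin": origin, "x_axis": x_axis, "y_axis": y_axis, "normal": n}

# Hover over a face tilted toward +x: sketching then happens on that plane.
print(plane_from_face((0, 0, 0), (1.0, 0.0, 1.0)))
```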
Don't get me wrong: 3D modeling is important for efficiency; it's just that we are trying to shoehorn our 2D selves into a 3D environment that is virtual, not the one through which we experience the real world. The UX is the problem, not the 3D.
So, where does that leave flyingshapes? I see it as a test case to examine what is possible with CAD in an immersive 3D environment.