


How To Use HTC Vive Body Tracking For Maya Animation


The goal of calibration is to have the virtual and physical cameras match up in the virtual world.

AdjustingAlignmentStep.png
At runtime, the game needs to know a few things, like:

  • The type of camera and lens that you're using.

  • Where your camera is in relation to your virtual scene.

And to composite your scene, the game needs to know things like:

  • The color of your chroma backdrop (usually green).

  • How aggressively to cut out the chroma color.

  • Areas you want fully cut out from the camera frame.

The game needs to know these things so that the virtual camera can emulate the physical one. This is where the calibration tool comes into play. The calibration tool lets you configure all of these settings at one time, and produces a calibration settings file with that information. The settings file can then be reused across different games.

Pre-Calibration Setup

Before you can start a Mixed Reality capture, you will need the right equipment. Here is a quick overview of the equipment you will need, and some tips on how to set it up. This includes the basic green screen and camera setup for the capture process, as well as the equipment and software needed for calibration.

Green Screen and Camera

MR_GreenScreenSetup.png

  1. Video Camera
    UE4 only supports a very specific set of video capture devices; see Supported Video Devices for the current list of supported devices. Using a device from the list, connect it to your PC for streaming.

  2. Chroma backdrop
    For chroma keying, we generally use a green screen. When setting up the green screen, you will want to make sure it's taut, minimizing any wrinkles, especially behind your subject. If you are setting up lighting, you will want to make sure not to cast shadows directly behind the subject. You want a smooth, flat color. The more shades of green there are across the backdrop, the harder it is to chroma key. It helps if the subject is as far away from the backdrop as possible. If you plan on filming the subject's feet, consider a green screen for the floor as well.

  3. Camera Mount
    For initial setup (calibration), your camera needs to be static. If you are using a webcam, this may be as simple as attaching it to your desk/monitor. Another option is mounting the camera to a tripod.

  4. Multi-Mount + Tracker (Optional)
    If you plan on moving the camera around while filming, you will want to attach a tracking device to the camera, such as an HTC Vive Tracker. In addition, connect the camera and tracker together with a multi-mount, so that they stay firmly locked in place.

Calibration Specific Equipment

There are a few additional items required specifically for the calibration process (as opposed to the capture process).

MR_RequiredSetupItems.jpg

  • HTC Vive or Oculus Rift
    The calibration tool requires one of these two VR systems to work. The tool uses the controllers' tracking to evaluate the location of the camera. If this is the first time you've set up an HTC Vive or Oculus system, you will need to complete their setup process first for calibration to work properly.

  • Printed Checkerboard Pattern
    Print and attach a checkerboard pattern to a flat, rigid surface (like a piece of cardboard). You can find an example of this checkerboard pattern included with the calibration tool download.

    When taping down the checkerboard printout, make sure not to cover the checkerboard pattern itself with tape (even clear tape can cause specular reflections that make it hard for the camera to read).

Steps

Once the equipment is set, you can download the calibration tool from here. Once you've downloaded the file, unzip it and launch MRCalibration.exe.

The calibration process is divided into distinct steps. The individual steps are discussed further below.

  1. Device/Camera/Tracker Selection

  2. Lens Calibration

  3. Alignment Calibration

  4. Alignment Adjustment

  5. Compositing Calibration

  6. Garbage Matting

During calibration, your progress is saved at the end of each step, so you can exit and return to the tool if needed. Once the calibration process is complete, the tool generates a settings file that you will then use to start an MRC session in other projects.

Your calibration progress is recorded in the settings file. If you want to restart the calibration process, you will need to delete the settings file. The settings file is named MrcCalibration.sav, and can be found under the tool's /Saved/SaveGames/ directory.

Shared Controls

While each step has its own distinct controls, there are some controls that are used consistently throughout the tool:

  • Enter : Next/Confirm/Accept/Submit

  • End : Skip (only applicable when step requirements have been met - some steps can't be skipped)

  • P/Thumbstick : Preview

  • M : Mirror the video feed

  • R : Reset altered settings

  • Esc : Exit

The tool is rough around the edges, and usability can be improved in places. The tool is functional, but we plan on improving its usability in later releases. We welcome your feedback!

Calibration Procedure

1. Device/Camera/Tracker Selection

If you have multiple video capture devices connected to your PC, you have to specify which one you want to use. Use the Up/Down keys to cycle through them.

MR_DeviceSelection.png

Additionally, each camera may have multiple formats and resolutions available to it. We try to auto-select the best format and resolution, but you can use Left/Right to change the selection.

The resolution of the camera feed does not affect the output resolution of the Mixed Reality capture. The output resolution of the capture is controlled by the game project you will be capturing from. The resolution of the camera feed only controls how clear the feed appears within the scene.

Device Selection Controls

  • Up/Down : Select video capture source

  • Left/Right : Select video capture format

  • Tab : Select camera tracker

Tracker Selection

If you're planning on moving your camera while filming, you will need to attach it to a tracking device.

Before continuing to the next step, you can use the Tab key to cycle through the available tracking attachments. For the HTC Vive, the first tracker will be named "Special_1" in the attachments list.

After you have made the appropriate selections, press Enter and continue to the next step.

2. Lens Calibration

Different camera lenses curve and distort the picture in different ways (think of a fisheye lens). You don't want that distortion carried over into the captured scene, so use this lens calibration step to figure out how to undistort the image.

MR_LensCalibration.png
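
To build intuition for the kind of distortion being corrected, here is a rough sketch of a one-coefficient radial distortion model (a simplified Brown-Conrady model). This is illustrative only, not the calibration tool's actual math, and the coefficient value is made up:

```python
def radial_distort(x, y, k1):
    """Apply one-coefficient radial distortion to a normalized image
    point (x, y), where (0, 0) is the image center. A negative k1 gives
    barrel distortion: straight lines bow outward near the frame edges."""
    r2 = x * x + y * y          # squared distance from the image center
    scale = 1.0 + k1 * r2       # distortion grows with distance from center
    return x * scale, y * scale

# Points near the center barely move, while points near the edge shift
# noticeably - which is why straight edges look most warped at the borders.
near_center = radial_distort(0.1, 0.0, -0.2)   # ~(0.0998, 0.0)
near_edge = radial_distort(1.0, 0.0, -0.2)     # (0.8, 0.0)
```

Undistortion inverts this mapping once the coefficients have been estimated from the checkerboard samples.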

During this step, you will need a printout of the checkerboard image that you downloaded along with the calibration tool. In this step, the calibration tool continuously takes screenshots, using the checkerboard pattern as a baseline.

Hold the printout at different locations in front of the camera. Use different angles and depths, especially around the edges of your frame, so that the calibration tool gets a wide collection of samples. The text at the top of the screen will change to let you know when the calibration tool has collected enough samples.

If your camera is set to auto-focus, you might find it helpful to disable auto-focus during this step.

Reprojection Error

After the calibration tool has collected enough samples, the tool will show you a preview of the undistorted feed. If the feed looks acceptable, press Enter to continue. If needed, you can add more samples, or press R to reset and start over.

A good way to tell if the undistortion process is working is to find elements in frame that should be straight (wall corners, doorways, etc.). Lens distortion will often warp straight edges (especially towards the edge of the frame).

MR_ReprojectionError.png

The reported "reprojection error" lets you know how accurate the process thinks it is. The lower the reprojection error value, the better. You're in great shape if the value is under 1!
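
Conceptually, a reprojection error is the root-mean-square pixel distance between where the checkerboard corners were detected in the image and where the fitted camera model predicts them to be. A minimal sketch (not the tool's actual implementation):

```python
import math

def reprojection_error(detected, reprojected):
    """RMS pixel distance between detected checkerboard corners and the
    corner positions predicted by the fitted camera model."""
    n = len(detected)
    total = sum((dx - rx) ** 2 + (dy - ry) ** 2
                for (dx, dy), (rx, ry) in zip(detected, reprojected))
    return math.sqrt(total / n)

# Each predicted corner is half a pixel off from the detection,
# giving an RMS error of 0.5 - comfortably under 1.
err = reprojection_error([(100, 100), (140, 100)],
                         [(100, 100.5), (140, 99.5)])
```

An error under one pixel means the model's predictions land, on average, within a pixel of the detected corners.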

Angle of View

Probably the most important use of this step is to estimate the camera's angle of view (or FOV, 'field of view').

The FOV determines how much of the world the camera can see, and it is important that the FOV of the virtual camera matches this as closely as possible.

If you already know the FOV of your camera (in degrees), you can use the mrc.FovOverride console variable to set it.

Most USB camera manufacturers list the diagonal FOV (DFOV) for their devices. However, we are interested in the horizontal FOV (in degrees). You can compute the horizontal FOV from the diagonal FOV with the formula below, using the height (h) and width (w) of the resolution you selected in the previous step.

MR_FOVFormula.png
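
The conversion in the image above can be sketched in code. This uses the standard relationship for a rectilinear lens, where tan(FOV/2) is proportional to the corresponding sensor dimension; the example DFOV value is illustrative, so check your own camera's datasheet:

```python
import math

def horizontal_fov(dfov_deg, w, h):
    """Convert a diagonal FOV (in degrees) to a horizontal FOV for a
    rectilinear lens, given the capture resolution w x h."""
    d = math.hypot(w, h)  # diagonal length, in the same units as w and h
    half_hfov = math.atan((w / d) * math.tan(math.radians(dfov_deg / 2)))
    return math.degrees(2 * half_hfov)

# e.g. a webcam that lists a 78-degree diagonal FOV, captured at 1920x1080:
hfov = horizontal_fov(78, 1920, 1080)  # ~70.4 degrees
```

The resulting value is what you would pass to mrc.FovOverride if you want to set the FOV explicitly.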

There are also ways to calculate the horizontal FOV for DSLR lenses.

If you do not know your camera's specific FOV offhand, don't worry; this step will estimate it for you.

If you have a camera with an adjustable lens, you will want to be very careful not to adjust the zoom after you've completed calibration. Adjusting the zoom will change the physical camera's field of view (FOV), but not the virtual one. The virtual camera will use the FOV that was determined during calibration. If you adjust the camera's zoom after calibration, you will need to repeat the calibration process.

3. Alignment Calibration

AdjustingAlignment2.png

This step loops you through the Align and Adjust phases until complete:

  1. Align the controller with an on-screen model.

  2. Adjust the sampled alignment.

  3. Repeat.

It is important to cover your headset's sensor, so that we can get accurate tracking information from your controllers.

For the HTC Vive, make sure that the controllers are on before you launch the tool. Otherwise, the calibration tool may not display models to align with (the tool needs to recognize the type of controllers that you are using).

Phase 1 - Align w/ Model

On screen, there will be a pink/fuchsia model to align your controller with. Hold the controller up to the screen and pull the trigger when the controller and the model are lined up.

Once you've pulled the trigger, the screen will freeze-frame and switch into the next phase.

Phase 2 - Adjust Sample

Don't expect to be wholly accurate when you first pull the trigger (shaky hands, etc.). In this phase, you adjust the last sample so it is more exactly aligned in the frame.

Phase 3 - Repeat

In total, there are 11 points for you to align with. You don't have to go through all of them if at any point you think the alignment is satisfactory.


Adjustment Controls

  • Up/Down : Move the model vertically

  • Left/Right : Move the model horizontally

  • Num +/- : Zoom the model in/out

  • Alt + Up/Down : Rotate the model vertically (pitch)

  • Alt + Left/Right : Rotate the model horizontally (yaw)

  • Alt + Num +/- : Rotate the model left & right (roll)

  • H : Hide the model & its outline

  • Alt + H : Hide the model only (leaving its outline)

  • P : Preview alignment

  • Enter : Accept alignment

Preview the Result

As you take samples, the screen will show a series of color-coded icon pairs (a cross and a target). How close each pair is gives you an idea of how close the alignment is. You want all pairs to be as close as possible, though don't worry if one is wildly off (it may have been an inaccurate sample).

You can preview how the alignment is coming together by holding P (or the thumbstick). If you're happy with the results and wish to skip the rest of the alignment points, press the End key.

MR_PreviewResults.png

Video Lag: There will likely be some lag between the video feed and controller tracking, which can make it hard to tell if the two are properly aligned. To combat this, you can use the console variable mrc.TrackingLatencyOverride to introduce a delay so that they're more in sync.

4. Alignment Adjustment

Up to this point, you've only adjusted the alignment one sample at a time. This single sample gives the calibration tool a good approximation of the camera's position, but the calibration is likely better in some corners of the frame than others.


In this step, there are 5 white boxes on-screen. Move one of your controllers to each of the boxes and pull the trigger in each.

You want to perform this process at the same depth (distance away from the camera) that you plan on filming from.

Each box will disappear once it's taken a sample for that region. The model tracking with your hands needs to be fully inside the box for it to take a sample.

Making Adjustments

Once you've hit all 5 boxes, the screen will change to a split-screen-like view.

MR_MakingAdjustments.png

As in the previous alignment process, move, rotate, and scale the models to match the image.

Adjustment Controls

  • Up/Down : Move the models vertically

  • Left/Right : Move the models horizontally

  • Num +/- : Zoom the models in/out

  • Alt + Up/Down : Rotate the models vertically (pitch)

  • Alt + Left/Right : Rotate the models horizontally (yaw)

  • Alt + Num +/- : Rotate the models left & right (roll)

  • ```` : Adjust FOV up & down (careful)

  • R : Reset adjustments

  • P : Preview alignment

  • Enter : Accept adjusted alignment

You can see how adjusting/optimizing for one sample can throw off the others.

Rotating in this step is a little difficult to wrap your head around, as you are moving all the models at once, in unison. Think of them together as a single model that you are rotating about their collective center. Try to keep the middle sample screen-centered, and align it first, using it as an anchor point.

5. Compositing Calibration

In this step, you tweak specific compositing settings. This is the first step where you get to see the composited scene as you would in a Mixed Reality capture. You could exit now, and everything would work properly.

MR_CompositingCalibratrion.png

Use the arrow keys to select and set specific values.

Chroma Key Settings

Most of the settings in this step pertain to the chroma keying process, which is discussed in detail in this UE blog post.

  • ChromaColor : The color of your chroma backdrop (generally green).

  • Luminance Power : Used to help separate the backdrop color from shadows that may tint the visible color.

  • Chroma Clip Threshold : Colors matching the chroma color up to this tolerance level will be completely cut out. The higher the value, the more that is cut out. A value of zero means that the chroma color has to be an exact match for the pixel to be completely transparent.

  • Chroma Opacity Strength : Scales the opacity of the remaining pixels. The higher the number, the less translucency there will be. The closer a color is to the ChromaColor, the higher you have to turn this up to keep it from going transparent.

  • Despill Strength : Scales the despill intensity. A value of zero means no despill correction will be performed.

  • Despill Cutoff Cap : Colors that differ from the chroma color beyond this tolerance level will be unaffected by despill correction. The higher the value, the more that will be color corrected.

  • Despill Sharpness : Used to smooth out the despill gradient - defines the despill falloff curve. Scales the despill strength exponentially. The lower the value, the more subtle/smooth it will be.

  • Faux Bounce Intensity : Used to scale the strength of the color that replaces the color removed as part of the despill process. A value of zero means that the faux bounce color won't be applied.

  • Faux Bounce Color : A color to replace the chroma color bleed that was removed as part of the despill process.
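
To build intuition for how the clip threshold and opacity strength might interact, here is a loose sketch of a chroma-key alpha computation. This is illustrative only: it is not the engine's actual shader, and the parameter semantics are simplified:

```python
def chroma_alpha(color_dist, clip_threshold, opacity_strength):
    """Compute a pixel's opacity from its color distance to the chroma
    color (0 = exact match with the backdrop, 1 = completely different).

    Distances at or below the clip threshold are cut out entirely;
    the remaining range is scaled up by the opacity strength."""
    if color_dist <= clip_threshold:
        return 0.0  # within tolerance of the chroma color: fully transparent
    alpha = (color_dist - clip_threshold) / max(1e-6, 1.0 - clip_threshold)
    return min(1.0, alpha * opacity_strength)

# An exact chroma match is fully transparent...
assert chroma_alpha(0.0, 0.25, 2.0) == 0.0
# ...while a sufficiently different color stays fully opaque.
assert chroma_alpha(0.9, 0.25, 2.0) == 1.0
```

Raising the clip threshold cuts out more near-chroma colors; raising the opacity strength pushes borderline colors (e.g. greenish clothing) back toward opaque.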

Other Compositing Settings

  • Tracking Latency : Depending on your video capture device, the video feed may lag behind your controllers. With this setting, you can introduce a delay to the controllers, in order to better sync the two.

  • Depth Offset : By default, the video aligns to the depth of the headset, which is used to determine what renders in front of and behind the subject. This setting lets you offset that forward or backward in the scene.

Pressing Enter cycles from one setting to the next. Once you're satisfied with the configuration, hit End to save. You can always start the tool again to tweak these values later.

6. Garbage Matting

Garbage matting is the process of blocking out areas that should always be cut out of the video feed.

If your green screen doesn't cover your entire frame, you will need to complete this additional process to cut those excess regions out of the capture.

This is the final step of the calibration process. If you don't have this problem or a need for it, you can skip this step and exit.

MR_GarbageMatting.png

This is the only step in the tool that requires you to work in VR. Put your headset on, and the controls should be listed in front of you.

Garbage Matting Controls

  • End/Esc : Save & exit the tool

Left Controller

  • Grip : Alt (hold to change controls)

  • Trigger (Hold & Drag) : Move gizmo

  • Thumbstick : Undo

  w/ LGrip held:

  • Trigger : Reset gizmo position

  • Thumbstick : Redo

Right Controller

  • Grip : Create matte model

  • Trigger : Select/Deselect matte model

  • Thumbstick : Next gizmo mode (rotate, translate, scale, etc.)

  w/ LGrip held:

  • Trigger : Deselect all

  • Thumbstick : Prev. gizmo mode


Similar to how our VR world editor works, you place and position 3D planes. Match the planes up with the real-world sections you want cut out of your capture.

There's a preview window in VR, so you can see the results as you modify the mattes. You may want to move the camera around at this point (if it's attached to a tracker) to see how the mattes look from different angles.

Getting the mattes aligned perfectly can be a difficult process. If you are having trouble, try standing where you want the matte to be, and bring it to yourself. Use the picture-in-picture to help guide your progress.

Once you've matted out the areas you want, you can exit the app (End or Esc) - all of your settings will be saved. If necessary, you can start the tool back up again to tweak the settings.

End Result

Once you've finished the calibration process, you can exit the tool. In the tool's Saved/SaveGames/ directory, you should find a MrcCalibration.sav file.


Once you have a MrcCalibration.sav file, copy it into your game's /Saved/SaveGames/ folder. If your game doesn't have a SaveGames directory yet, you will need to create it manually.

If you've gone through this process once, you don't need to do it again (as long as your setup doesn't change). You can reuse the same calibration settings file across different UE titles (titles that have the MRC framework plugin enabled).

Source: https://docs.unrealengine.com/4.27/en-US/WorkingWithMedia/IntegratingMedia/MixedRealityCapture/MRHowToCaptureCalibrationTool
