FAQ

For convenience, the questions are grouped into thematic categories:

  1. Before purchasing
  2. When purchasing
  3. General questions
  4. Connectors and commutation
  5. Light, background and color keying
  6. Data and formats
  7. Options and configurations
  8. 3D scene design
  9. Scenario scripts
  10. Hotset – operator interface
  11. Input and output video quality
  12. Sound configuration
  13. Typical problems
  14. Other questions
If you have not found the answer to your question, contact the technical support department or ask in the product forums. We also recommend studying the documentation on the "Focus" virtual studios.

1 Before purchasing

1.1   I need to buy a virtual studio. Where should I send a purchase request? What information should I provide?

    An order for a virtual studio can be sent to the sales department of our company (sales@softlab.tv or info@d-graphica.com) or to any supplier of our products. Before placing an order, you need to decide which configuration of the virtual studio is required for the job. Brief descriptions of the possible configurations can be found in the price list and on the relevant page of our website. 
      When ordering, specify the required virtual studio and the types of input and output video you want to work with: analog (S-Video, component) or digital (SDI, HDMI), in SD or HD. As can be seen from the price list, the type of video determines the name of the studio and its cost. 
      Next, you must specify how many video sources of each type you want to connect to the virtual studio. 
      In addition, you need to decide whether you want to synchronize the studio's video output to an external source, because the Genlock synchronization option must be ordered separately. 
      You also need to decide on the method of connection. An analog studio is usually delivered with patch cables for each of the boards, which is not always convenient. To organize all connections securely and neatly in a standard rack or on a table, you can purchase additional breakout box(es). 
      Once you have selected all the necessary components, you can estimate the cost of your studio order from the price list and then submit the order, stating your details so that order processing can begin.

1.2  How is the studio output signal synchronized to an external source (Genlock)?

The studio's output board is used as the device for external synchronization: one of its input interfaces is activated for inputting a clock signal from an external source. In most cases a separate input-output board of the appropriate format is used for this. In some cases it is possible to use a multi-channel card simultaneously for signal input and output, but the output board must then have at least one free input channel (not used as an input for the studio). In any case, the Genlock option is purchased separately.

1.3  What is included in the Virtual studio delivery set?

Sorry, this section is being updated.

1.4  Is it possible to export virtual sets from 3DS MAX via the software included with the FocusLite virtual studio and work with them further?

The specification of the Lite products states "with no ability of designing virtual scenes". It means that you cannot create projects for broadcast programs yourself. Lite studios are designed for playing ready projects of virtual sets with live actors. Projects can be ordered from any designer who can work with the full products from the Focus set (in the HotActions application).


2 When purchasing

2.1  Where is the output signal type (YUV, S-Video (Y/C), etc.) specified in the virtual studio?

In Focus studios the output signal type is selected in the application that configures the input-output boards, usually FDConfig2 (or FDConfig for FD300 boards). Board settings are described in detail in the document on controlling and configuring the FDExt series boards. The main specifics of configuring various input-output boards for work in Focus studios are described in the hardware configuration documentation.

2.2  Where may I get HotActions project samples?

Examples are supplied as standard on the hard disk of the delivered system (usually in the D:\Focus_samples directory) and on the accompanying DVD-ROM. The original 3DS MAX scenes for some example projects are also recorded there (for the full-featured version of the studio). 
If for some reason the examples are lost, you can contact the technical support department and agree on a method of delivering a copy of the examples to you.


3 General questions

3.1  What is the main difference between the Focus virtual studio and other studios?

The first fundamental difference between the Focus virtual studio and other virtual studios is the ability to obtain high-quality results without the use of expensive camera tracking systems and/or building real scenery. Video images of live actors are transmitted into the studio from static cameras, and these images are then integrated into the virtual scene created in a 3D editor application (3DS MAX or MAYA) as ordinary textures. These video textures can be put over any 3D scene object: a plasma screen, a ball, a car. The object can be modified in any way: cut, turned, bent and so on. The same is impossible in other virtual studios, where the actor's image is usually flat and located between the back and front backgrounds. In such studios you need a lot of space when shooting in order to move the camera back. The Focus studio makes it possible to imitate camera movement from a distance of 1000 m in a small space (2x3 meters) with a green or blue background.

Another fundamental difference is the availability of tools for building an operator interface for each particular scene (application). For example, for filming a weather forecast, in most cases an interface of only 3-5 buttons is sufficient, each triggering the actions of the corresponding stage of the shooting process. Only the buttons relevant to the current stage of the production are active. This minimizes the possibility of human error during shooting and allows low-skilled staff to be used in everyday work. Training in this case is many times easier than for standard television equipment such as mixers or traditional virtual studios.

3.2  What is HotActions?

HotActions is the name of the main Focus studio application.

There are three versions of the application: 
HotActions - the full-featured application that includes tools for creating projects (virtual scenes and operator interfaces of the studio); 
HotActions Live - a player of complete projects with the ability to change the parameters of virtual scenes and interfaces defined by the authors; 
HotActions Design - a tool for designers (developers) of virtual scenes and interfaces for remote work on any computer (without the ability to work with video and audio signals).

3.3  Can I update the software myself?
The support department may refuse further support of the system after independent modification (except updating the samples). That is why you should update the software in agreement with the support department and in accordance with its recommendations.
3.4  I need to use additional video signals (video cameras) in the studio. Also, is it possible to replace old PC components with new ones (e.g. the video card)?
To work with additional sources you need to purchase additional capture boards, provided the PC system board has free PCI slots. The boards will be sent to the user and should be installed independently, following the support department's recommendations.

     Updating of studio components is done by full replacement. A new studio will be sent to the user with a temporary registration file. After some time (up to a month), when work with the new studio has been mastered, the user sends back the boards (FD300 and DVM) from the previous studio. Our company will then provide an unlimited registration file for working with the new studio.
      To purchase FD300 board(s) (for working with additional video sources), or to update studio components, contact the support department and provide your account details so that our company can draw up an invoice.

4 Connectors and commutation

4.1  Connectors and commutation
(The section is under development)

5 Light, background and color keying

5.1  We want to make a background for chroma keying by painting the floor and walls green or blue. What paint should we use?
Ordinary store-bought paint of blue or green color won't do: it usually has specks (even matte paints) and a color unsuitable for keying, even if the painted surface looks appropriate. There are specialized paints for keying (for example, from Rosco and others). These paints may seem expensive, but you get perfect quality. It is better to buy good-quality paint once than to spend time later solving keying-adjustment problems caused by bad paint. If the background is not suitable for keying, neither the support department nor the application settings may be able to help you adjust high-quality keying.
     Use a primer before painting. Companies specializing in such paints usually also offer a primer. A specific primer is not always necessary, though; you may use any white one. The primer is covered with several layers of paint.
     Besides paint, you may use a folding portable background, a special fabric for keying (hung as a movable curtain), or special linoleum.
      Remember that the quality of light (there should be a lot of it, evenly distributed) and the quality of the video signal are very important for good keying.
5.2  Can you give recommendations on lighting?
To light your studio professionally, it is necessary to involve an engineering estimate. In general, two types of lamps are used: fluorescent tube lights for the background and projectors for the actors. Distribute light evenly over the background (including the floor) and remove specks on the actors with back light. Don't forget about the main (key) light.
Examples of light arrangement can be found here.

5.3  Why is there a ripple on the actor's contour when keying is being adjusted?
Typical problems often appear when the "actor's" colors are similar to the "background" color (blue or green). This happens in case of low-quality light, a poor background (e.g. one that is too grey), an unsuitable choice of the actor's clothes and so on. Moreover, the camera and its settings can spoil even good conditions in the shooting area; sometimes, for example, it is enough to switch "sharpening" on.
     The mask border (transparent/non-transparent) is very sensitive to any noise in the source signal. If the colors of the actor and the background are too similar, keying becomes a kind of noise amplifier - there is no room to create a smooth color transition from actor to background. The background color is transparent, but the adjacent color (the actor's) is not. In the limit, the transparency mask is reduced from 256 possible gradations of grey to binary black-and-white (0/255). In this case even the slightest noise in the source signal results in mask chattering.
     Our recommendations: adjust the shooting conditions, camera settings and signal processing so that the colors of the background and the actor differ as much as possible. Note that white, grey and black are practically always present among the actor's colors. In addition, the spot of background colors on the histogram should be as small as possible (no gradients or color mixtures) and as far as possible from the centre (black-and-white colors).

6 Data and formats

6.1  How can I learn the number of frames, the frame size and the codec of a compressed video file?
This information can be found on the Summary tab of the Properties dialog, opened via the video file context menu, in the Advanced display mode. The Width, Height, Frame rate and Video compression values are shown on this tab.

More information on video file formats can be obtained with the MediaInfo and GSpot utilities.
6.2  What is the difference between playing a video file via the FILE_* and MSDS_* streams?

(Applies to HotActions 1.6)

The difference is in the playback mode (order of field display) specified by default. When playing via FILE_*, the lower field is shown first by default (FORMAT LFF). When playback goes via MSDS_*, the field display order is detected automatically (FORMAT AUTOFIELDS).
Take into account that in the automatic field-order detection mode, for MPEG2 files the information on field order is taken from the file itself. For all other file types, detection depends on the availability of this information in the system. If there is no such information, the video file will be played back in progressive mode rather than interlaced. That is why you should specify the field display mode explicitly (LFF, UFF or NOFIELDS) when playing video files with interlaced scanning.
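For example, assuming the field-order format is set with the same RENDER.VIDEO.*.FORMAT command that this FAQ quotes for MIP mapping (the stream name FILE_1 is only illustrative), forcing the lower-field-first order for an interlaced clip might look like:

    RENDER.VIDEO.FILE_1.FORMAT = LFF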

6.3  When imitating the virtual camera tracking in/out to a plasma monitor integrated into the scene while a video clip is playing, the image on the monitor is blurred and the monitor's edges flicker. Is it possible to improve the image quality?

(Applies to HotActions 1.6)

If an interlaced video file is played back at a small scale, one of the fields is not displayed and a flicker effect (of pixel size) appears. Flickering can be reduced via the trilinear MIP mapping mechanism (FORMAT MIPMAP). In other words, it is recommended to use the commands
RENDER.VIDEO.VideoStreamName.FORMAT = MIPMAP - turns the image filtering mechanism on;
RENDER.MATERIAL.MaterialName.MIPMAPLODBIAS = fBias - specifies the degree of filtering.
As the problem concerns video file playback, replace VideoStreamName with MSDS_i or FILE_i, depending on which stream is used for file playback. Also, replace MaterialName with the name of the material where the stream is displayed.
     Remember that this filtering results in image blurring, especially at high factor values. That is why we recommend selecting the factor value experimentally, so that there is no flickering and at the same time the image is not blurred.
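As an illustration, suppose the clip on the monitor is played via the MSDS_1 stream and displayed on a material named Monitor (both names are assumptions for this sketch); the bias value is only a starting point to be tuned experimentally, as explained above:

    RENDER.VIDEO.MSDS_1.FORMAT = MIPMAP
    RENDER.MATERIAL.Monitor.MIPMAPLODBIAS = 0.5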


7 Options and configurations

7.1  Is it possible to use external devices to operate the virtual studio? For example, to run script commands when the device's contacts close? Or to use Tally control of the active video cameras?

(Applies to HotActions 1.6)

Chapter 15 of the Focus Virtual Studio. Using the Script Commands. User's Guide contains information on working with external devices via GPI. A device for transmitting GPI signals can be purchased from our company (contact the sales team via sales@sl.iae.nsk.su) or from suppliers of our company's products. Detailed information on the Forward GPI Box device is available on the corresponding page of our website. The price is indicated in the price list.
      The Forward GPI Box device with a USB connection is designed to connect two input (controlling) and two output (controlled) GPI signals via BNC connectors. After connection to USB, the operating system reports that a new device has been found and drivers need to be installed. Information on Forward GPI Box drivers can be found on the website of the device's chip developer. After driver installation, two virtual COM ports appear in the PC ports list (Ports (COM & LPT)).
You can control these ports with the commands described in chapter 15 of the Focus Virtual Studio. Using the Script Commands. User's Guide. To support these commands in the HotActions application, install the GPI commands control module (VSGPIControl.dlf). Its installer is included in the studio installation set on disc D:, in the D:\FocusVS_Install\VSGPIControl folder. If it is absent from the supply set, contact support at vrset@sl.iae.nsk.su.

7.2  Is it possible to work with a video signal and adjusted keying without creating a 3D scene? For example, can I place a keyed actor on an arbitrary background (a graphic image or a clip)?

(The next answer applies to HotActions 3.0)

No, the "Focus" virtual studio does not work without 3D scenes. However, the standard set of example virtual scenes included in each delivery contains a special scene for working with simple backgrounds - pictures and videos. Working with this example is very simple and intuitive (in the scene settings you specify the names of the required background files).

(The next answer applies to HotActions 1.6)

The C:\Program Files\ForwardT Software\KeyConfig folder contains the FDKeyConfig.exe and FDKeyConfigPro.exe applications for working in this way. The program interface is quite simple and can be mastered quickly. There is a user's guide on working with these applications. If any difficulties appear, contact support.


8 3D scene design

8.1  A plane for an actor occupies the required position in the scene in 3DS MAX. Why is it tilted after export to HotActions? How do I export the actor to the virtual studio in the normal position?
The directions of the local coordinate axes of a created object may not coincide with the axis directions in HotActions: the "Z" axis may not point upward, and the "Y" axis may not be perpendicular to the back of the actor. To correct this, select the "Local" mode in the "Reference Coordinate System" list.
Click the "Hierarchy" icon, then the "Affect Pivot Only" button, and check the directions of the "Pivot" axes of the actor's plane.

If they do not coincide with the required ones, turn them by pressing the "Align to World" button.
8.2  Does the number of triangles in a scene created in 3DS MAX influence the quality of the output image? If yes, what is the limit of scene triangles for the image to remain stable, uniform and of high quality (without frame skipping)?
The quality of the studio output image depends on many factors, both in creating the scene in 3DS MAX itself and in the further creation of the project on its basis in HotActions. If the calculation and display time of a frame exceeds the frame time (40 ms in the PAL format), the frame won't be displayed at all, i.e. it will be skipped. Calculation and display time depends not only on the number of scene triangles, but mainly on the number of video streams (LIVE_* and MSDS_* (FILE_*)). Frame skipping also depends on, for example, the number of light sources in the scene, the number of objects and their texturing, the scene's complexity and hierarchical structure, and the queue of scripts in the project that operate scene objects. Without taking these influences into account, one can assume approximately 1,000,000 scene triangles with simple non-textured objects and no light sources for the GeForce 480 video cards supplied with the studios. This value is approximate and conventional.
      To take all the mentioned factors into account while adjusting the display, it is recommended to switch the load indicator on by pressing Shift + ~. The indicator data is described in chapter 7 of the HotActions User's Guide.
      As optimizing the scene and project requires some skill, we recommend contacting the support department at vrset@sl.iae.nsk.su. If you send your project and/or 3DS MAX scene by e-mail or via FTP, you will get detailed instructions on optimizing the studio.
8.3  Is it possible to imitate tracking in and tracking out by means of the virtual studio if the camera directed at the actor is static?
Any scene created for the virtual studio in a 3D graphics program (3DS MAX, for example) must include at least one virtual camera through which the scene is seen in the studio. If that camera is animated, all objects in its field of view are automatically zoomed during animation playback, depending on the distance to the camera. A live actor in the virtual studio is an object - a plane onto which the image from the real static camera is placed. This object-plane is zoomed together with the other scene objects during animation playback, which creates the illusion of camera motion.
      For detailed information on preparing the corresponding scene components in 3DS MAX, see the Creating 3D Scenes. User's Guide. The commands for operating scene objects, i.e. replacing a texture with the actor's image, playing object animation, etc., are described in the Script Commands. User's Guide.
8.4  How can I work with long and close shots in a scene?
Static cameras are used for shooting in virtual studios. To display the actor, for example, in half-length and full-length positions, work with two cameras. Each camera is connected to the studio, and a keying table is created for its image. The correspondence of virtual cameras to static ones is set with the commands for operating scene objects described in the Script Commands. User's Guide. Virtual cameras can be directed at different views of the scene. Switching between virtual camera images in the scene, with simultaneous switching between the real cameras, is done via control commands.
      If you imitate tracking in on the actor (the image from a static camera) with a virtual camera animated in 3DS MAX, then after this imitation you can improve the quality of the actor's image using the 3D Overlay mode. This mode is intended for large actor images close to full-screen. The image from the static camera is embedded in the virtual scenery without zoom scaling, "texel-in-pixel", so image quality is not lost. The 3D Overlay mode has a limitation: while it is on, it is impossible to play animation of the virtual camera directed at the actor. That is why you should first imitate tracking in and then switch to this mode. The reverse is done in the corresponding order: first switch the 3D Overlay mode off, then imitate tracking out. More details on 3D Overlay can be found in the Script Commands. User's Guide, section 7.3.
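The sequence described above can be sketched with commands quoted elsewhere in this FAQ; all names here are assumptions (actor1 stands for the actor's material, ZoomIn/ZoomOut for tracks playing the virtual camera animation, and the 0,180 frame range mirrors the TRACK example in section 8.9). Switching OVERLAY to 0 to leave the mode is assumed by analogy with OVERLAY = 1:

    TRACK.ZoomIn.START = 0,180
    RENDER.MATERIAL.actor1.OVERLAY = 1

and, in reverse order:

    RENDER.MATERIAL.actor1.OVERLAY = 0
    TRACK.ZoomOut.START = 0,180

In practice the OVERLAY command would be issued in a later Action, after the tracking animation has finished playing.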
8.5  What should the dimensions of the actor's video texture be if I plan to switch the 3D Overlay mode on after imitating tracking in on the actor? The actor's dimensions probably change a bit due to scaling, so that when 3D Overlay is switched on a noticeable jump occurs, i.e. the actor "jumps". How can I eliminate this effect?
If the virtual camera displaying the actor in the 3D Overlay mode is not animated, it is enough to inscribe the rectangle with the video texture into the view angle of the virtual camera. In this case the 3D Overlay mode is not expected to be switched off, so there will be no "jumping" caused by switching.
      Sometimes it is necessary to switch the 3D Overlay mode on smoothly, for example when an animated virtual camera tracks in on the video texture with the actor's image and the mode is then switched on. In this case the rectangle with the video texture must be inscribed exactly into the virtual camera's view. Specify the exact location of the virtual camera relative to the rectangle with the video texture when creating the scene in 3DS MAX. The following conditions must be satisfied:
  1. The side lengths of the rectangle must have a 4:3 aspect ratio (16:9 for HD);
  2. The virtual camera must be directed exactly at the rectangle's centre and be perpendicular to its plane;
  3. The distance from the camera to the rectangle is determined by the camera's field of view (FOV) and the rectangle's dimensions.

8.6  How can I imitate object shadows in a scene?
Object shadows are not supported in the virtual studio. However, the distribution of illumination (including shadows) can easily be imitated by preparing materials with a rendered light distribution in advance. Then you only need to assign the prepared materials to the objects. In 3DS MAX (starting from version 5.0) the distribution of illumination is rendered via the Render to Texture option in the Rendering menu. Remember that the memory of the graphics accelerator is limited, and exceeding it may cause frame skipping. That is why this imitation should be applied only to the most spectacular textures. Don't forget to optimize the dimensions of these textures (section 2.1.10 of the Creating 3D Scenes. User's Guide).
      Recommended settings for rendering the textures are presented in the picture below. We suggest using automatic adjustment of texture dimensions with the Use Automatic Map Size option. Note that this option does not work correctly in all 3DS MAX versions; sometimes you need to check the result visually and correct the dimensions manually.


8.7  How can I imitate object reflections in the scene?
There are two ways to imitate reflections:
  1. place mirrored copies of the objects under a "semitransparent" floor (a plane with a partially transparent material used as the floor);
  2. place a copy of the object (its reflection) on the reflective surface and specify a low opacity (Opacity) for this copy's material.

8.8  Why do certain defects, e.g. prominent black borders, appear on some objects when the scene is rendered?
Such defects may be observed if objects with transparent material(s) are present in the scene. The sorting order of objects when they are displayed differs for transparent objects (assigned a material with an alpha channel) and opaque objects (assigned a material without an alpha channel). The sorting order of different objects in a scene is described in more detail in section 3.6 of the Script Commands. User's Guide.
The scene is recalculated anew in every frame, with sorting by the centres of the spheres that envelop the objects. When objects move in the scene, if an object with a transparent material is very extensive, the centre of its enveloping sphere may end up behind the centre of the sphere enveloping an object with an opaque material. As a result, the object can be invisible in some frames. In this case we recommend prefixing the object's name with a special symbol (for detailed information on this issue, see chapter 8 of the Creating 3D Scenes. User's Guide). This solution is not entirely efficient, because some other image defects may still occur (e.g. shadows, borders on contours and so on). A better way to display large objects with transparent materials correctly is to segment them into small parts. Segmentation must be done so that the objects with transparency are displayed correctly.
The supplied set of samples contains an example of such segmentation (the Interview sample). In the Interview.3d scene the floor plane (FLOOR) is organized as a mosaic of six segments (Floor1a, Floor1b, Floor2, Floor3, Floor4, Floor5) with the identical Floor material.
8.9  How can I make the image in the virtual studio more realistic without increasing the graphics processor load?

(The next answer applies to HotActions 1.6)

The capabilities of graphics accelerators are constantly expanding, but real-time rendering quality will always be worse than that of slow but high-quality renderers. This problem can be solved by using prerendered animation and/or prerendered 3D scene textures.
The simplest (and quite typical) example of animation prerendering is a virtual tracking shot from a general view to a close view of the actor in a 3D TV studio model. In this case the entire camera passage is rendered into an animated clip (without limitations on the use of light sources, types of materials, effects, plugins and so on). The camera itself, with its animation, and the rectangle for the actor are exported to the virtual studio format. The rectangle's material is later replaced by the actor's image from a real camera. In the virtual studio, the prerendered clip with the virtual camera passage is played synchronously on the scene background material (BGND). The result is a tracking shot on a "live" actor in a high-quality rendered environment.

  • When exporting to the studio format, select (Select) only the elements necessary for work in the virtual studio. Click the "Export Selected" option on the exporter panel.
  • For outputting the image in the virtual studio, use the same camera as was used for prerendering.
  • When rendering the animation clip you may use all the features of 3DS MAX, plugins and external applications. However, the part of the scene exported to the virtual studio must not contain materials, processes, etc. that are not supported in the virtual studio. The following items are important:
    1. The frame rate in a prerendered clip must be:
      a) for PAL - 50 frames/sec;
      b) for NTSC - 60 frames/sec.
    2. It is recommended to use the SoftLab-NSK MPEG2 I-Frames (MPEG2) codec.
    3. The output file format must be "Progressive" ("Render to Fields" should be "OFF").
    For example, you may have the following settings when rendering from 3DS MAX:

    The following commands should be present in the studio's initializing Action:
    RENDER.VIDEO.MSDS_1.DATA = "prerendered_animation.avi"

    These commands create the MSDS stream, specify the media file played through it (in our case, the clip with the prerendered animation) and assign this stream to the scene background. The background material with the BGND name is created automatically when the scene is exported to the virtual studio. In general, it is not obligatory to assign the MSDS stream to the background. Sometimes it is more reasonable to assign it to a rectangle located in front of the virtual camera; the rectangle should be linked to the virtual camera, and it is desirable to switch the Overlay mode on for the rectangle's material. The Action that synchronously starts the clip with the prerendered animation and the 3D scene animation must contain the current camera settings, the stream launch and the scene animation start, e.g.:
    TRACK.Intro.START = 0,180

    The current camera should be the same one that was used for rendering the «prerendered_animation.avi» animation clip.

          A sample project with prerendered animation is included in the supplied samples set beginning from the VSHotActions_160 version. It is located in the D:\FocusVS_Samples\sunVSset_103 folder. The initial 3DS MAX scene is in the D:\FocusVS_Samples_Sources folder (VSSource_sunVSset_SD_3d.zip and VSSource_sunVSset_SD_render.zip).


9 Scenario scripts

9.1  When switching to a camera with an Action, the image from the camera is not displayed in the scene immediately. First a still image (assigned to the object-actor in 3DS MAX) flashes, and only then the image from the camera appears. How can I eliminate this flash?

(The next answer applies to HotActions 1.6)

Most probably, the Action of the button that you press contains commands that create and format the video stream.
Executing these commands takes considerable time, during which the still image of the actor (from 3DS MAX) is displayed. It is recommended to execute such commands in the initializing Action, usually named Init in projects.
      The commands that start the video stream and replace the material with the image from the camera can also be executed in this Action.
      Sometimes it is necessary to start the stream in a separate Action. In this case you need to insert a command that waits for the start between these commands.
This gives the stream time to start, so that the image from the camera appears in the scene only then. Otherwise the still image will flash.

9.2  When I switch the Overlay mode on via the ACTION.START = "OVERLAY_ON" command in the Action that switches the camera to the actor, the image from the camera in the scene is still blurred. Why does Overlay not respond?

(The next answer applies to HotActions 1.6)

The Overlay mode is activated via the
RENDER.MATERIAL."necessary material for the actor".OVERLAY = 1 command.
Most probably, in the OVERLAY_ON Action that you activate via the ACTION.START = "OVERLAY_ON" command, this command is assigned to another material. In this case you should switch Overlay on for the actor directly, via the
RENDER.MATERIAL."necessary material for the actor".OVERLAY = 1 command,
rather than via ACTION.START = "OVERLAY_ON".
To identify which material Overlay is switched on for, first click the Init All button in the application with the project loaded. Then open the context menu of the Action for which Overlay must be on by right-clicking, and select and execute the Test command in this menu.

The Action will be executed, and the corresponding image from the camera will be displayed in the Render Output window.
      After this, when you point the cursor at the image from the camera in the Render Output window, the names of the object-actor and its material (in brackets) are displayed in the lower left corner of the application.
In this example the command activating the Overlay mode would look like RENDER.MATERIAL.actor1.OVERLAY = 1.

9.3  What scripts are used to switch on the Tally indicator of the camera that is currently shooting?

(The following answer applies to HotActions 1.6)

First, create a signal in the virtual studio via the GPI.OUT.Name.CREATE = GPI_OUT_DEVICE_NAME command,
where: Name (e.g. OUT1) is a user-specified signal name;
GPI_OUT_DEVICE_NAME identifies the port to which the controlling device is connected (e.g. COM2)
and the port output index (e.g. 1).
The creation command can be executed as a separate Action script or among the other commands of the initializing Action (Init All). The output GPI signal is asserted (i.e. the Tally indicator is switched on) by executing the GPI.OUT.Name.SET = 1 command. The signal is switched off via the GPI.OUT.Name.SET = 0 command.
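As a sketch, the three commands together might look as follows. The device-name string on the right of the CREATE command is an assumption (check the Focus Virtual Studio. Using the Script Commands user's guide for the exact form your version expects):

```
GPI.OUT.OUT1.CREATE = "COM2:1"
GPI.OUT.OUT1.SET = 1
GPI.OUT.OUT1.SET = 0
```

Here OUT1 is the user-specified signal name and "COM2:1" stands for output 1 of port COM2.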

9.4  What scripts are used to execute actions in the virtual studio when a GPI signal from a controlling device is switched on?

(The following answer applies to HotActions 1.6)

First, create a signal in the virtual studio via the GPI.IN.Name.CREATE = GPI_IN_DEVICE_NAME command,
where: Name (e.g. IN1) is a user-specified signal name;
GPI_IN_DEVICE_NAME identifies the port to which the controlling device is connected (e.g. COM3)
and the port input index (e.g. 0).
The creation command can be executed as a separate Action script or among the other commands of the initializing Action (Init All). Further actions in the studio upon an input GPI signal being switched on are implemented via the SYS.WAIT = Event wait commands. Detailed information on these commands can be found in chapter 8 of the Focus Virtual Studio. Using the Script Commands user's guide. As described in chapter 15 of that guide, when the input signal is switched on, the SYS.EVENT = "GPI.IN.Name.1" message is displayed in the Debug Output window; an analogous message appears when the signal is switched off. The text of these messages must be used in the wait commands. The scripts look like, e.g.:
SYS.WAIT = "GPI.IN.IN1.1"
ACTION.START = "Green backdrop"
When the Action with these commands is executed, the wait command SYS.WAIT = "GPI.IN.IN1.1" is placed in the command queue. When the signal is switched on, the next command, ACTION.START = "Green backdrop", is executed, i.e. the "Green backdrop" Action starts.
To execute different actions when different signals are switched on, it is necessary to:

  1. create several signals with different names;
  2. execute Actions containing wait commands for these signals.

All these wait commands are placed in the queue, and the planned action is executed when the corresponding signal is switched on. For example, suppose it is necessary to create two signals, IN1 and IN2, and to execute the corresponding Actions:
SYS.WAIT = "GPI.IN.IN1.1"
ACTION.START = "Green backdrop"
SYS.WAIT = "GPI.IN.IN2.1"
ACTION.START = "Blue backdrop"
When the first signal (IN1) is switched on, the "Green backdrop" Action is executed; when the second signal (IN2) is switched on, the "Blue backdrop" Action is executed.

However, executing these commands performs the required action only once, on a single switching-on of the signal. For the actions to be executed on subsequent switchings as well, the wait commands must be executed again, i.e. placed in the queue anew.
If the Action's commands include a command that starts the Action itself, its execution becomes looped, and the required actions will be executed every time the operators switch the GPI signals on.
For example, if the Action with the
SYS.WAIT = "GPI.IN.IN1.1"
ACTION.START = "Green backdrop"
commands is named Action1, it will look like:
SYS.WAIT = "GPI.IN.IN1.1"
ACTION.START = "Green backdrop"
SYS.WAIT = "ACTION.Green backdrop"
ACTION.START = "Action1"

where SYS.WAIT = "ACTION.Green backdrop" waits for the "Green backdrop" Action to complete.


10 Hotset – operator interface

10.1  (This section is under development)

11 Input and output video quality

11.1  Why is the output studio image blurred?

(The following answer applies to HotActions 1.6)

If the image is quite large, almost full-screen, its sharpness can be increased by switching the Overlay mode on. The mode is switched on via the RENDER.MATERIAL."necessary material for the actor".OVERLAY = 1 command. For more information on switching the Overlay mode on, see either section 7.3 of the Focus Virtual Studio. Using the Script Commands user's guide or the answer to question 9.2 in the previous section.
      If software version 1.62 or newer is installed and image output is implemented without the DVM board, add the RENDER.FLICKER = fFlicker command to the initializing Action (the command is not included in the prompts). Then vary the fFlicker value (on the right), starting from 0.4 and then increasing or decreasing it. In this way you can set the required degree of image blurring.
      Also, check the video card settings. For the NVIDIA GeForce GTX 480 cards supplied with virtual studios we recommend selecting the 4x anisotropic filtering mode and the 32x antialiasing parameter in the NVIDIA Control Panel.
      To set these options, go to the Manage 3D settings tab in the 3D settings section (the left part of the settings dialog window).
To set Antialiasing - Setting to 32x, first select Override any application setting in the Antialiasing - Mode drop-down list.

When Override any application setting is selected for Antialiasing - Mode, the antialiasing mode in the virtual studio is set according to the mode specified in the NVIDIA settings (32x, as recommended above), and the antialiasing mode set in the Render Options dialog (detailed information is given in the answer to the next question) is ignored.
If Application-controlled is selected for Antialiasing - Mode in the NVIDIA settings, the NVIDIA setting is ignored and the target image is antialiased using the parameters specified in the Render Options dialog (see the answer to the next question, 11.2).
      If you use the DVM board for output, you can try varying the value of the Flicker parameter in the DVM module settings (Options dialog (F10) > Format button of the Output group > DVM tab > Output section).

11.2  When an actor is tracked in or out (animation playback in the virtual studio scene), object borders become stepped. Is it possible to smooth them?

(The following answer applies to HotActions 1.6)

First, check that the smoothing mode is specified correctly in the video card properties; instructions are given in the answer to the previous question.
Second, the antialiasing mode can be configured in the Render Options dialog, opened by pressing F11 in HotActions. The recommended antialiasing settings are 7 for Multisample Quality with 4_SAMPLES selected for Multisample Type
(when Application-controlled is selected for the NVIDIA Antialiasing - Mode setting; detailed information is given above).

Detailed information on rendering settings can be found in section 8 of the Hot Actions User’s Guide.
      You can also execute the RENDER.FLICKER = 0.4 command in the initializing Action to reduce the stepped-border effect when objects move in the scene. This command is absent from the prompts; to use it, type it into the Action. To get the required image blurriness, vary the initial value on the right.
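For example, the initializing Action could contain the single line below; 0.4 is only a starting value, tuned by eye afterwards:

```
RENDER.FLICKER = 0.4
```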

11.3  How can I eliminate or reduce the flickering of thin objects in the scene ("jarring")?

(The following answer applies to HotActions 1.6)

You can vary the value of the Flicker parameter in the DVM module settings (Options dialog (F10) > Format button of the Output group > DVM tab > Output section). Remember that varying this parameter affects the degree of object blurriness, which can unfavourably affect how actors look in the scene.
      If you work with version 1.62 or higher and no DVM board is present, you can use the RENDER.FLICKER = fFlicker command for the flicker filter settings. Execute the command in the initializing Action (detailed information is given above).


12 Sound configuration

12.1  How do the audio settings differ between versions 3.x and 1.x of HotActions?

HotActions 1.x versions work correctly with sound only on FD300 boards, using the hardware audio processing built into the board. HotActions 3 works both with FD300 boards and with the FDExt series of cards, as well as with some third-party cards without built-in sound processing (hardware mixers). For sound control, HotActions 3 uses standard actions and controllers embedded in the user interface of virtual scenes. More detailed information is given in the document on setting up the sound (at present available only in Russian).



13 Typical problems

13.1  Why do problems sometimes occur when HotActions project samples are launched?
The folder with the project file (*.vsp) usually contains a Readme.txt file. As a rule, this file lists the versions of the software components with which the project works without errors. It may also describe the hardware configuration required to work with the project, e.g. the number of connected input video streams (LIVE_* or FD300 boards in the system). Make sure that the virtual studio's hardware configuration corresponds to the one specified, i.e. that all hardware and software components of the required versions are installed.

14 Other questions

14.1  How can an actor imitate leaning against a virtual table in the scene during shooting?

Think the 3D scene over before creating it in a graphics editor (as a rule, 3ds Max). The most preferable solution is to seat the actor at a real table. In this case the actor in the virtual scene is represented as a person together with a table: keying is adjusted for this combination, which is then embedded into the virtual set. It is also possible to seat the actor at a prepared module (specially painted for keying) and then embed the result into the scene at a virtual table. In this case the embedding should be thoroughly thought through in advance by the designer creating the 3D scene, with all specifics of the actor's position taken into account. Note that if the person being shot takes something in his or her hands (a sheet of white paper, for example), that object may be perceived as background and processed by the keyer because of background catchlights reflected on it. As a result, the 3D virtual set may show through the sheet held by the actor.

            © Copyright 2014 D-Graphica. All rights reserved     info@d-graphica.com