Developer Guide

  • 2021.2
  • 06/11/2021
  • Public
Contents

Developer Workflow Overview

Before using the data streams optimizer and cache allocation tools on the same system, see the instructions in Compatibility between Data Streams Optimizer and Cache Allocation to avoid possible technical and performance issues.
Working with the tool involves two main phases, each tied to one of the following environments:
  • The preproduction environment is expected to be the same as the production environment minus the machinery. The interfaces to the machinery are typically simulated (for example, send a signal for movement, sense actual movement that occurred). In addition, the preproduction environment is expected to have testing, debugging, and analysis tools that are not necessarily part of the final product. Typically, all capabilities are tested in the preproduction environment to confirm their functionality, and only then deployed to the production environment.
  • The production environment refers to the computing hardware and software you will employ to execute your revenue-generating use case. An example is an Intel processor with Linux* OS and software to perform parts manufacturing using lathe machinery.

Preproduction Phase

You will use the tool in your preproduction environment to identify a suitable tuning configuration and validate that it provides the needed tuning.
For accurate tuning, the target system must be a working non-production environment that matches the production environment setup, including hardware, software, and the applications that run on the machine. The end result of the preproduction phase is the final tuning configuration that you will deliver to your customers.
At a high level, the preproduction phase consists of preparing your inputs and then running the tool, which automates tuning and validation.
For validation, you will need to provide two types of applications and two files, as described below:
  • Test workload: This term refers to your real-time application or any combination of applications that can validate different aspects of performance, such as latency and power consumption. The test workload confirms that the data streams optimizer can sufficiently optimize the target system to meet your requirements.
  • Workload validation script: The workload validation script runs the test workload you created and validates the performance requirements in a way that the tool can read. This script is the single application that the tool interfaces with. The script must be able to run your test workload and return 0 if your requirements are met or 1 if they are not. The steps below describe how the tool uses the script.
  • Requirements file: The requirements file is a JSON file that contains requirements for data streams, such as latency and bytes per transfer, which the tool uses to select a tuning configuration. You can choose to tune one or more of the streams listed in Data Streams Supported. If you plan to use software SRAM with the data streams optimizer, first ensure that the SoftwareSRAM compatibility option is enabled in the requirements file.
  • Environment file: The environment file is a JSON file that contains the information needed to automate the tool workflow.
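As a purely illustrative sketch of what a requirements file might contain (the actual schema is defined by the tool's documentation; every field and stream name below is a hypothetical placeholder except the SoftwareSRAM option mentioned above):

```json
{
  "requirements": [
    {
      "stream": "pcie-to-memory",
      "latency_us": 10,
      "bytes_per_transfer": 64
    }
  ],
  "compatibility": {
    "SoftwareSRAM": true
  }
}
```

The environment file follows the same idea: a small JSON document whose keys describe your target setup so the tool can automate its workflow.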
Steps:
  1. Confirm that the Data Streams Optimizer Setting is enabled.
  2. Create the required inputs for the data streams optimizer: the test workload, workload validation script, requirements file, and environment file. The tool uses scripts that automate various aspects of the flow. The scripts support the target environments described in Default Setup, but you can customize them for other OSes or firmware. For more information about customizing these scripts, including examples of scripts for Windows* OS, see Custom OS/Firmware Support for Data Streams Optimizer.
  3. Run the data streams optimizer with the input files. The rest of the workflow is automated by the tool.
  4. The tool finds a tuning configuration based on the information in the requirements file. The tool signs the tuning configuration and wraps it in a capsule. A capsule is a binary used to change a system’s tuning configuration by updating certain areas of the firmware, known as “subregions.”
  5. The tool applies the tuning configuration capsule to the target system.
  6. The tool runs your workload validation script. The script tests whether the workload ran successfully and met your performance requirements, and returns success or failure.
  7. If the script returns failure, the tool selects another configuration. The tool repeats the configuration and validation process until the requirements are met or the tool can’t identify any other configurations.
  8. If the requirements are met, the tool generates a tuning configuration file that specifies the selected configuration. You will use the file to clone the tuning configuration to your production systems.
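The pass/fail contract in step 6 can be sketched as a shell script. This is only an illustration of the 0/1 convention; `run_test_workload` and the 10-microsecond threshold are hypothetical placeholders for your real workload and requirement:

```shell
#!/bin/sh
# Hypothetical workload validation script illustrating the contract in
# step 6: run the test workload, then report 0 if the performance
# requirements are met and 1 otherwise.

MAX_LATENCY_US=10

run_test_workload() {
    # Stand-in for launching your real-time test workload; assume it
    # prints the measured worst-case latency in microseconds.
    echo 7
}

validate() {
    measured=$(run_test_workload)
    if [ "$measured" -le "$MAX_LATENCY_US" ]; then
        return 0    # requirements met
    else
        return 1    # requirements not met
    fi
}

validate
status=$?
# A real validation script would end with: exit "$status"
echo "validation exit status: $status"
```

The only hard requirement the tool imposes is the exit status: 0 when your requirements are met, 1 when they are not.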
The tool can’t provide a register-level description of the tuning configuration because some registers are proprietary. Instead, the tool provides a description of the tuning configuration in the form of messages. These messages describe the functional components affected by the tuning configuration, such as disabling C states or prioritizing traffic between PCIe and memory. For a complete list of messages, see Tuning Configurations.
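The iterate-and-validate flow in steps 4 through 7 can be sketched as follows. All names are hypothetical stand-ins for stages the tool performs internally and automatically:

```shell
#!/bin/sh
# Illustrative sketch of the tool's automated loop (steps 4-7):
# try candidate tuning configurations until the workload validation
# script succeeds or no candidates remain.

candidates="config_a config_b config_c"
selected=""

apply_configuration() {
    # Stand-in for: sign the configuration, wrap it in a capsule,
    # and apply the capsule to the target system's firmware subregions.
    echo "applying $1"
}

validate_workload() {
    # Stand-in for running the user-provided workload validation
    # script; here we pretend only config_b meets the requirements.
    [ "$1" = "config_b" ]
}

for cfg in $candidates; do
    apply_configuration "$cfg"
    if validate_workload "$cfg"; then
        selected="$cfg"
        break
    fi
done

if [ -n "$selected" ]; then
    echo "requirements met with: $selected"
else
    echo "no suitable tuning configuration found"
fi
```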

Production Phase

After selecting the final tuning configuration in the preproduction phase, you will need to apply the configuration to the production environment.
At a high level, the steps are as follows:
Steps:
  1. Confirm that the Data Streams Optimizer Setting is enabled.
  2. Get the tuning configuration file from the preproduction phase. The file contains the capsule.
  3. Run the data streams optimizer with the tuning configuration capsule as input.
  4. The tool applies the tuning configuration to the target system.

Security

The Data Streams Optimizer subregion in the BIOS capsule must be signed to preserve the integrity of the subregion data. If the signers of the subregion data are not also its producers, a trusted relationship must be established between signers and producers to ensure that the subregion data is free of malware before signing, preventing supply chain attacks.
In preproduction environments, during your experiments with performance, you can use test keys created during Intel® TCC Tools installation or generated by yourself. Keys are used by the Capsule Create Script.
In production environments, you must use securely generated product keys as described in the white paper Intel® Time Coordinated Computing (Intel® TCC) Security for UEFI BIOS.
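For preproduction experiments with self-generated test keys, the general sign-and-verify pattern can be illustrated with openssl. This is only a conceptual sketch, not the Capsule Create Script's actual key format or workflow; all file names are placeholders:

```shell
#!/bin/sh
# Illustrative only: create a self-generated RSA test key pair, sign a
# sample payload standing in for subregion data, and verify the
# signature. The real capsule signing flow is handled by the
# Capsule Create Script.

# Generate a 2048-bit RSA private key and derive its public key.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 \
    -out test_private.pem 2>/dev/null
openssl pkey -in test_private.pem -pubout -out test_public.pem

# Sign the sample payload with SHA-256.
printf 'sample subregion data\n' > payload.bin
openssl dgst -sha256 -sign test_private.pem -out payload.sig payload.bin

# Verify the signature; prints "Verified OK" on success.
openssl dgst -sha256 -verify test_public.pem -signature payload.sig payload.bin
```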

Next Steps

Start with the reference implementation described in Host-Target Reference Implementation.

Product and Performance Information

1. Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.