Python* API Reference for Intel® Data Analytics Acceleration Library 2020 Update 1

Result Class Reference

Provides methods to access final results obtained with the compute() method.

Public Member Functions

def serializationTag
 
def getSerializationTag
 
def get
 
def check
 
def allocate_{Float64|Float32}
 
def __init__
 
- Public Member Functions inherited from Result
def serializationTag
 
def getSerializationTag
 
def __init__
 
def get
 
def set
 
def check
 
- Public Member Functions inherited from Result
def __init__
 
def getSerializationTag
 
def check
 
- Public Member Functions inherited from SerializationIface
def serialize
 
def deserialize
 
def getSerializationTag
 
- Public Member Functions inherited from Base
def __init__
 
- Public Member Functions inherited from Argument
def __init__
 
def __lshift__
 
def size
 

Detailed Description

Deprecated:
This item will be removed in a future release.

Results of the AdaBoost training algorithm in the batch processing mode.

Constructor & Destructor Documentation

def __init__ (self)

Member Function Documentation

def allocate_{Float64|Float32} (self, input, parameter, method)

Allocates memory to store final results of AdaBoost training

Parameters
  • input: Input of the AdaBoost training algorithm
  • parameter: Parameters of the algorithm
  • method: AdaBoost computation method
Full Names
  • allocate_Float64 is for float64
  • allocate_Float32 is for float32
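The precision suffix selects the floating-point type of the allocated result storage. A minimal sketch of that naming convention, using a stand-in class (DummyResult and allocate_for are illustrative helpers, not part of the DAAL API):

```python
# Illustrative stand-in for a DAAL-style Result with precision-suffixed
# allocate methods; not the library's actual implementation.
class DummyResult:
    def allocate_Float64(self, input, parameter, method):
        return "float64 storage"

    def allocate_Float32(self, input, parameter, method):
        return "float32 storage"


def allocate_for(result, fptype):
    # Map a dtype name ("float64"/"float32") to the matching
    # precision-suffixed allocate method on the result object.
    suffix = {"float64": "Float64", "float32": "Float32"}[fptype]
    return getattr(result, "allocate_" + suffix)


result = DummyResult()
print(allocate_for(result, "float32")(None, None, 0))  # float32 storage
```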
def check (self, input, parameter, method)

check(Result self, Input input, Parameter parameter, int method) -> Status

def get (self, id)

Returns the model trained with the AdaBoost algorithm

Parameters
  • id: Identifier of the result, classifier.training.ResultId
Returns
Model trained with the AdaBoost algorithm
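The get(id) accessor looks up a stored result object by an integer identifier. A minimal sketch of this access pattern with a stand-in class (MockResult and MODEL_ID are hypothetical; the real identifier is classifier.training.model from classifier.training.ResultId):

```python
# Hypothetical stand-in for the DAAL Result accessor pattern;
# MockResult and MODEL_ID are illustrative, not the library's internals.
MODEL_ID = 0  # stands in for classifier.training.model


class MockResult:
    def __init__(self):
        self._results = {}

    def set(self, result_id, value):
        # Register a result object under its identifier.
        self._results[result_id] = value

    def get(self, result_id):
        # Return the object registered under the identifier,
        # e.g. the model trained with the AdaBoost algorithm.
        return self._results[result_id]


result = MockResult()
result.set(MODEL_ID, "trained AdaBoost model")
print(result.get(MODEL_ID))  # trained AdaBoost model
```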
def getSerializationTag (self)

getSerializationTag(Result self) -> int

def serializationTag ()


For more complete information about compiler optimizations, see our Optimization Notice.