Documentation


This is a Helm Chart* for TensorFlow* Serving.

This is a GNMT FP32 inference model package optimized with TensorFlow* for bare metal.

This is a BERT Large BFloat16 inference model package optimized with TensorFlow* for bare metal.

This is a BERT Large BFloat16 inference container optimized with TensorFlow*.

Intel® Connected Logistics Platform (Intel® CLP) was developed to offer a real-time asset tracking solution for the logistics industry.

Use TensorFlow* performance Jupyter* notebooks to analyze the performance benefit of Intel® Optimizations for TensorFlow*.

TensorFlow* with Intel® oneAPI Deep Neural Network Library (oneDNN), Scikit-learn*, and Intel® Distribution for Python*.

This is a Faster RCNN Int8 inference container optimized with TensorFlow*.

This is a Faster RCNN FP32 inference container optimized with TensorFlow*.

Wireless Network Ready Intelligent Traffic Management is designed to detect and track vehicles and pedestrians and estimate a safety metric.

XGBoost

XGBoost is a scalable, portable, and distributed gradient boosting (GBDT, GBRT, or GBM) library for Python*, R*, Java*, Scala*, C++, and more.
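The core idea XGBoost implements at scale can be sketched in a few lines: each boosting round fits a weak learner to the residuals of the current ensemble, and predictions are the learning-rate-weighted sum of all rounds. The sketch below is illustrative only and does not use the XGBoost API; all function names are hypothetical, and the "weak learner" is simplified to a one-split decision stump on a single feature.

```python
def fit_stump(xs, residuals):
    """Fit a depth-1 regression tree (stump) minimizing squared error."""
    best = None
    for threshold in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= threshold]
        right = [r for x, r in zip(xs, residuals) if x > threshold]
        if not left or not right:
            continue  # skip splits that leave one side empty
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, threshold, lmean, rmean)
    _, threshold, lmean, rmean = best
    return lambda x: lmean if x <= threshold else rmean

def gradient_boost(xs, ys, n_rounds=20, learning_rate=0.3):
    """Build an additive ensemble of stumps by repeatedly fitting residuals."""
    base = sum(ys) / len(ys)          # round 0: constant prediction
    stumps = []
    preds = [base] * len(xs)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + learning_rate * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + learning_rate * sum(s(x) for s in stumps)

# Toy usage: learn a step function from eight points.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 0, 0, 0, 10, 10, 10, 10]
model = gradient_boost(xs, ys)
```

XGBoost adds regularization, second-order gradients, sparsity handling, and distributed training on top of this basic loop, but the residual-fitting structure is the same.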

This is a Wide & Deep Large Dataset Int8 inference model package optimized with TensorFlow* for bare metal.