Dataguess
v0.3.0
ONNX

Open Neural Network Exchange

Inputs and outputs are created automatically when the model is loaded. Various models can be uploaded and linked with other components. The inference device can be selected as CPU or GPU (CUDA) for the execution provider.

Load an ONNX model from the file system.

The ONNX model's input must be an array.
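The behavior described above mirrors standard ONNX Runtime usage: the session reads input and output definitions from the model file, and the execution provider list controls whether inference runs on the GPU (CUDA) or the CPU. The sketch below illustrates this with ONNX Runtime directly; the model path and the zero-filled input array are hypothetical placeholders, not part of the Dataguess component.

```python
# Minimal sketch using ONNX Runtime; "model.onnx" is a hypothetical path.
import numpy as np
import onnxruntime as ort

# Load the ONNX model from the file system; prefer the CUDA execution
# provider and fall back to the CPU provider if no GPU is available.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# Inputs and outputs are discovered automatically from the loaded model.
input_meta = session.get_inputs()[0]
output_names = [o.name for o in session.get_outputs()]

# The model input must be an array; build a placeholder NumPy array that
# matches the declared input shape (symbolic dimensions replaced with 1).
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
x = np.zeros(shape, dtype=np.float32)

result = session.run(output_names, {input_meta.name: x})
print(result)
```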