Welcome to TensorPort! TensorPort is a machine learning platform that helps you train and deploy models easily and collaborate with others on projects. TensorPort is designed for machine learning with TensorFlow or Keras (with the TensorFlow backend), so you will need to know at least one of these frameworks to use the platform.
Within TensorPort you will work with Projects and Datasets. A project is a repository of code containing one or more machine learning models, while a dataset is a repository of labeled data; both can be collaboratively added, shared, edited, versioned, and more. You train and test your models through Jobs: executions of a project on one or more datasets using TensorPort’s distributed infrastructure of CPUs and GPUs. The same project can be run against multiple datasets and vice versa, giving you maximum flexibility for experimentation.
You can access TensorPort in three ways: an online graphical interface called Matrix, a command line interface (CLI) outside of your web browser, and the TensorPort API directly. TensorPort exposes a REST API, with a GraphQL version in development. Most tasks are carried out through combined use of Matrix and the CLI. Your actions through any interface are reflected in the others, so you can always switch between them while carrying out a task. For example, you could start a job from the CLI, then pause and analyze it in Matrix.
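To give a feel for what talking to the REST API directly might look like, here is a minimal Python sketch that builds an authenticated request. The base URL, the `/jobs/` endpoint, and the `Token` authorization scheme are illustrative assumptions, not documented values; consult the API reference for the real paths and auth headers.

```python
# Hypothetical sketch of a TensorPort REST API call. The endpoint path,
# host, and authorization scheme below are assumptions for illustration.
import urllib.request

API_ROOT = "https://tensorport.com/api/v1"  # assumed base URL


def build_list_jobs_request(token: str) -> urllib.request.Request:
    """Build (but do not send) an authenticated GET request for jobs."""
    req = urllib.request.Request(f"{API_ROOT}/jobs/")
    # Assumed token-based auth header; the real scheme may differ.
    req.add_header("Authorization", f"Token {token}")
    return req


req = build_list_jobs_request("YOUR_API_TOKEN")
print(req.full_url)      # https://tensorport.com/api/v1/jobs/
print(req.get_method())  # GET
```

Sending the request with `urllib.request.urlopen(req)` would then return the API's JSON response; building the request separately, as above, keeps the example runnable without network access.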
Help and Examples
If you are just getting started with TensorPort, you may want to follow our tutorials, which cover some of the most common tasks on the platform. Soon you will also be able to look at our examples page, which will outline several possible use cases, or browse our explore page, which will list some of the premade models and public datasets available to you on TensorPort.
If you have questions or encounter any bugs, you can contact us at any time through Intercom (the icon in the bottom right of the website or Matrix). You can also use Intercom or email firstname.lastname@example.org with inquiries or feature requests. We’re happy to work with you to ensure we’re providing the tools your team needs.