The command line interface for Gradient - Gradient is an end-to-end MLOps platform

Overview

Gradient CLI

Get started: Create Account | Install CLI | Tutorials | Docs

Resources: Website | Blog | Support | Contact Sales


Gradient is an end-to-end MLOps platform that enables individuals and organizations to quickly develop, train, and deploy Deep Learning models. The Gradient software stack runs on any infrastructure, e.g. AWS, GCP, on-premises, or low-cost Paperspace GPUs. Leverage automatic versioning, distributed training, built-in graphs & metrics, hyperparameter search, GradientCI, 1-click Jupyter Notebooks, our Python SDK, and more.

Key components:

  • Notebooks: 1-click Jupyter Notebooks.
  • Workflows: Train models at scale with composable actions.
  • Inference: Deploy models as API endpoints.

Gradient supports any ML/DL framework (TensorFlow, PyTorch, XGBoost, etc).


See releasenotes.md for details on the current release, as well as release history.


Getting Started

  1. Make sure you have a Paperspace account set up. Go to http://paperspace.com to register and generate an API key.

  2. Use pip, pipenv, or conda to install the gradient package, e.g.:

    pip install -U gradient

    To install or update to a prerelease (Alpha/Beta) version of gradient, use:

    pip install -U --pre gradient

  3. Set your API key by executing the following (a concrete example appears after these steps):

    gradient apiKey

    Note: your API key is cached in ~/.paperspace/config.json

    You can remove your cached API key by executing:

    gradient logout
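
    For example (the key shown is a placeholder; substitute the API key generated in your Paperspace console):

    gradient apiKey XXXXXXXXXXXXXXXXXXX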

Executing tasks on Gradient

The Gradient CLI follows a standard [command] [--options] syntax

For example, to create a new Workflow use:

gradient workflows create [type] [--options]

For a full list of available commands run gradient workflows --help. You can also view more info about Workflows in the docs.
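
As a concrete illustration of that pattern (the notebook ID is a placeholder; this particular command also appears in an issue report further down this page):

gradient notebooks start --id <notebook-id> --machineType Free-P5000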

Contributing

Want to contribute? Contact us at [email protected]

Pre-Release Testing

Have a Paperspace QA tester install your change directly from the branch to test it. They can do it with pip install git+https://github.com/Paperspace/[email protected].

Comments
  • Some parameters in job_client.create() don't work

    I'm following the sample docs at gradient.api_sdk.clients.job_client. I can't get working_directory or job_env to work.

    working_directory: auto set to /paperspace. When I pass in /app, it's ignored & /paperspace is still used (listed under Console > Job > Environment, and on the Job model fetched from job_client.list()). I just caved in and set all my Docker stuff from /app to /paperspace and it worked. Ideally I wouldn't even want to pass /app; it would just be picked up from WORKDIR in the Dockerfile (which is /app).

    job_env: I can't find a workaround (save putting a config.json in /storage). The parameter gets ignored; of note, the docs have a typo:

    job = job_client.create(
        ...
        job_env={
            'CUSTOM_ENV'='Some value that will be set as system environment',
        }
    )
    

    'CUSTOM_ENV'='..' should be 'CUSTOM_ENV': '..'. Not nit-picking; just that it had me second guessing my approach, since using a dict gets ignored ([j.job_env for j in job_client.list()] => [None, None, ..]) so I tried json.dumps(my_dict), still ignored. Any suggestions on getting the job_env passed in?
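
    For reference, here is that snippet with only the dict syntax corrected (the elided arguments are left exactly as in the docs sample):

    job = job_client.create(
        ...
        job_env={
            'CUSTOM_ENV': 'Some value that will be set as system environment',
        }
    )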

    Overall, it seems like some params are respected, and others not; and it's hard to know which is which. Maybe a mismatch on the docs vs Github vs PyPi?

    opened by lefnire 8
  • Unable to install due to numpy==1.19.4

    Hi, I am unable to use gradient on Python 3.7.9 due to a RuntimeError caused by numpy==1.19.4:

    RuntimeError: The current Numpy installation ('c:\users\XXXX\envs\gradient\lib\site-packages\numpy\__init__.py') fails to pass a sanity check due to a bug in the windows runtime. See this issue for more information: https://tinyurl.com/y3dm3h86

    I tried downgrading numpy, but it appears to be incompatible and I got the following error: ModuleNotFoundError: No module named '_curses'
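
    A commonly suggested workaround for this specific numpy 1.19.4 Windows sanity-check failure (an assumption on my part, not something confirmed in this thread) is to pin numpy to the previous release before installing gradient:

    pip install numpy==1.19.3
    pip install -U gradient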

    opened by thompsonalecbgo 7
  • Update Readme.md

    I found two outdated or currently non-working commands while trying to execute my container:

    1. The --project option was ignored, but providing --projectId worked.
    2. The example for the --command option failed with the error message Error: Missing argument "SCRIPT...". Upon providing a script, the originally provided command was ignored. Instead I had to use the --shell option to execute the correct call.
    opened by nziermann 6
  • Keep artifacts for canceled jobs

    I am aware this is probably not the place to open this issue, but it seems like artifacts are not kept when a job is canceled. But why? I just finished a job that ran for 2 days and wanted access to my best model, but the artifacts are not showing up, potentially because I canceled the job. If this is the case, it is very frustrating.

    opened by MichelML 5
  • feat(notebooks): enable basic notebook lifecycle commands PS-13680

    Newly Supported Commands

    • notebooks stop
    • notebooks start
    • notebooks create
    • notebooks fork
    • notebooks artifacts list

    Related tickets

    https://paperspace.atlassian.net/browse/PS-12219 https://paperspace.atlassian.net/browse/PS-11559 https://paperspace.atlassian.net/browse/PS-13681 https://paperspace.atlassian.net/browse/PS-13682 https://paperspace.atlassian.net/browse/PS-13683 https://paperspace.atlassian.net/browse/PS-13684

    This PR follows up on the PR for PS-12219 (https://github.com/Paperspace/PS_API/pull/1526). It enables the createNotebook, startNotebook, artifactsList, and forkNotebook commands with the v2 endpoint. It also adds the ability to create a notebook with vm_type_id or vm_type_label.

    opened by kevin-kabore 4
  • Failing to start notebook

    I'm trying to start an instance of an existing notebook: gradient notebooks start --id [id] --machineType Free-P5000

    and get in return: Failed to create resource: Cluster null not found

    Specifying any cluster ID doesn't change anything and the output remains the same.

    A bug?

    bug 
    opened by macsunmood 3
  • CR2-22 CR2-49 CR2-48 add metrics list to sdk

    Adding list-custom-metrics functionality to the SDK and CLI for jobs, experiments, deployments, and notebooks.

    QA Test Plan:

    1. In the CLI, check that the new list command shows up in the help text: gradient deployments metrics --help
    2. Create a deployment (GUI or CLI, doesn't matter) and wait for it to finish provisioning.
    3. Try the new list command: gradient deployments metrics list --id XYZ; it should return custom metrics (a list of words).
    4. Repeat 1 and 2 for jobs, experiments, and notebooks. These might return null, but that's okay for now; we're just checking that the new command is available. Note anything that returns null below, if any, and I'll solve that in a separate ticket.
    released 
    opened by robghchen 3
  • Fix: Serialize jobenv to envVars PS-15020

    Fixes serialization of job environment

    Test Plan: Create a job and specify the jobEnv parameter. Use the command "env" to see if the job environment is affected.

    released 
    opened by paperspace-philip 3
  • When providing --ignoreFiles comma separated the code returns Attribute error.

    The command below, with the --ignoreFiles field set:

    gradient experiments run single node \
    --name test \
    --projectid 12345 \
    --container paperspace/tensorflow-python \
    --machineType V100 \
    --command 'python main.py' \
    --ignoreFiles "file1,file2,file3,folder1,folder2,folder3"
    

    Returns the following error

    file_paths = self._retrieve_file_paths(workspace_path, ignore_files)
    File "/lib/python3.7/site-packages/gradient/workspace.py", line 59, in _retrieve_file_paths
        exclude += ignored_files.split(',')
    AttributeError: 'list' object has no attribute 'split'
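
    A minimal sketch of one way to make that code tolerate both forms of the option (the helper name is mine; this is not the shipped fix):

    def _normalize_ignored_files(ignored_files):
        """Accept either a comma-separated string or an already-split list of paths."""
        if isinstance(ignored_files, str):
            return ignored_files.split(',')
        return list(ignored_files or [])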
    
    opened by EvDuijnhoven 3
  • Notebook example with SDK concepts

    Example of using the new SDK functionality to create projects and experiments, analyze a model, and create an inference deployment.

    Tested on TensorFlow 2.0 & Python3 container image.

    opened by dte 3
  • Replace faulty chunk counting logic

    PR #384, which allowed multipart uploads, introduced a bug that prevents uploading datasets containing files whose sizes are exact multiples of 500 MB. This became more of an issue with #389, which changed the chunk size to 15 MB, meaning that any file that is a multiple of 15 MB (75 MB in my case) will cause the dataset upload to fail.

    What I believe is happening is that the faulty logic instructs us to read an additional block of data from the filesystem that doesn't exist. In my experience this shows up as the CLI hanging indefinitely, or crashing when run in a Workflow.
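
    A sketch of the kind of fix this points at (the names and the zero-byte behaviour are assumptions, not the actual patch): compute the chunk count with ceiling division, so a file that is an exact multiple of the chunk size does not get an extra, non-existent chunk.

    import math

    CHUNK_SIZE = 15 * 1024 * 1024  # 15 MB, matching the chunk size mentioned above

    def chunk_count(file_size_bytes, chunk_size=CHUNK_SIZE):
        """Chunks needed for a file; an exact multiple of chunk_size must not add an extra, empty chunk."""
        if file_size_bytes == 0:
            return 1  # assumption: an empty file still uploads as a single empty part
        return math.ceil(file_size_bytes / chunk_size)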

    released 
    opened by fmorlock-tt 2
  • Failed to execute request against storage provider when uploading a dataset

    Dear everyone,

    I am trying to upload a new version of my dataset, but I keep getting broken pipe errors. Here is what I am doing:

    gradient datasets versions create --id ...
    gradient datasets files put --id ...:... --source-path "."
    
    Failed to execute request against storage provider: ('Connection aborted.', BrokenPipeError(32, 'Broken pipe'))
    

    Uploading to other cloud providers is working, but not here.

    By the way, is gradient-cli uploading everything again at each new version? Or is it comparing hashes to avoid unnecessary file transfers?

    Thanks, Clément

    opened by clementpoiret 0
Releases: v2.0.6

Owner: Paperspace