
Real-Life, Working Examples with Code for Using Ray as a Service

  • Basic


    Basic example using Ray to execute tasks in parallel on a Ray cluster (a minimal sketch follows this list).

  • Request GPU


    Request a GPU for a job and submit the job to a remote Ray cluster (see the GPU sketch after this list).

  • Distributed Optimization


    Distributed optimization using SciPy to find the minimum of a complex, high-dimensional objective function (see the optimization sketch below).

  • Distributed Training-TensorFlow


    Distributed training with TensorFlow using its MirroredStrategy (see the MirroredStrategy sketch below).

  • Distributed Training-PyTorch


    PyTorch-based distributed training across multiple workers using randomly generated data (see the PyTorch sketch below).
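
Basic: a minimal sketch of running tasks in parallel with Ray. The square function and the bare ray.init() call are illustrative assumptions, not taken verbatim from the example; on a running cluster you would attach with ray.init(address="auto") instead.

    import ray

    # Start Ray locally; on an existing cluster, ray.init(address="auto")
    # attaches to the running head node instead of starting a local instance.
    ray.init()

    @ray.remote
    def square(x):
        # Each invocation becomes an independent task that Ray can schedule
        # on any worker in the cluster.
        return x * x

    # Launch ten tasks in parallel and block until all results are available.
    futures = [square.remote(i) for i in range(10)]
    print(ray.get(futures))  # [0, 1, 4, ..., 81]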
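
Request GPU: a sketch of reserving one GPU for a Ray task and submitting the script as a job to a remote cluster with the Job Submission API. The dashboard URL and the entrypoint script name train_gpu.py are placeholders, not values from the original example.

    # --- train_gpu.py (runs on the cluster) ---------------------------------
    import ray

    @ray.remote(num_gpus=1)
    def gpu_task():
        # Ray schedules this task on a node with a free GPU and sets
        # CUDA_VISIBLE_DEVICES so the task sees exactly one device.
        return ray.get_gpu_ids()

    if __name__ == "__main__":
        ray.init()
        print("GPU ids assigned:", ray.get(gpu_task.remote()))

    # --- submit.py (runs on your workstation) -------------------------------
    from ray.job_submission import JobSubmissionClient

    client = JobSubmissionClient("http://<head-node-ip>:8265")  # Ray dashboard address (placeholder)
    job_id = client.submit_job(
        entrypoint="python train_gpu.py",
        runtime_env={"working_dir": "./"},  # upload the local directory to the cluster
    )
    print("Submitted job:", job_id)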
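
Distributed Optimization: a sketch of one common pattern, parallel multi-start minimization, with each start point optimized by SciPy on a separate Ray worker. The Rosenbrock function, the 50-dimensional search space, and the eight start points are assumptions standing in for the example's actual objective.

    import numpy as np
    import ray
    from scipy.optimize import minimize, rosen

    ray.init()

    @ray.remote
    def minimize_from(x0):
        # Each worker runs an independent local optimization from its own start.
        result = minimize(rosen, x0, method="L-BFGS-B")
        return result.fun, result.x

    dim = 50  # dimensionality of the objective (assumption)
    rng = np.random.default_rng(0)
    starts = [rng.uniform(-2.0, 2.0, size=dim) for _ in range(8)]

    # Run the eight optimizations in parallel and keep the best minimum found.
    results = ray.get([minimize_from.remote(x0) for x0 in starts])
    best_value, best_x = min(results, key=lambda r: r[0])
    print("Best objective value found:", best_value)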
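
Distributed Training-TensorFlow: a sketch of synchronous data-parallel training with tf.distribute.MirroredStrategy, which replicates the model across the local GPUs (or falls back to CPU). The toy model and the random regression data are assumptions made to keep the block self-contained.

    import numpy as np
    import tensorflow as tf

    # MirroredStrategy mirrors the model's variables across all visible GPUs
    # on one machine and averages gradients across replicas every step.
    strategy = tf.distribute.MirroredStrategy()

    with strategy.scope():
        # Variables created inside the scope are mirrored on every replica.
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(20,)),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")

    # Randomly generated data, purely for illustration.
    x = np.random.rand(1024, 20).astype("float32")
    y = np.random.rand(1024, 1).astype("float32")
    model.fit(x, y, epochs=2, batch_size=64)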
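
Distributed Training-PyTorch: a sketch of multi-worker training on randomly generated data using Ray Train's TorchTrainer (assumes Ray 2.x). The model architecture, the two-worker ScalingConfig, and the epoch count are assumptions for illustration.

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    import ray.train.torch
    from ray.train import ScalingConfig
    from ray.train.torch import TorchTrainer

    def train_loop_per_worker(config):
        # Randomly generated regression data, recreated on each worker.
        x = torch.randn(1024, 20)
        y = torch.randn(1024, 1)
        loader = DataLoader(TensorDataset(x, y), batch_size=64, shuffle=True)
        loader = ray.train.torch.prepare_data_loader(loader)  # adds a DistributedSampler

        model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
        model = ray.train.torch.prepare_model(model)  # wraps in DistributedDataParallel

        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        for _ in range(config.get("epochs", 2)):
            for xb, yb in loader:
                optimizer.zero_grad()
                loss = loss_fn(model(xb), yb)
                loss.backward()
                optimizer.step()

    trainer = TorchTrainer(
        train_loop_per_worker,
        train_loop_config={"epochs": 2},
        scaling_config=ScalingConfig(num_workers=2),  # two workers is an assumption
    )
    trainer.fit()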