
What is Ray?
============

.. include:: ray-overview/basics.rst

Getting Started with Ray
------------------------

Check out :ref:`gentle-intro` to learn more about Ray and its ecosystem of libraries, which enable distributed hyperparameter tuning,
reinforcement learning, distributed training, and more.

Ray provides Python, Java, and *experimental* C++ APIs, and uses tasks (functions) and actors (classes) to let you parallelize your code.

.. tabs::

  .. group-tab:: Python

    .. code-block:: python

      # First, run `pip install ray`.

      import ray
      ray.init()

      @ray.remote
      def f(x):
          return x * x

      futures = [f.remote(i) for i in range(4)]
      print(ray.get(futures))  # [0, 1, 4, 9]

      @ray.remote
      class Counter(object):
          def __init__(self):
              self.n = 0

          def increment(self):
              self.n += 1

          def read(self):
              return self.n

      counters = [Counter.remote() for i in range(4)]
      [c.increment.remote() for c in counters]
      futures = [c.read.remote() for c in counters]
      print(ray.get(futures))  # [1, 1, 1, 1]
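
    As a small extension of the example above (a sketch, not part of the original snippet): object refs returned by ``.remote()`` can be passed directly to other tasks, and Ray resolves them to their values before the downstream task runs, so remote calls compose without fetching intermediate results. The ``add`` task below is a hypothetical helper used only for illustration.

    .. code-block:: python

      # Sketch: composing tasks by passing object refs (assumes `f` from above).
      @ray.remote
      def add(a, b):  # hypothetical helper, not part of the original example
          return a + b

      # f.remote(...) returns object refs; Ray resolves them before `add` runs.
      result_ref = add.remote(f.remote(2), f.remote(3))
      print(ray.get(result_ref))  # 13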

  .. group-tab:: Java

    First, add the `ray-api <https://mvnrepository.com/artifact/io.ray/ray-api>`__ and `ray-runtime <https://mvnrepository.com/artifact/io.ray/ray-runtime>`__ dependencies to your project.

    .. code-block:: java

      import io.ray.api.ActorHandle;
      import io.ray.api.ObjectRef;
      import io.ray.api.Ray;
      import java.util.ArrayList;
      import java.util.List;
      import java.util.stream.Collectors;

      public class RayDemo {

        public static int square(int x) {
          return x * x;
        }

        public static class Counter {

          private int value = 0;

          public void increment() {
            this.value += 1;
          }

          public int read() {
            return this.value;
          }
        }

        public static void main(String[] args) {
          // Initialize the Ray runtime.
          Ray.init();
          {
            List<ObjectRef<Integer>> objectRefList = new ArrayList<>();
            // Invoke the `square` method 4 times remotely as Ray tasks.
            // The tasks will run in parallel in the background.
            for (int i = 0; i < 4; i++) {
              objectRefList.add(Ray.task(RayDemo::square, i).remote());
            }
            // Get the actual results of the tasks with `get`.
            System.out.println(Ray.get(objectRefList)); // [0, 1, 4, 9]
          }
          {
            List<ActorHandle<Counter>> counters = new ArrayList<>();
            // Create 4 actors from the `Counter` class.
            // They will run in remote worker processes.
            for (int i = 0; i < 4; i++) {
              counters.add(Ray.actor(Counter::new).remote());
            }
            // Invoke the `increment` method on each actor.
            // This will send an actor task to each remote actor.
            for (ActorHandle<Counter> counter : counters) {
              counter.task(Counter::increment).remote();
            }
            // Invoke the `read` method on each actor, and print the results.
            List<ObjectRef<Integer>> objectRefList = counters.stream()
                .map(counter -> counter.task(Counter::read).remote())
                .collect(Collectors.toList());
            System.out.println(Ray.get(objectRefList)); // [1, 1, 1, 1]
          }
        }
      }

  .. group-tab:: C++ (EXPERIMENTAL)

    | The C++ Ray API is currently experimental with limited support. You can track its development `here <https://github.com/ray-project/ray/milestone/17>`__ and report issues on GitHub.
    | Run the following commands to get started:

    | - Build Ray from source with *bazel* as shown `here <https://docs.ray.io/en/master/development.html#building-ray-full>`__.
    | - Modify and build `cpp/example/example.cc`:

    .. code-block:: shell

      bazel build //cpp/example:example

    | Option 1: run the example directly with a dynamic library path. It will start a Ray cluster automatically.

    .. code-block:: shell

      ray stop
      ./bazel-bin/cpp/example/example --dynamic-library-path=bazel-bin/cpp/example/example.so

    | Option 2: connect to an existing Ray cluster with a known Redis address (e.g. `127.0.0.1:6379`).

    .. code-block:: shell

      ray stop
      ray start --head --port 6379 --redis-password 5241590000000000 --node-manager-port 62665
      ./bazel-bin/cpp/example/example --dynamic-library-path=bazel-bin/cpp/example/example.so --redis-address=127.0.0.1:6379

    .. literalinclude:: ../../cpp/example/example.cc
      :language: cpp

You can also get started by visiting our `Tutorials <https://github.com/ray-project/tutorial>`_. For the latest wheels (nightlies), see the `installation page <installation.html>`__.

Getting Involved
================

.. include:: ray-overview/involvement.rst

If you're interested in contributing to Ray, visit our page on :ref:`Getting Involved <getting-involved>` to read about the contribution process and see what you can work on!

More Information
================

Here are some talks, papers, and press coverage involving Ray and its libraries. Please raise an issue if any of the links below are broken, or if you'd like to add your own talk!

Blog and Press
--------------

- `Modern Parallel and Distributed Python: A Quick Tutorial on Ray <https://towardsdatascience.com/modern-parallel-and-distributed-python-a-quick-tutorial-on-ray-99f8d70369b8>`_
- `Why Every Python Developer Will Love Ray <https://www.datanami.com/2019/11/05/why-every-python-developer-will-love-ray/>`_
- `Ray: A Distributed System for AI (BAIR) <http://bair.berkeley.edu/blog/2018/01/09/ray/>`_
- `10x Faster Parallel Python Without Python Multiprocessing <https://towardsdatascience.com/10x-faster-parallel-python-without-python-multiprocessing-e5017c93cce1>`_
- `Implementing A Parameter Server in 15 Lines of Python with Ray <https://ray-project.github.io/2018/07/15/parameter-server-in-fifteen-lines.html>`_
- `Ray Distributed AI Framework Curriculum <https://rise.cs.berkeley.edu/blog/ray-intel-curriculum/>`_
- `RayOnSpark: Running Emerging AI Applications on Big Data Clusters with Ray and Analytics Zoo <https://medium.com/riselab/rayonspark-running-emerging-ai-applications-on-big-data-clusters-with-ray-and-analytics-zoo-923e0136ed6a>`_
- `First user tips for Ray <https://rise.cs.berkeley.edu/blog/ray-tips-for-first-time-users/>`_
- [Tune] `Tune: a Python library for fast hyperparameter tuning at any scale <https://towardsdatascience.com/fast-hyperparameter-tuning-at-scale-d428223b081c>`_
- [Tune] `Cutting edge hyperparameter tuning with Ray Tune <https://medium.com/riselab/cutting-edge-hyperparameter-tuning-with-ray-tune-be6c0447afdf>`_
- [RLlib] `New Library Targets High Speed Reinforcement Learning <https://www.datanami.com/2018/02/01/rays-new-library-targets-high-speed-reinforcement-learning/>`_
- [RLlib] `Scaling Multi Agent Reinforcement Learning <http://bair.berkeley.edu/blog/2018/12/12/rllib/>`_
- [RLlib] `Functional RL with Keras and Tensorflow Eager <https://bair.berkeley.edu/blog/2019/10/14/functional-rl/>`_
- [Modin] `How to Speed up Pandas by 4x with one line of code <https://www.kdnuggets.com/2019/11/speed-up-pandas-4x.html>`_
- [Modin] `Quick Tip – Speed up Pandas using Modin <https://pythondata.com/quick-tip-speed-up-pandas-using-modin/>`_
- `Ray Blog`_

.. _`Ray Blog`: https://ray-project.github.io/

Talks (Videos)
--------------

- `Programming at any Scale with Ray | SF Python Meetup Sept 2019 <https://www.youtube.com/watch?v=LfpHyIXBhlE>`_
- `Ray for Reinforcement Learning | Data Council 2019 <https://www.youtube.com/watch?v=Ayc0ca150HI>`_
- `Scaling Interactive Pandas Workflows with Modin <https://www.youtube.com/watch?v=-HjLd_3ahCw>`_
- `Ray: A Distributed Execution Framework for AI | SciPy 2018 <https://www.youtube.com/watch?v=D_oz7E4v-U0>`_
- `Ray: A Cluster Computing Engine for Reinforcement Learning Applications | Spark Summit <https://www.youtube.com/watch?v=xadZRRB_TeI>`_
- `RLlib: Ray Reinforcement Learning Library | RISECamp 2018 <https://www.youtube.com/watch?v=eeRGORQthaQ>`_
- `Enabling Composition in Distributed Reinforcement Learning | Spark Summit 2018 <https://www.youtube.com/watch?v=jAEPqjkjth4>`_
- `Tune: Distributed Hyperparameter Search | RISECamp 2018 <https://www.youtube.com/watch?v=38Yd_dXW51Q>`_

Slides
------

- `Talk given at UC Berkeley DS100 <https://docs.google.com/presentation/d/1sF5T_ePR9R6fAi2R6uxehHzXuieme63O2n_5i9m7mVE/edit?usp=sharing>`_
- `Talk given in October 2019 <https://docs.google.com/presentation/d/13K0JsogYQX3gUCGhmQ1PQ8HILwEDFysnq0cI2b88XbU/edit?usp=sharing>`_
- [Tune] `Talk given at RISECamp 2019 <https://docs.google.com/presentation/d/1v3IldXWrFNMK-vuONlSdEuM82fuGTrNUDuwtfx4axsQ/edit?usp=sharing>`_

Papers
------

- `Ray 1.0 Architecture whitepaper`_ **(new)**
- `Ray Design Patterns`_ **(new)**
- `RLlib paper`_
- `RLlib flow paper`_
- `Tune paper`_

*Older papers:*

- `Ray paper`_
- `Ray HotOS paper`_

.. _`Ray 1.0 Architecture whitepaper`: https://docs.google.com/document/d/1lAy0Owi-vPz2jEqBSaHNQcy2IBSDEHyXNOQZlGuj93c/preview
.. _`Ray Design Patterns`: https://docs.google.com/document/d/167rnnDFIVRhHhK4mznEIemOtj63IOhtIPvSYaPgI4Fg/edit
.. _`Ray paper`: https://arxiv.org/abs/1712.05889
.. _`Ray HotOS paper`: https://arxiv.org/abs/1703.03924
.. _`RLlib paper`: https://arxiv.org/abs/1712.09381
.. _`RLlib flow paper`: https://arxiv.org/abs/2011.12719
.. _`Tune paper`: https://arxiv.org/abs/1807.05118

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Overview of Ray

   ray-overview/index.rst
   ray-libraries.rst
   installation.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Ray Core

   walkthrough.rst
   using-ray.rst
   configure.rst
   ray-dashboard.rst
   Tutorial and Examples <auto_examples/overview.rst>
   package-ref.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Ray Clusters/Autoscaler

   cluster/index.rst
   cluster/quickstart.rst
   cluster/reference.rst
   cluster/cloud.rst
   cluster/deploy.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Ray Serve

   serve/index.rst
   serve/tutorial.rst
   serve/core-apis.rst
   serve/http-servehandle.rst
   serve/deployment.rst
   serve/ml-models.rst
   serve/advanced-traffic.rst
   serve/performance.rst
   serve/architecture.rst
   serve/tutorials/index.rst
   serve/faq.rst
   serve/package-ref.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Ray Tune

   tune/index.rst
   tune/key-concepts.rst
   tune/user-guide.rst
   tune/tutorials/overview.rst
   tune/examples/index.rst
   tune/api_docs/overview.rst
   tune/contrib.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: RLlib

   rllib.rst
   rllib-toc.rst
   rllib-training.rst
   rllib-env.rst
   rllib-models.rst
   rllib-algorithms.rst
   rllib-sample-collection.rst
   rllib-offline.rst
   rllib-concepts.rst
   rllib-examples.rst
   rllib-package-ref.rst
   rllib-dev.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Ray SGD

   raysgd/raysgd.rst
   raysgd/raysgd_pytorch.rst
   raysgd/raysgd_tensorflow.rst
   raysgd/raysgd_dataset.rst
   raysgd/raysgd_ptl.rst
   raysgd/raysgd_tune.rst
   raysgd/raysgd_ref.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Data Processing

   modin/index.rst
   dask-on-ray.rst
   mars-on-ray.rst
   raydp.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: More Libraries

   multiprocessing.rst
   joblib.rst
   iter.rst
   xgboost-ray.rst
   ray-client.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Ray Observability

   ray-metrics.rst
   ray-debugging.rst
   ray-logging.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Contributing

   getting-involved.rst

.. toctree::
   :hidden:
   :maxdepth: -1
   :caption: Development and Ray Internals

   development.rst
   whitepaper.rst
   debugging.rst
   profiling.rst
   fault-tolerance.rst