
.. _tune-main:

Tune: Scalable Hyperparameter Tuning
====================================

.. tip:: We'd love to hear your feedback on using Tune - `get in touch <https://forms.gle/PTRvGLbKRdUfuzQo9>`_!

.. image:: /images/tune.png
    :scale: 30%
    :align: center

Tune is a Python library for experiment execution and hyperparameter tuning at any scale. Core features:

* Launch a multi-node :ref:`distributed hyperparameter sweep <tune-distributed>` in less than 10 lines of code.
* Supports any machine learning framework, :ref:`including PyTorch, XGBoost, MXNet, and Keras <tune-guides>`.
* Automatically manages :ref:`checkpoints <tune-checkpoint>` and logging to :ref:`TensorBoard <tune-logging>`.
* Choose among state-of-the-art algorithms such as :ref:`Population Based Training (PBT) <tune-scheduler-pbt>`, :ref:`BayesOptSearch <bayesopt>`, and :ref:`HyperBand/ASHA <tune-scheduler-hyperband>`.
* Move your models from training to serving on the same infrastructure with `Ray Serve`_.

.. _`Ray Serve`: serve/index.html

**Want to get started?** Head over to the :doc:`Key Concepts page </tune/key-concepts>`.
Quick Start
-----------

To run this example, install the following: ``pip install 'ray[tune]'``.

This example runs a parallel grid search to optimize an example objective function.

.. literalinclude:: ../../../python/ray/tune/tests/example.py
    :language: python
    :start-after: __quick_start_begin__
    :end-before: __quick_start_end__
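
For readers browsing this page without the included file, here is a minimal sketch of what such a parallel grid search looks like with Tune's function-based API. The objective, hyperparameter names, and values below are illustrative and may not match ``example.py`` exactly.

.. code-block:: python

    from ray import tune


    def training_function(config):
        # Any iterative training loop works here; this objective is illustrative.
        alpha, beta = config["alpha"], config["beta"]
        for step in range(10):
            intermediate_score = (0.1 + alpha * step / 100) ** (-1) + beta * 0.1
            # Report the current score back to Tune after each iteration.
            tune.report(mean_loss=intermediate_score)


    analysis = tune.run(
        training_function,
        config={
            "alpha": tune.grid_search([0.001, 0.01, 0.1]),
            "beta": tune.choice([1, 2, 3]),
        })

    print("Best config:", analysis.get_best_config(metric="mean_loss", mode="min"))
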
If TensorBoard is installed, automatically visualize all trial results:

.. code-block:: bash

    tensorboard --logdir ~/ray_results

.. image:: /images/tune-start-tb.png
    :scale: 30%
    :align: center

If using TF2 and TensorBoard, Tune will also automatically generate TensorBoard HParams output:

.. image:: /images/tune-hparams-coord.png
    :scale: 20%
    :align: center
Why choose Tune?
----------------

There are many other hyperparameter optimization libraries out there. If you're new to Tune, you're probably wondering, "what makes Tune different?"

Cutting-edge optimization algorithms
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

As a user, you're probably looking into hyperparameter optimization because you want to quickly increase your model performance.
Tune enables you to leverage a variety of these cutting-edge optimization algorithms, reducing the cost of tuning by :ref:`aggressively terminating bad hyperparameter evaluations <tune-scheduler-hyperband>`, intelligently :ref:`choosing better parameters to evaluate <tune-search-alg>`, or even :ref:`changing the hyperparameters during training <tune-scheduler-pbt>` to optimize hyperparameter schedules.
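
For example, an early-stopping scheduler such as ASHA can be attached to a run to terminate the worst trials early. This is a minimal sketch, assuming a trainable like the ``training_function`` above that reports ``mean_loss`` each iteration:

.. code-block:: python

    from ray import tune
    from ray.tune.schedulers import ASHAScheduler

    # ASHA stops trials whose reported metric lags well behind their peers.
    asha = ASHAScheduler(metric="mean_loss", mode="min", max_t=10, grace_period=1)

    tune.run(
        training_function,
        config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
        num_samples=20,
        scheduler=asha)
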
First-class Developer Productivity
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

A key problem with machine learning frameworks is the need to restructure all of your code to fit the framework.

With Tune, you can optimize your model just by :ref:`adding a few code snippets <tune-tutorial>`.

Further, Tune actually removes boilerplate from your training workflow, automatically :ref:`managing checkpoints <tune-checkpoint>` and :ref:`logging results to tools <tune-logging>` such as MLflow and TensorBoard.
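
As an illustration of how little code that leaves to you, saving and restoring trial state with the function API takes only a few lines. The sketch below assumes the ``tune.checkpoint_dir`` context manager and a trivial text-file checkpoint; it shows one way to do it, not the only one:

.. code-block:: python

    import os

    from ray import tune


    def train_fn(config, checkpoint_dir=None):
        start = 0
        # If Tune restores the trial (e.g. after a pause or preemption),
        # resume from the last saved step instead of starting over.
        if checkpoint_dir:
            with open(os.path.join(checkpoint_dir, "step.txt")) as f:
                start = int(f.read())
        for step in range(start, 100):
            if step % 10 == 0:
                # Tune decides where the checkpoint lives and tracks it for us.
                with tune.checkpoint_dir(step=step) as ckpt_dir:
                    with open(os.path.join(ckpt_dir, "step.txt"), "w") as f:
                        f.write(str(step))
            tune.report(mean_loss=1.0 / (step + 1))
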
Multi-GPU & distributed training out of the box
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Hyperparameter tuning is known to be highly time-consuming, so it is often necessary to parallelize this process. Most other tuning frameworks require you to implement your own multi-process framework or build your own distributed system to speed up hyperparameter tuning.

However, Tune allows you to transparently :ref:`parallelize across multiple GPUs and multiple nodes <tune-parallelism>`. Tune even has seamless :ref:`fault tolerance and cloud support <tune-distributed>`, allowing you to scale up your hyperparameter search by 100x while reducing costs by up to 10x by using cheap preemptible instances.
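
Concretely, parallelism mostly comes down to declaring per-trial resources. A short sketch (the resource numbers are illustrative, and ``training_function`` is the trainable from the Quick Start sketch):

.. code-block:: python

    from ray import tune

    # Each trial reserves 2 CPUs and 1 GPU; Tune runs as many trials in
    # parallel as the available resources allow, on one machine or many.
    tune.run(
        training_function,
        config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
        num_samples=100,
        resources_per_trial={"cpu": 2, "gpu": 1})
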
What if I'm already doing hyperparameter tuning?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

You might already be using an existing hyperparameter tuning tool such as HyperOpt or Bayesian Optimization.

In this situation, Tune actually allows you to power up your existing workflow. Tune's :ref:`Search Algorithms <tune-search-alg>` integrate with a variety of popular hyperparameter tuning libraries (such as Nevergrad or HyperOpt) and allow you to seamlessly scale up your optimization process -- without sacrificing performance.
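
For example, a HyperOpt-backed search can be dropped into an existing ``tune.run`` call. A minimal sketch, assuming ``hyperopt`` is installed and reusing the ``training_function`` trainable from the Quick Start sketch:

.. code-block:: python

    from ray import tune
    from ray.tune.suggest.hyperopt import HyperOptSearch

    # HyperOpt proposes the configurations; Tune schedules and runs the trials.
    hyperopt_search = HyperOptSearch(metric="mean_loss", mode="min")

    tune.run(
        training_function,
        config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
        num_samples=20,
        search_alg=hyperopt_search)
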
Reference Materials
-------------------

Here are some reference materials for Tune:

* :doc:`/tune/user-guide`
* :ref:`Frequently asked questions <tune-faq>`
* `Code <https://github.com/ray-project/ray/tree/master/python/ray/tune>`__: GitHub repository for Tune

Below are some blog posts and talks about Tune:

- [blog] `Tune: a Python library for fast hyperparameter tuning at any scale <https://towardsdatascience.com/fast-hyperparameter-tuning-at-scale-d428223b081c>`_
- [blog] `Cutting edge hyperparameter tuning with Ray Tune <https://medium.com/riselab/cutting-edge-hyperparameter-tuning-with-ray-tune-be6c0447afdf>`_
- [blog] `Simple hyperparameter and architecture search in tensorflow with Ray Tune <http://louiskirsch.com/ai/ray-tune>`_
- [slides] `Talk given at RISECamp 2019 <https://docs.google.com/presentation/d/1v3IldXWrFNMK-vuONlSdEuM82fuGTrNUDuwtfx4axsQ/edit?usp=sharing>`_
- [video] `Talk given at RISECamp 2018 <https://www.youtube.com/watch?v=38Yd_dXW51Q>`_
- [video] `A Guide to Modern Hyperparameter Optimization (PyData LA 2019) <https://www.youtube.com/watch?v=10uz5U3Gy6E>`_ (`slides <https://speakerdeck.com/richardliaw/a-modern-guide-to-hyperparameter-optimization>`_)
Citing Tune
-----------

If Tune helps you in your academic research, you are encouraged to cite `our paper <https://arxiv.org/abs/1807.05118>`__. Here is an example BibTeX entry:

.. code-block:: tex

    @article{liaw2018tune,
        title={Tune: A Research Platform for Distributed Model Selection and Training},
        author={Liaw, Richard and Liang, Eric and Nishihara, Robert
                and Moritz, Philipp and Gonzalez, Joseph E and Stoica, Ion},
        journal={arXiv preprint arXiv:1807.05118},
        year={2018}
    }