[R][DOC] update R installation guide (#6457)
thirdwing authored and piiswrong committed May 26, 2017
1 parent 65d3dcb commit 8713d25
1 changed file: docs/get_started/install.md (240 additions, 9 deletions)
@@ -648,8 +648,200 @@

<!-- END - Cloud Python Installation Instructions -->


<!-- START - MacOS R CPU Installation Instructions -->

<div class="macos">
<div class="r">
<div class="cpu">

The CPU version of the MXNet R package can be installed from within R like any other package:

```r
install.packages("drat")
drat::addRepo("dmlc")
install.packages("mxnet")
```
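
To confirm that the package installed correctly, you can load it and print the installed version (a minimal check; the version reported depends on what the DMLC repository currently provides):

```r
library(mxnet)
# Print the version of the mxnet package that was just installed
packageVersion("mxnet")
```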


</div>


<div class="gpu">

Will be available soon.

</div>

</div>
</div>
<!-- END - MacOS R CPU Installation Instructions -->


<div class="linux">
<div class="r">
<div class="cpu">
<br/>

Building *MXNet* from source is a two-step process:
1. Build the *MXNet* core shared library, `libmxnet.so`, from the C++ sources.
2. Build the language-specific bindings.

**Minimum Requirements**
1. [GCC 4.8](https://gcc.gnu.org/gcc-4.8/) or later, to compile C++11.
2. [GNU Make](https://www.gnu.org/software/make/)

<br/>

**Build the MXNet core shared library**

**Step 1** Install build tools and git.
```bash
$ sudo apt-get update
$ sudo apt-get install -y build-essential git
```

**Step 2** Install OpenBLAS.

*MXNet* uses a [BLAS](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms) library for accelerated numerical computations on the CPU. Several BLAS implementations are available, including [OpenBLAS](http://www.openblas.net/), [ATLAS](http://math-atlas.sourceforge.net/), and [MKL](https://software.intel.com/en-us/intel-mkl). In this step we install OpenBLAS; you can choose to install ATLAS or MKL instead.
```bash
$ sudo apt-get install -y libopenblas-dev
```

**Step 3** Install OpenCV.

*MXNet* uses [OpenCV](http://opencv.org/) for efficient image loading and augmentation operations.
```bash
$ sudo apt-get install -y libopencv-dev
```

**Step 4** Download the MXNet source code and build the MXNet core shared library.

```bash
$ git clone --recursive https://github.com/dmlc/mxnet
$ cd mxnet
$ make -j $(nproc) USE_OPENCV=1 USE_BLAS=openblas
```

*Note* - `USE_OPENCV` and `USE_BLAS` are makefile flags that set compilation options to use OpenCV and the BLAS library. You can explore and use more compilation options in `make/config.mk`.

<br/>

**Build and install the MXNet R binding**


```bash
$ make rpkg
$ R CMD INSTALL mxnet_current_r.tar.gz
```
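
As a quick sanity check (a minimal sketch, assuming the build and install completed without errors), you can load the binding and create a small array on the CPU, using the same `mx.nd.ones` and `mx.cpu` calls as the verification example later in this guide, plus `as.array` to copy the result back into R:

```r
library(mxnet)
# Create a 1x2 NDArray of ones on the CPU and copy it back into a base R matrix
as.array(mx.nd.ones(c(1, 2), ctx = mx.cpu()))
```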


</div>

<div class="gpu">

The following installation instructions have been tested on Ubuntu 14.04 and 16.04.


**Prerequisites**

Install the following NVIDIA libraries to set up *MXNet* with GPU support:

1. Install CUDA 8.0 following NVIDIA's [installation guide](http://docs.nvidia.com/cuda/cuda-installation-guide-linux/).
2. Install cuDNN 5 for CUDA 8.0 following NVIDIA's [installation guide](https://developer.nvidia.com/cudnn). You may need to register with NVIDIA to download the cuDNN library.

**Note:** Make sure to add the CUDA install path to `LD_LIBRARY_PATH`.

Example: `export LD_LIBRARY_PATH=/usr/local/cuda/lib64/:$LD_LIBRARY_PATH`

<br/>

Building *MXNet* from source is a two-step process:
1. Build the *MXNet* core shared library, `libmxnet.so`, from the C++ sources.
2. Build the language-specific bindings.

**Minimum Requirements**
1. [GCC 4.8](https://gcc.gnu.org/gcc-4.8/) or later, to compile C++11.
2. [GNU Make](https://www.gnu.org/software/make/)

<br/>

**Build the MXNet core shared library**

**Step 1** Install build tools and git.
```bash
$ sudo apt-get update
$ sudo apt-get install -y build-essential git
```
**Step 2** Install OpenBLAS.

*MXNet* uses a [BLAS](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms) library for accelerated numerical computations. Several BLAS implementations are available, including [OpenBLAS](http://www.openblas.net/), [ATLAS](http://math-atlas.sourceforge.net/), and [MKL](https://software.intel.com/en-us/intel-mkl). In this step we install OpenBLAS; you can choose to install ATLAS or MKL instead.
```bash
$ sudo apt-get install -y libopenblas-dev
```

**Step 3** Install OpenCV.

*MXNet* uses [OpenCV](http://opencv.org/) for efficient image loading and augmentation operations.
```bash
$ sudo apt-get install -y libopencv-dev
```

**Step 4** Download the MXNet source code and build the MXNet core shared library.

```bash
$ git clone --recursive https://github.com/dmlc/mxnet
$ cd mxnet
$ make -j $(nproc) USE_OPENCV=1 USE_BLAS=openblas USE_CUDA=1 USE_CUDA_PATH=/usr/local/cuda USE_CUDNN=1
```

*Note* - `USE_OPENCV`, `USE_BLAS`, `USE_CUDA`, `USE_CUDA_PATH`, and `USE_CUDNN` are makefile flags that set compilation options to use OpenCV, OpenBLAS, CUDA, and cuDNN. You can explore and use more compilation options in `make/config.mk`. Make sure to set `USE_CUDA_PATH` to the right CUDA installation path; in most cases it is */usr/local/cuda*.

<br/>

**Build and install the MXNet R binding**

```bash
$ make rpkg
$ R CMD INSTALL mxnet_current_r.tar.gz
```
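
To confirm that the GPU-enabled binding works (a minimal sketch, assuming at least one CUDA-capable GPU is visible), you can allocate a small array on the first GPU; an error at this step usually points to a CUDA or cuDNN setup problem:

```r
library(mxnet)
# Create a 1x2 NDArray of ones on GPU 0 and copy it back into a base R matrix
as.array(mx.nd.ones(c(1, 2), ctx = mx.gpu(0)))
```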

</div>

</div>
</div>


<!-- START - Windows R CPU Installation Instructions -->

<div class="windows">
<div class="r">
<div class="cpu">

The CPU version of the MXNet R package can be installed from within R like any other package:


```r
install.packages("drat")
drat::addRepo("dmlc")
install.packages("mxnet")
```
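
As on the other platforms, you can verify the installation by loading the package and printing its version (a minimal check):

```r
library(mxnet)
# Print the version of the mxnet package that was just installed
packageVersion("mxnet")
```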


</div>

<div class="gpu">

Will be available soon.

</div>
</div>
</div>

<!-- END - Windows R CPU Installation Instructions -->

<div class="linux">
<div class="scala r julia perl">
<div class="scala julia perl">
<div class="cpu gpu">

Follow the installation instructions [in this guide](./ubuntu_setup.md) to set up MXNet.
@@ -659,7 +851,7 @@
</div>

<div class="macos">
<div class="scala r julia perl">
<div class="scala julia perl">
<div class="cpu gpu">

Follow the installation instructions [in this guide](./osx_setup.md) to set up MXNet.
@@ -669,8 +861,8 @@
</div>

<div class="windows">
<div class="python scala r julia perl">
<div class="cpu gpu">
<div class="python scala julia perl">
<div class="gpu">

Follow the installation instructions [in this guide](./windows_setup.md) to set up MXNet.

@@ -1068,7 +1260,7 @@
Start the python terminal.
```bash
$ python
```
<!-- Example Python code for CPU -->

<div class="cpu">

@@ -1092,7 +1284,7 @@

</div>

<!-- Example Python code for GPU -->

<div class="gpu">

@@ -1111,8 +1303,47 @@

</div>

<!-- Example R code for CPU -->

<div class="linux macos windows">
<div class="r">
<div class="cpu">

Run a short *MXNet* R program to create a 2x3 matrix of ones, multiply each element in the matrix by 2, and then add 1. We expect the output to be a 2x3 matrix with all elements equal to 3.

```r
library(mxnet)
a <- mx.nd.ones(c(2,3), ctx = mx.cpu())
b <- a * 2 + 1
b
```
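
If you prefer to inspect the result as a plain R matrix, you should be able to convert it with `as.array` (an extra step beyond the example above; every entry should be 3):

```r
# Copy the NDArray back into a base R matrix; expect a 2x3 matrix of 3s
as.array(b)
```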

</div>
</div>
</div>

<!-- Example R code for GPU -->

<div class="linux macos windows">
<div class="r">
<div class="gpu">

Run a short *MXNet* R program to create a 2x3 matrix of ones *a* on a *GPU*, multiply each element in the matrix by 2, and then add 1. We expect the output to be a 2x3 matrix with all elements equal to 3. We use *mx.gpu()* to set the *MXNet* context to the GPU.

```r
library(mxnet)
a <- mx.nd.ones(c(2,3), ctx = mx.gpu())
b <- a * 2 + 1
b
```

</div>
</div>
</div>


<div class="linux">
<div class="scala r julia perl">
<div class="scala julia perl">
<div class="cpu gpu">

Will be available soon.
@@ -1122,7 +1353,7 @@
</div>

<div class="macos">
<div class="scala r julia perl">
<div class="scala julia perl">
<div class="cpu gpu">

Will be available soon.
@@ -1132,7 +1363,7 @@
</div>

<div class="windows">
<div class="python scala r julia perl">
<div class="python scala julia perl">
<div class="cpu gpu">

Will be available soon.
