# Standalone

As a Java application, Metarank can be run locally either as a JAR file or as a Docker container; you don't need Kubernetes or AWS to start playing with it. Check out the [installation guide](https://docs.metarank.ai/reference/installation) for detailed setup instructions.

## Running modes

Metarank has multiple running modes:

* `import` - import historical clickthroughs to the store
* `train` - train the machine learning model on the imported data
* `serve` - start the ranking inference API
* `standalone` - run the `import`, `train` and `serve` jobs together
* `validate` - run a set of sanity checks on your configuration file and event dataset
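Assuming a local `metarank.jar` and the config and dataset files described below, the individual modes can be chained manually like this (a sketch; file paths are placeholders):

```bash
# Sanity-check the config file and dataset first
java -jar metarank.jar validate --config config.yml --data events.json

# Import historical clickthroughs into the configured store
java -jar metarank.jar import --config config.yml --data events.json

# Train the model defined in the config on the imported data
java -jar metarank.jar train --config config.yml

# Start the ranking inference API
java -jar metarank.jar serve --config config.yml
```

The `standalone` mode runs the middle three steps as a single process.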

Metarank's standalone mode is designed to simplify initial onboarding:

* it's a shortcut to run [`import`, `train` and `serve`](https://docs.metarank.ai/reference/cli) tasks all at once
* with [memory persistence](https://docs.metarank.ai/overview/persistence#memory-persistence) it can process large clickthrough histories almost instantly.

## Why standalone?

Standalone mode is useful for these cases:

* testing Metarank without deployment. With [in-memory persistence](https://docs.metarank.ai/overview/persistence#memory-persistence) it has zero service dependencies and is the easiest way to try it out.
* simple staging deployments on VM/on-prem hardware. With [redis persistence](https://docs.metarank.ai/overview/persistence#redis-persistence) it can handle typical cases with small/medium load.

Standalone mode has the following limitations:

* feedback ingestion and inference throughput are limited by a single node. Please use the [Kubernetes deployment](https://docs.metarank.ai/reference/deployment-overview/kubernetes) for a better experience.
* model training happens inside the inference process and is memory-hungry, which may cause latency spikes and OOMs. To work around this, you can train the machine learning model externally and upload it to the same Redis instance.
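To sketch the external-training workaround, assuming [Redis persistence](https://docs.metarank.ai/overview/persistence#redis-persistence) is configured in `config.yml`, the `train` job can run in a separate JVM against the same Redis instance while `serve` keeps handling traffic:

```bash
# Train in a separate process so the serving JVM stays responsive.
# Assumes config.yml points both processes at the same Redis instance;
# the -Xmx heap size is an example value, not a recommendation.
java -Xmx4g -jar metarank.jar train --config config.yml
```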

## Running Metarank in standalone mode

To run the JAR file, follow the [installation manual for your OS](https://docs.metarank.ai/reference/installation) and then run:

```bash
$ java -jar metarank.jar standalone --data /path/to/events.json --config /path/to/config.yml
```

Another option is to run Metarank in standalone mode from a Docker container:

```bash
$ docker run -v <path to data dir>:/data metarank/metarank:latest standalone --data /data/events.json --config /data/config.yml
```

The following options are used for the Docker container:

* `-v <path to data dir>:/data` to map a host directory with the input files and configuration into the container
* `--data /data/events.json` to pass the path to the [input events file](https://docs.metarank.ai/reference/event-schema) from the mapped volume
* `--config /data/config.yml` to pass the [configuration file](https://docs.metarank.ai/reference/overview)

During the startup process Metarank will:

* import your dataset and compute all historical event statistics useful for machine learning model training
* train the machine learning model you defined in the configuration file
* start the inference API for real-time personalization.
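Once the API is up, ranking requests can be sent to it over HTTP. A hypothetical request sketch — the port, model name and payload fields depend on your configuration and the [event schema](https://docs.metarank.ai/reference/event-schema):

```bash
# Re-rank two items for a user; "default" is a placeholder model name
# and all ids below are made up for illustration.
curl -XPOST http://localhost:8080/rank/default -d '{
  "event": "ranking",
  "id": "id1",
  "timestamp": "1661345221000",
  "user": "user1",
  "session": "session1",
  "items": [ {"id": "item1"}, {"id": "item2"} ]
}'
```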

![import and training process](https://754461178-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FP51TUyWn10Vg5Y0r7pvt%2Fuploads%2Fgit-blob-2a8e6611b978c9da1e216fbac81c99f79ac2b83b%2Ftraining.gif?alt=media)

For a more detailed walkthrough of running Metarank in the playground, check out the [quickstart guide](https://docs.metarank.ai/introduction/quickstart).
