Deploy Self-hosted Retool with Amazon EC2
Learn how to deploy Self-hosted Retool on Amazon EC2 with Docker Compose.
You can deploy Self-hosted Retool onto Amazon EC2 using Docker Compose. You also have the option to deploy with Retool Workflows.
Requirements
The following requirements apply whether you deploy Self-hosted Retool only or Self-hosted Retool with Workflows.
To deploy Self-hosted Retool on Amazon EC2 using Docker Compose, you need:
- A Retool license key, which you can obtain from my.retool.com or your Retool account manager.
- Familiarity with and installations of Docker Engine and Docker Compose.
Self-hosted deployments also require a Linux VM with the following:
- Ubuntu 22.04 or later.
- x86 architecture.
- `t3.large` instance type.
In addition, Retool recommends you:
- Follow this guide using an administrative, non-root AWS user.
- Manage your service quotas for your Retool deployment's AWS Region as you scale.
Temporal
Temporal is a distributed system used to schedule and run asynchronous tasks. A Self-hosted Retool with Workflows deployment uses a Temporal cluster to execute each workflow across a pool of self-hosted workers that make queries and execute code in your VPC. Temporal manages the queueing, scheduling, and orchestration of workflows to guarantee that each workflow block executes in the correct order of the control flow. It does not store any block results by default.
You can use a Retool-managed cluster on Temporal Cloud, which is recommended for most use cases. You can also use an existing self-managed cluster that is hosted on Temporal Cloud or in your own infrastructure. Alternatively, you can spin up a new self-hosted cluster alongside your Self-hosted Retool instance.
- Retool-managed cluster
- Self-managed cluster
- Local cluster
Recommended
You should use a Retool-managed cluster unless:
- You are on a version earlier than 3.6.14.
- You are not on an Enterprise plan of Retool.
- You have an existing cluster and would prefer to use it.
- You need a cluster for uses other than a Self-hosted Retool deployment.
- You want to manage the cluster directly.
Retool admins can enable Retool-managed Temporal. To get started, navigate to the Retool Workflows page and click Enroll now. Once you update your configuration, return to the page and click Complete setup.
It can take a few minutes to initialize a namespace in Retool-managed Temporal.
Retool-managed Temporal clusters are hosted on Temporal Cloud. Your Self-hosted Retool deployment communicates with the cluster when building, deploying, and executing Retool Workflows. All orchestration data sent to Temporal is fully encrypted using the private encryption key set for your deployment.
If you want to create a new self-managed cluster on Temporal Cloud, sign up first. Once your account is provisioned, you can deploy Self-hosted Retool with Workflows.
Temporal Cloud offers 10+ AWS regions to choose from, 99.99% availability, and a 99.9% guarantee against service errors.
You can use an existing self-managed cluster, hosted on Temporal Cloud or in your own infrastructure, if you:
- Have an existing cluster and would prefer to use another namespace within it.
- Need a cluster for uses other than a Self-hosted Retool deployment.
- Have a multi-instance Retool deployment, where each instance would have its own namespace in a shared Temporal cluster.
- Want to manage the cluster directly.
- Cannot use a Retool-managed cluster.
Self-hosted cluster considerations
Retool recommends using a separate datastore for the Workflows Queue in production. Consider using AWS Aurora Serverless v2 configured with a capacity range of 0.5 to 8 ACUs (Aurora capacity units); 1 ACU provides around 10 QPS. The Workflows Queue is write-heavy (around 100:1 write-to-read operations), and Aurora Serverless can scale to accommodate spikes in traffic without any extra configuration.
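As a reference, here is a hedged AWS CLI sketch of provisioning an Aurora Serverless v2 PostgreSQL cluster with the 0.5 to 8 ACU range described above. The cluster identifiers, credentials, and networking are placeholders; adapt them to your environment and follow your own security practices.

```bash
# Sketch only: create an Aurora Serverless v2 PostgreSQL cluster that scales
# between 0.5 and 8 ACUs. Identifiers and credentials below are placeholders.
aws rds create-db-cluster \
  --db-cluster-identifier retool-workflows-queue \
  --engine aurora-postgresql \
  --master-username retool_admin \
  --master-user-password '<choose-a-strong-password>' \
  --serverless-v2-scaling-configuration MinCapacity=0.5,MaxCapacity=8

# Aurora Serverless v2 capacity is provided by a db.serverless instance in the cluster.
aws rds create-db-instance \
  --db-instance-identifier retool-workflows-queue-1 \
  --db-cluster-identifier retool-workflows-queue \
  --db-instance-class db.serverless \
  --engine aurora-postgresql
```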
Environments
For test environments, Retool recommends using the same database for the Retool Database and Workflows Queue. Without any extra configuration, Retool Workflows can process approximately 5-10 QPS (roughly, 5-10 concurrent blocks executed per second).
Workflows at scale
You can scale Self-hosted Retool with Workflows to perform a high rate of concurrent blocks per second. If your deployment needs to process more than 10 workflows per second, you can use:
- A Retool-managed cluster.
- A self-managed cluster on Temporal Cloud.
- Apache Cassandra as the Temporal datastore.
If you anticipate running workflows at a higher scale, please reach out to us to work through a deployment strategy that is best for your use case.
You can spin up a new cluster alongside your Self-hosted Retool deployment if you:
- Cannot use a Retool-managed cluster.
- Don't have an existing cluster to use.
- Have a multi-instance Retool deployment, but each instance is in its own VPC and requires its own Temporal cluster.
- Want to test a Self-hosted Retool with Workflows deployment with a local cluster first.
Local cluster considerations
Retool recommends using a separate datastore for the Workflows Queue in production. Consider using AWS Aurora Serverless v2 configured with a capacity range of 0.5 to 8 ACUs (Aurora capacity units); 1 ACU provides around 10 QPS. The Workflows Queue is write-heavy (around 100:1 write-to-read operations), and Aurora Serverless can scale to accommodate spikes in traffic without any extra configuration.
Environments
For test environments, Retool recommends using the same database for the Retool Database and Workflows Queue. Without any extra configuration, Retool Workflows can process approximately 5-10 QPS (roughly, 5-10 concurrent blocks executed per second).
Workflows at scale
You can scale Self-hosted Retool with Workflows to perform a high rate of concurrent blocks per second. If your deployment needs to process more than 10 workflows per second, you can use:
- A Retool-managed cluster.
- A self-managed cluster on Temporal Cloud.
- Apache Cassandra as the Temporal datastore.
If you anticipate running workflows at a higher scale, please reach out to us to work through a deployment strategy that is best for your use case.
System architecture
The following diagram shows the resulting system architecture for your deployment.
Video walkthrough
Watch the video to see an example of deploying Retool on Amazon EC2, or get started by following the steps below.
1. Create a Linux EC2 instance
Create a new Linux EC2 instance in the Amazon EC2 console using an Amazon Machine Image (AMI) that meets the minimum requirements. Refer to the Amazon EC2 documentation to learn how to create a new instance.
Authentication
When creating the instance, use SSH public key authentication and provide a username. Generate a new key-pair and specify a name.
Network
Create or use an existing security group with the following inbound rules:
| Port range | Type | Source |
| --- | --- | --- |
| 80 | HTTP | 0.0.0.0/0 and ::/0 |
| 443 | HTTPS | 0.0.0.0/0 and ::/0 |
| 22 | SSH | 0.0.0.0/0 and ::/0 |
| 3000 | Custom TCP | 0.0.0.0/0 and ::/0 |
Self-hosted Retool initially runs on port 3000. Once SSL is configured, this port is no longer required. If you prefer to create these rules from the command line, a sketch follows.
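The following AWS CLI sketch creates a security group with the inbound rules above; the group name, description, and VPC ID are placeholders.

```bash
# Sketch only: create a security group and open the inbound ports listed above.
SG_ID=$(aws ec2 create-security-group \
  --group-name retool-ec2-sg \
  --description "Self-hosted Retool on EC2" \
  --vpc-id <your-vpc-id> \
  --query GroupId --output text)

for PORT in 80 443 22 3000; do
  aws ec2 authorize-security-group-ingress \
    --group-id "$SG_ID" \
    --ip-permissions "IpProtocol=tcp,FromPort=$PORT,ToPort=$PORT,IpRanges=[{CidrIp=0.0.0.0/0}],Ipv6Ranges=[{CidrIpv6=::/0}]"
done
```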
2. Download Self-hosted Retool
You can connect in the AWS console using EC2 Instance Connect, or on the command line using an SSH client, with the SSH key pair you selected in your EC2 dashboard.
ssh -i keypair.pem <username>@<public_ip>
Download or clone the retool-onpremise repository.
- Download with curl
- Clone git repository
curl -L -O https://github.com/tryretool/retool-onpremise/archive/master.zip \
&& unzip master.zip \
&& cd retool-onpremise-master
git clone https://github.com/tryretool/retool-onpremise.git
3. Set up Docker
Run the `install.sh` script to install Docker and create the `docker.env` file.
./install.sh
4. Update environment variables
Configure environment variables in `docker.env`:
- Set `LICENSE_KEY` to your license key.
- Uncomment `COOKIE_INSECURE=true` to use Self-hosted Retool without SSL. Once you configure SSL, set this to `false`. A `docker.env` sketch follows this list.
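For example, the relevant `docker.env` entries might look like the following sketch; the license key is a placeholder.

```
# docker.env (excerpt) -- placeholder values
LICENSE_KEY=<your-retool-license-key>
# Leave uncommented while running without SSL; set to false once SSL is configured.
COOKIE_INSECURE=true
```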
5. Back up encryption key
The install script generates a value for `ENCRYPTION_KEY` and stores it within `docker.env`. This key encrypts secrets for your Retool resources.
Save this key in a secure location outside of Retool.
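One way to capture it, sketched below, is to print the value from `docker.env` and paste it into your secrets manager of choice.

```bash
# Print the generated encryption key so you can copy it to a secure secret store.
grep '^ENCRYPTION_KEY=' docker.env
```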
6. Configure deployment
- Self-hosted Retool only
- Self-hosted Retool with Workflows
Set your Retool release version in `Dockerfile`. We recommend using the most recent tag available in tryretool/backend.
By default, `docker-compose.yml` starts a worker service for use with the Workflows product. To stop this worker from starting up, comment out or delete the following lines from `docker-compose.yml` (an illustrative sketch follows the links):
- https://github.com/tryretool/retool-onpremise/blob/cebbcc3/docker-compose.yml#L28
- https://github.com/tryretool/retool-onpremise/blob/cebbcc3/docker-compose.yml#L35
- https://github.com/tryretool/retool-onpremise/blob/cebbcc3/docker-compose.yml#L95-L119
- https://github.com/tryretool/retool-onpremise/blob/cebbcc3/docker-compose.yml#L165
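As an illustration only, commenting out the worker service might look like the sketch below; the actual service name, keys, and line numbers depend on your version of `docker-compose.yml`, so follow the linked lines above rather than this sketch.

```yaml
# docker-compose.yml (illustrative excerpt only -- your file may differ)
# workflows-worker:
#   build:
#     context: ./
#     dockerfile: Dockerfile
#   env_file: ./docker.env
```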
Retool provides Docker Compose files to deploy different configurations of Self-hosted Retool. You must use Docker Compose to deploy Self-hosted Retool with Workflows.
Prepare Compose file
- Retool-managed cluster
- Self-managed cluster
- Local cluster
Use `docker-compose.yml` as is.
Use `docker-compose.yml`, updating the Workflow-related environment variables for the `api` and `workflows-worker` containers (a sketch follows this list):
- `WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_HOST`
- `WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_PORT`
- `WORKFLOW_TEMPORAL_TLS_ENABLED`
- `WORKFLOW_TEMPORAL_TLS_CRT`
- `WORKFLOW_TEMPORAL_TLS_KEY`
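A minimal sketch of what those entries might look like in the `environment:` section of the `api` and `workflows-worker` services; the host and certificate values are placeholders for your own cluster's details.

```yaml
# Illustrative values only -- substitute your cluster's host, port, and mTLS material.
environment:
  - WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_HOST=<your-temporal-frontend-host>
  - WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_PORT=7233
  - WORKFLOW_TEMPORAL_TLS_ENABLED=true
  - WORKFLOW_TEMPORAL_TLS_CRT=<base64-encoded client certificate>
  - WORKFLOW_TEMPORAL_TLS_KEY=<base64-encoded client key>
```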
Use `docker-compose-workflows-with-temporal.yml` to deploy Self-hosted Retool with Workflows and spin up a local Temporal cluster. A command sketch follows these steps.
- Remove the existing `docker-compose.yml` file.
- Rename `docker-compose-workflows-with-temporal.yml` to `docker-compose.yml`.
- Copy the `dynamicconfig` directory in `retool-onpremise` onto your deployment instance.
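A command sketch for these steps, run from the repository directory on your EC2 instance; adjust paths if your checkout lives elsewhere.

```bash
# Replace the default Compose file with the Workflows + local Temporal variant.
rm docker-compose.yml
mv docker-compose-workflows-with-temporal.yml docker-compose.yml

# If you prepared the files on another machine, copy the dynamicconfig directory
# up to the instance (key name, username, and paths are placeholders):
# scp -i keypair.pem -r dynamicconfig <username>@<public_ip>:~/retool-onpremise-master/
```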
Specify release version
Set your Retool release version in `Dockerfile`. We recommend using the most recent tag available in tryretool/backend. Workflows with a local Temporal cluster requires 2.108.0 or later; Retool-managed Temporal requires 3.6.14 or later. Identify the appropriate release version by viewing the tags on Docker Hub.
Configure Temporal
- Retool-managed cluster
- Self-managed cluster
- Local cluster
Allow your deployment to connect to Temporal
Open up egress to the public internet on ports `443` and `7233` to allow outbound-only connections to Temporal Cloud from your deployment. This allows Workflows workers to pick up work and the Workflows backend to enqueue work on Temporal Cloud.
Your deployment must be able to connect over the public internet because Temporal Cloud does not have a static IP range. If your security group restricts outbound traffic, see the sketch below.
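EC2 security groups allow all outbound traffic by default, so this usually requires no change. If you have restricted egress, a hedged AWS CLI sketch of reopening these ports is shown below; `$SG_ID` is your security group ID.

```bash
# Only needed if the default allow-all egress rule was removed.
for PORT in 443 7233; do
  aws ec2 authorize-security-group-egress \
    --group-id "$SG_ID" \
    --ip-permissions "IpProtocol=tcp,FromPort=$PORT,ToPort=$PORT,IpRanges=[{CidrIp=0.0.0.0/0}]"
done
```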
Follow the steps for configuring either a Temporal Cloud cluster or a self-hosted cluster in your VPC.
Temporal Cloud
Allow your deployment to connect to Temporal
Open up egress to the public internet on ports `443` and `7233` to allow outbound-only connections to Temporal Cloud from your deployment. This allows Workflows workers to pick up work and the Workflows backend to enqueue work on Temporal Cloud.
Your deployment must be able to connect over the public internet because Temporal Cloud does not have a static IP range.
Configure environment variables for Temporal cluster
Set the following environment variables for the `MAIN_BACKEND` and `WORKFLOW_TEMPORAL_WORKER` services in the configuration file.
Temporal Cloud requires security certificates for secure access.
| Variable | Description |
| --- | --- |
| `WORKFLOW_TEMPORAL_CLUSTER_NAMESPACE` | The namespace in your Temporal cluster for each Retool deployment you have (e.g., `retool-prod`). Default is `workflows`. |
| `WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_HOST` | The frontend host of the cluster. |
| `WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_PORT` | The port with which to connect. Default is `7233`. |
| `WORKFLOW_TEMPORAL_TLS_ENABLED` | Whether to enable mTLS. Set to `true`. |
| `WORKFLOW_TEMPORAL_TLS_CRT` | The base64-encoded mTLS certificate. |
| `WORKFLOW_TEMPORAL_TLS_KEY` | The base64-encoded mTLS key. |
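The certificate and key values are the base64-encoded contents of the mTLS client certificate and key issued for your Temporal Cloud namespace. A small sketch, assuming files named `client.pem` and `client.key` (your filenames will differ):

```bash
# Base64-encode the mTLS client certificate and key without line wrapping, then
# paste the output into WORKFLOW_TEMPORAL_TLS_CRT and WORKFLOW_TEMPORAL_TLS_KEY.
base64 -w 0 client.pem
base64 -w 0 client.key
```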
Self-hosted
If you use a PostgreSQL database as a persistence store, the PostgreSQL user must have permissions to `CREATE DATABASE`. If this is not possible, you can manually create the required databases in your PostgreSQL cluster: `temporal` and `temporal_visibility`.
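If you need to create those databases manually, a minimal sketch using `psql` (connection details are placeholders):

```bash
# Create the two databases Temporal requires when its user cannot CREATE DATABASE.
psql "host=<postgres-host> user=<admin-user> dbname=postgres" \
  -c 'CREATE DATABASE temporal;' \
  -c 'CREATE DATABASE temporal_visibility;'
```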
Configure environment variables for Temporal cluster
Set the following environment variables for the `MAIN_BACKEND` and `WORKFLOW_TEMPORAL_WORKER` services, if not already configured.
| Variable | Description |
| --- | --- |
| `WORKFLOW_TEMPORAL_CLUSTER_NAMESPACE` | The namespace in your Temporal cluster for each Retool deployment you have (e.g., `retool-prod`). Default is `workflows`. |
| `WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_HOST` | The frontend host of the cluster. |
| `WORKFLOW_TEMPORAL_CLUSTER_FRONTEND_PORT` | The port with which to connect. Default is `7233`. |
| `WORKFLOW_TEMPORAL_TLS_ENABLED` | (Optional) Whether to enable mTLS. |
| `WORKFLOW_TEMPORAL_TLS_CRT` | (Optional) The base64-encoded mTLS certificate. |
| `WORKFLOW_TEMPORAL_TLS_KEY` | (Optional) The base64-encoded mTLS key. |
Use the default values in the configuration.
7. Start Self-hosted Retool
Run `sudo docker-compose up -d` to start Self-hosted Retool. This can take several minutes as the deployment performs initial setup and starts its services for the first time. You can confirm that the deployment containers are running with `sudo docker-compose ps`.
Once running, Self-hosted Retool is available at `http://<your-ec2-ip-address>/auth/signup`. When you first visit the page, you must create an admin account.
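Putting the start and verification commands together; the final `logs` command is optional and simply tails output while the initial setup completes.

```bash
# Start Self-hosted Retool in the background, then confirm the containers are up.
sudo docker-compose up -d
sudo docker-compose ps

# Optionally follow the logs while the deployment finishes its initial setup.
sudo docker-compose logs -f
```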
Additional steps
On production instances, you should:
- Externalize your PostgreSQL database to a managed service.
- Set up SSL on your deployment.
- Keep up to date with newer versions of Retool.
For ease of use and as a proof of concept, the default `docker-compose` configuration includes a PostgreSQL container and does not set up SSL. This is not suitable for production use cases; you should host the Retool storage database on an external, managed database. Managed databases are more maintainable, scalable, and reliable than containerized PostgreSQL instances. Follow the instructions in the external storage database guide to configure your database.
Setting environment variables is also often necessary to configure SSO, source control, and other self-hosted features. See the environment variable reference for additional configuration options.
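As a rough sketch of what externalizing the storage database involves, the `docker.env` excerpt below points Retool at a managed PostgreSQL instance; the variable names shown are assumptions, so confirm the exact names and any additional settings in the external storage database guide.

```
# docker.env (illustrative excerpt) -- confirm variable names in the external
# storage database guide and substitute your managed database's real values.
POSTGRES_HOST=<your-managed-postgres-host>
POSTGRES_PORT=5432
POSTGRES_DB=<database-name>
POSTGRES_USER=<database-user>
POSTGRES_PASSWORD=<database-password>
```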
Update Retool
Before updating your deployment, create a backup of the instance. AWS provides a method to back up your EC2 instance with an AMI. If you don't perform a full backup, you should at least:
- Create a snapshot of your PostgreSQL database.
- Copy the environment variables in `docker.env` to a secure location outside of Retool. A command sketch for both steps follows this list.
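A minimal sketch of both backup steps, assuming the containerized PostgreSQL from the default configuration; the service, user, and database names are placeholders, and you should use your managed database's snapshot tooling instead if you have externalized storage.

```bash
# Dump the Retool storage database from the Compose-managed PostgreSQL container.
# Take the user and database names from docker.env; the service name "postgres"
# is an assumption -- check docker-compose.yml for yours.
sudo docker-compose exec -T postgres pg_dump -U <postgres-user> -d <database-name> > retool-backup.sql

# Keep a copy of the environment file, including ENCRYPTION_KEY, somewhere secure
# outside the instance.
cp docker.env ~/retool-docker.env.backup
```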
To update your deployment to a newer version of Self-hosted Retool, first update your `Dockerfile` with the newer version number. Self-hosted Retool uses semantic versioning with `MAJOR.MINOR.PATCH` version numbers. For example, to update your deployment to 3.22.1, specify:
FROM tryretool/backend:3.22.1
Next, run the `update_retool.sh` script.
./update_retool.sh
Retool instances temporarily go down while they upgrade. You can check the status of your containers with `sudo docker-compose ps`.