Connect to Kafka
Learn how to connect Kafka to Retool.
You can use the Kafka integration to create a resource and make it available in Retool. Once complete, your users can write queries that interact with Kafka data.
You can use the Kafka integration in Retool Apps and Retool Workflows to produce, consume, or commit messages to a topic.
Requirements
The Kafka integration requirements depend on whether you have a cloud-hosted or self-hosted Retool organization. You may also need to make Kafka configuration changes before creating the resource.
- Cloud-hosted organizations
- Self-hosted organizations
Sufficient user permissions to create resources
All users for Retool organizations on Free or Team plans have global Edit permissions and can add, edit, and remove resources. If your organization manages user permissions for resources, you must be a member of a group with Edit all permissions.
Allow Retool to access the data source
If the data source is behind a firewall or restricts access based on IP address, then you must ensure that your Retool organization can access it. If necessary, configure your data source to allow access from Retool's IP addresses.
3.77.79.248/30
35.90.103.132/30
44.208.168.68/30
3.77.79.249
3.77.79.250
35.90.103.132
35.90.103.133
35.90.103.134
35.90.103.135
44.208.168.68
44.208.168.69
44.208.168.70
44.208.168.71
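For example, if your Kafka brokers run in AWS behind a security group, an ingress rule like the following sketch would allow traffic from one of Retool's address ranges. The security group ID and broker port are placeholders; repeat the rule for each range listed above.

```shell
# Hypothetical security group ID and port; adjust to your cluster.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 9092 \
  --cidr 3.77.79.248/30
```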
Retool is building support for querying firewalled resources without allowlisting Retool's IP addresses. To learn more or be considered for early access, contact cloud-connect@retool.com.
Kafka settings and authentication
You must have sufficient access and familiarity with your Kafka data source so you can provide:
- Required connection settings (e.g., URL and server variables).
- Authentication credentials (e.g., API keys).
In some cases, you may need to make changes to your Kafka configuration, such as generating authentication credentials or allowing access through a firewall. Refer to the configuration and authentication sections to learn more.
Sufficient user permissions to create resources
All users for Retool organizations on Free or Team plans have global Edit permissions and can add, edit, and remove resources. If your organization manages user permissions for resources, you must be a member of a group with Edit all permissions.
Allow your deployment to access the data source
Your self-hosted deployment must have access to the data source. Ensure that any potential firewall rules for either the data source or your deployment instance are updated to allow them to communicate.
Kafka settings and authentication
You must have sufficient access and familiarity with your Kafka data source so you can provide:
- Required connection settings (e.g., URL and server variables).
- Authentication credentials (e.g., API keys).
In some cases, you may need to make changes to your Kafka configuration, such as generating authentication credentials or allowing access through a firewall. Refer to the configuration and authentication sections to learn more.
CA Certificates
If your Kafka implementation's SSL certificates are signed by an internal certificate authority (CA), your Retool deployment cannot connect until you configure it to trust that CA. To do so, set the NODE_EXTRA_CA_CERTS environment variable to the absolute path of your certificate file. The file must include one or more trusted certificates in PEM format. For more information, refer to Configure SSL and custom certificates.
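For example, in a Docker Compose deployment you might mount the CA bundle into the container and point the variable at it. This is a sketch; the service name, file names, and paths are assumptions about your deployment.

```yaml
# docker-compose.override.yml (hypothetical service and paths)
services:
  api:
    environment:
      NODE_EXTRA_CA_CERTS: /retool/certs/internal-ca.pem
    volumes:
      - ./certs/internal-ca.pem:/retool/certs/internal-ca.pem:ro
```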
Configure the resource
Sign in to your Retool organization and navigate to the Resources tab. Click Create new > Resource, then select Kafka.
Configuration
Specify the name, location, and description to use for your Kafka resource. Retool displays the resource name and type in query editors to help users identify them.
Provide the following configuration settings to create the resource. Depending on how your data source is configured, you may also need to provide optional settings for Retool to connect.
- Cloud-hosted organizations
- Self-hosted organizations
Name
The name to use for the resource.
Description
A description of the resource.
Override default outbound Retool region
Retool connects to your data source from the us-west-2 region. Choosing a different outbound region can improve performance through geographic proximity.
Region | Location |
---|---|
us-west-2 | US West (Oregon) |
eu-central-1 | Europe (Frankfurt, Germany) |
Name
The name to use for the resource.
Description
A description of the resource.
Authentication
The Kafka integration supports the following authentication methods. Depending on which authentication method you use, you may need to make changes to your Kafka configuration.
- Cloud-hosted organizations
- Self-hosted organizations
SASL mechanism
The Simple Authentication and Security Layer (SASL) mechanism. Options include PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512. Defaults to SCRAM-SHA-512.
Security Protocol
The security protocol. Options include SASL_SSL, SSL, SASL_PLAINTEXT, and PLAINTEXT. Defaults to SASL_SSL.
Bootstrap Servers
A comma-separated list of the Kafka brokers in the cluster that you want to connect to.
Username and password
Authentication is performed with a username and password. You must be able to obtain and provide these credentials to create the resource.
SASL mechanism
The Simple Authentication and Security Layer (SASL) mechanism. Options include PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512. Defaults to SCRAM-SHA-512.
Security Protocol
The security protocol. Options include SASL_SSL, SSL, SASL_PLAINTEXT, and PLAINTEXT. Defaults to SASL_SSL.
Bootstrap Servers
A comma-separated list of the Kafka brokers in the cluster that you want to connect to.
Username and password
Authentication is performed with a username and password. You must be able to obtain and provide these credentials to create the resource.
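To illustrate what these settings correspond to outside of Retool, the following sketch configures the kafkajs client with equivalent values. It is not part of Retool's implementation; the broker addresses and credentials are placeholders.

```typescript
import { Kafka } from "kafkajs";

// Hypothetical brokers and credentials; substitute your own.
const kafka = new Kafka({
  clientId: "retool-kafka-example",
  // Bootstrap Servers: a comma-separated list in Retool, an array here.
  brokers: ["broker-1.example.com:9092", "broker-2.example.com:9092"],
  // Security Protocol SASL_SSL: TLS plus SASL credentials.
  ssl: true,
  sasl: {
    mechanism: "scram-sha-512", // SASL mechanism (Retool's default)
    username: "kafka-user",
    password: "kafka-password",
  },
});
```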
Test the connection
Click Test Connection to verify that Retool can successfully connect to the data source. If the test fails, check the resource settings and try again.
Testing a connection only checks whether Retool can successfully connect to the resource. It cannot check whether the provided credentials have sufficient privileges or can perform every supported action.
Save the resource
Click Create resource to complete the setup. You can then click either Create app to immediately start building a Retool app or Back to resources to return to the list of resources.
Enable the Java DB Connector
For self-hosted deployments, Retool provides a Java DB connector designed to improve the performance and stability of certain integrations. This connector is required for the Amazon SQS, Databricks, and Kafka integrations.
The instructions for enabling the Java DB connector differ based on whether you use Helm as a package manager:
- With Helm
- Without Helm
Run helm search repo retool/retool to check the currently installed version of Retool's Helm chart. If required, use helm upgrade to upgrade the Helm chart version:

helm upgrade -f values.yaml my-retool retool/retool --version <version>

Next, add the following to values.yaml:
dbconnector:
  java:
    enabled: true
Restart your Retool deployment instance for these changes to take effect.
Set the DISABLE_JAVA_DBCONNECTOR environment variable to false, then restart your Retool instance for these changes to take effect.
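For example, if your deployment manages environment variables in a docker.env file (this depends on your deployment method), the setting might look like:

```shell
DISABLE_JAVA_DBCONNECTOR=false
```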
Query the Kafka resource
When querying the Kafka resource, you can choose from one of three actions:
- Produce messages for a topic
- Consume messages from a topic
- Commit messages in a topic
Consumer groups are automatically deleted after 7 days without use, unless you edit Kafka's offsets.retention.minutes value.
To create a new Consumer Group within Retool, click the button and enter the desired value. After refreshing, the new value appears in the dropdown.
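Retool's query editor exposes these actions as form fields rather than code, but for orientation, the sketch below shows roughly what producing, consuming, and committing look like with the kafkajs client. It is not Retool's implementation; the broker, topic, and consumer group names are placeholders.

```typescript
import { Kafka } from "kafkajs";

async function main() {
  const kafka = new Kafka({
    clientId: "retool-kafka-example",
    brokers: ["broker-1.example.com:9092"],
  });

  // Produce messages for a topic.
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "orders",
    messages: [{ key: "order-42", value: JSON.stringify({ status: "shipped" }) }],
  });
  await producer.disconnect();

  // Consume messages from a topic as part of a consumer group.
  const consumer = kafka.consumer({ groupId: "retool-example-group" });
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: true });
  await consumer.run({
    // kafkajs commits offsets for the group automatically by default,
    // which is what marks messages as processed for that group.
    eachMessage: async ({ topic, partition, message }) => {
      console.log(topic, partition, message.offset, message.value?.toString());
    },
  });
}

main().catch(console.error);
```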
Next steps
Your Kafka resource is now ready to use. Check out related queries and code documentation to learn how to interact with Kafka data.
Queries and code quickstart
Fundamental concepts of queries and code.
Resource query tutorial
Hands-on introduction to querying APIs and databases.
Explore database schemas
Learn how to explore database schemas.
Read SQL data
Learn how to retrieve data with SQL.
Write SQL data
Learn how to write data with SQL.