Spark-Pinot Connector

Use the Spark-Pinot connector to read data from and write data to Pinot.

Detailed read model documentation is here: Spark Pinot Connector Read Model

The write model is experimental; its documentation is here: Spark Pinot Connector Write Model

Features

  • Query realtime, offline or hybrid tables

  • Distributed, parallel scan

  • Streaming reads using gRPC (optional)

  • SQL support instead of PQL

  • Column and filter push down to optimize performance

  • Overlap between realtime and offline segments is queried exactly once for hybrid tables

  • Schema discovery

    • Dynamic inference

    • Static analysis of case class

  • Supports query options

  • HTTPS/TLS support for secure connections

Quick Start
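
As a starting point, a minimal read sketch. It assumes a locally running Pinot cluster with the quickstart table airlineStats, and the table/tableType/controller/broker option names described in the read model documentation; adjust the endpoints, table name, and table type to your environment.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("spark-pinot-quickstart")
  .master("local[*]")
  .getOrCreate()

// Read a Pinot table as a DataFrame using the connector's "pinot" data source.
val df = spark.read
  .format("pinot")
  .option("table", "airlineStats")        // Pinot table name (placeholder)
  .option("tableType", "hybrid")          // realtime, offline, or hybrid
  .option("controller", "localhost:9000") // controller host:port
  .option("broker", "localhost:8000")     // broker host:port
  .load()

df.show(10)
```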

Security Configuration

You can secure both HTTP and gRPC using a unified switch or explicit flags.

  • Unified: set secureMode=true to enable HTTPS and gRPC TLS together (recommended)

  • Explicit: set useHttps=true for REST and grpc.use-plain-text=false for gRPC

Quick examples
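
A hedged sketch of both approaches expressed as read options (the table name is a placeholder, and a Spark session is assumed to be available, e.g. in spark-shell):

```scala
// Unified switch: one option turns on HTTPS for REST and TLS for gRPC.
val secureDf = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "hybrid")
  .option("secureMode", "true")
  .load()

// Explicit flags: control REST and gRPC security independently.
val explicitDf = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "hybrid")
  .option("useHttps", "true")              // HTTPS for controller/broker REST calls
  .option("grpc.use-plain-text", "false")  // TLS for gRPC server reads
  .load()
```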

HTTPS Configuration

When HTTPS is enabled (either via secureMode=true or useHttps=true), you can configure keystore/truststore as needed:
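
For illustration, a sketch combining HTTPS with keystore/truststore options (paths, passwords, and the table name are placeholders):

```scala
val df = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "offline")
  .option("secureMode", "true")
  .option("keystorePath", "/path/to/client-keystore.jks")  // client certificate (optional)
  .option("keystorePassword", "keystore-password")
  .option("truststorePath", "/path/to/truststore.jks")     // CA certificates to trust
  .option("truststorePassword", "truststore-password")
  .load()
```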

HTTPS Configuration Options

| Option | Description | Required | Default |
| --- | --- | --- | --- |
| secureMode | Unified switch to enable HTTPS and gRPC TLS | No | false |
| useHttps | Enable HTTPS connections (overrides secureMode for REST) | No | false |
| keystorePath | Path to client keystore file (JKS format) | No | None |
| keystorePassword | Password for the keystore | No | None |
| truststorePath | Path to truststore file (JKS format) | No | None |
| truststorePassword | Password for the truststore | No | None |

Note: If no truststore is provided when HTTPS is enabled, the connector will trust all certificates (not recommended for production use).

Authentication Support

The connector supports custom authentication headers for secure access to Pinot clusters:

Authentication Configuration Options

| Option | Description | Required | Default |
| --- | --- | --- | --- |
| authHeader | Custom authentication header name | No | Authorization (when authToken is provided) |
| authToken | Authentication token/value | No | None |

Note: If only authToken is provided without authHeader, the connector will automatically use Authorization: Bearer <token>.
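
A minimal sketch of token-based access (the environment variable and table name are placeholders; with only authToken set, the connector sends Authorization: Bearer <token> as noted above):

```scala
// Read the token from an environment variable instead of hard-coding it (placeholder variable name).
val token = sys.env.getOrElse("PINOT_AUTH_TOKEN", "")

val df = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "hybrid")
  .option("secureMode", "true")
  .option("authToken", token)
  // .option("authHeader", "Authorization")  // optional: override the header name
  .load()
```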

Pinot Proxy Support

The connector supports Pinot Proxy for secure cluster access where the proxy is the only exposed endpoint. When proxy is enabled, all HTTP requests to controllers/brokers and gRPC requests to servers are routed through the proxy.

Proxy Configuration Examples
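
A hedged sketch of a proxied read (host names, ports, and the table name are placeholders; with the proxy enabled, the controller and broker options point at the proxy endpoint):

```scala
val df = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "hybrid")
  .option("secureMode", "true")
  .option("proxy.enabled", "true")                           // route HTTP and gRPC through the proxy
  .option("controller", "pinot-proxy.example.com:443")       // proxy endpoint for controller requests
  .option("broker", "pinot-proxy.example.com:443")           // proxy endpoint for broker requests
  .option("grpc.proxy-uri", "pinot-proxy.example.com:8094")  // proxy endpoint for gRPC server reads
  .load()
```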

Proxy Configuration Options

| Option | Description | Required | Default |
| --- | --- | --- | --- |
| proxy.enabled | Use Pinot Proxy for controller and broker requests | No | false |

Note: When proxy is enabled, the connector adds FORWARD_HOST and FORWARD_PORT headers to route requests to the actual Pinot services.

gRPC Configuration

The connector supports comprehensive gRPC configuration for secure and optimized communication with Pinot servers.

gRPC Configuration Examples
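
A hedged sketch showing the gRPC options together (values, paths, and the table name are placeholders):

```scala
val df = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "realtime")
  .option("grpc.port", "8090")
  .option("grpc.max-inbound-message-size", "134217728")  // 128 MB, assuming the value is given in bytes
  .option("grpc.use-plain-text", "false")                // enable TLS for gRPC
  .option("grpc.tls.keystore-type", "JKS")
  .option("grpc.tls.keystore-path", "/path/to/keystore.jks")
  .option("grpc.tls.keystore-password", "keystore-password")
  .option("grpc.tls.truststore-type", "JKS")
  .option("grpc.tls.truststore-path", "/path/to/truststore.jks")
  .option("grpc.tls.truststore-password", "truststore-password")
  .option("grpc.tls.ssl-provider", "JDK")
  .load()
```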

gRPC Configuration Options

| Option | Description | Required | Default |
| --- | --- | --- | --- |
| grpc.port | Pinot gRPC port | No | 8090 |
| grpc.max-inbound-message-size | Maximum inbound message size (bytes) when initializing the gRPC client | No | 128MB |
| grpc.use-plain-text | Use plain text for gRPC communication (overrides secureMode for gRPC) | No | true |
| grpc.tls.keystore-type | TLS keystore type for gRPC connection | No | JKS |
| grpc.tls.keystore-path | TLS keystore file location for gRPC connection | No | None |
| grpc.tls.keystore-password | TLS keystore password | No | None |
| grpc.tls.truststore-type | TLS truststore type for gRPC connection | No | JKS |
| grpc.tls.truststore-path | TLS truststore file location for gRPC connection | No | None |
| grpc.tls.truststore-password | TLS truststore password | No | None |
| grpc.tls.ssl-provider | SSL provider | No | JDK |
| grpc.proxy-uri | Pinot Rest Proxy gRPC endpoint URI | No | None |

Note: When using gRPC with proxy, the connector automatically adds FORWARD_HOST and FORWARD_PORT metadata headers for proper request routing.

Example run with spark-shell

There are examples under https://github.com/apache/pinot/tree/master/pinot-connectors/pinot-spark-3-connector/examples.

Prerequisites

  • Apache Spark 3.x installed and spark-shell available in your PATH.

  • Set up the PINOT_HOME environment variable:

  • The Pinot Spark 3 Connector shaded JAR built and available at:

  • Example Scala script located at:

Scala Script to read data from Pinot Proxy
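
The actual script ships in the examples directory linked above; a hedged sketch of what such a script can look like (the table name, proxy endpoints, and token source are placeholders):

```scala
// read_pinot_via_proxy.scala -- hypothetical file name; run with spark-shell -i <script>
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("pinot-proxy-read").getOrCreate()

val df = spark.read
  .format("pinot")
  .option("table", "airlineStats")
  .option("tableType", "hybrid")
  .option("secureMode", "true")
  .option("proxy.enabled", "true")
  .option("controller", "pinot-proxy.example.com:443")
  .option("broker", "pinot-proxy.example.com:443")
  .option("grpc.proxy-uri", "pinot-proxy.example.com:8094")
  .option("authToken", sys.env.getOrElse("PINOT_AUTH_TOKEN", ""))  // placeholder env variable
  .load()

df.printSchema()
df.show(10)
```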

Run with spark-shell

Launch the example in spark-shell with the following command:
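
The exact invocation depends on your local paths; the general form, with placeholders for the shaded connector JAR and the script, looks like this:

```bash
# Placeholder paths -- substitute the shaded connector JAR and the example script locations.
spark-shell \
  --jars /path/to/pinot-spark-3-connector-shaded.jar \
  -i /path/to/read_pinot_via_proxy.scala
```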

Sample output

Example run with spark-submit

You can run the examples locally (e.g. from your IDE) in standalone mode by starting a local Pinot cluster. See: https://docs.pinot.apache.org/basics/getting-started/running-pinot-locally

You can also run the tests in cluster mode using the following command:
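
The exact command depends on your example class and cluster; a hedged sketch of the general spark-submit form, with placeholders throughout:

```bash
# Placeholder values -- substitute the example's main class, the shaded connector JAR,
# your application JAR, and your cluster master URL.
spark-submit \
  --class org.example.PinotReadExample \
  --master "spark://your-master:7077" \
  --jars /path/to/pinot-spark-3-connector-shaded.jar \
  /path/to/your-example-application.jar
```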

This example demonstrates how to use the Pinot Spark 3 Connector to read data from a Pinot cluster via a proxy with authentication token support.

Security Best Practices

Production HTTPS Configuration

  • Always use HTTPS in production environments

  • Store certificates in secure locations with appropriate file permissions

  • Use proper certificate validation with valid truststore

  • Rotate certificates regularly

Production Authentication

  • Use service accounts with minimal required permissions

  • Store authentication tokens securely (environment variables, secret management systems)

  • Implement token rotation policies

  • Monitor authentication failures

Production gRPC Configuration

  • Enable TLS for gRPC communication in production

  • Use certificate-based authentication when possible

  • Configure appropriate message size limits based on your data

  • Use connection pooling for high-throughput scenarios

Future Works

  • Add integration tests for read operation

  • Add write support (Pinot segment write logic will be changed in later versions of Pinot)
