# Plugin reference

## Plugin Families Reference

This section is the reference map for Pinot's built-in plugin families. The detailed family pages live under this subtree, keeping the configuration reference compact while the authoring guides stay in the Developer Guide.

### Plugin Families

| Family                          | Use it for                                                        | Page                                                                                      |
| ------------------------------- | ----------------------------------------------------------------- | ----------------------------------------------------------------------------------------- |
| Stream ingestion connectors     | Kafka, Kinesis, and Pulsar consumer factories                     | [Stream Ingestion Connectors](/reference/plugin-reference/stream-ingestion-connectors.md) |
| Stream connector version matrix | Compatibility between broker, connector, and Kafka major versions | [Stream Connector Version Matrix](/reference/plugin-reference/stream-connector-matrix.md) |
| Metrics plugins                 | JMX metric backends and registry fan-out                          | [Metrics Plugins](/reference/plugin-reference/metrics-plugins.md)                         |
| Environment provider            | Cloud metadata discovery for instance placement                   | [Environment Provider](/reference/plugin-reference/environment-provider.md)               |

### What this page covered

* The plugin families that belong in the configuration reference.
* The detailed pages that live under this subtree.
* The areas where version compatibility matters most.

### Next step

Open the plugin-family page for the integration you are changing, then verify the supported versions before updating any deployment settings.

### Related pages

* [Configuration Reference](/reference/configuration-reference.md)
* [Stream Ingestion Connectors](/reference/plugin-reference/stream-ingestion-connectors.md)
* [Metrics Plugins](/reference/plugin-reference/metrics-plugins.md)

***

---
description: >-
  Configuration and usage reference for every plugin family in Apache Pinot.
---

## Plugin Reference

Apache Pinot has a plug-and-play architecture organized into **ten plugin families**. Each family targets a specific extensibility need — from reading data in different formats to exporting metrics to your monitoring stack.

This section covers the **configuration** side of each plugin family: which implementations ship with Pinot, what config keys they accept, and how to enable them. If you want to **write your own plugin**, see the [Plugin Architecture](/develop-and-contribute/plugin-architecture.md) section in the Developer Guide.

### Plugin Families at a Glance

| Plugin Family            | What It Does                                                                           | Config Reference                                                                                                                                                     | Authoring Guide                                                                                                                |
| ------------------------ | -------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------ |
| **Stream Ingestion**     | Consume data from real-time streaming platforms (Kafka, Kinesis, Pulsar)               | [Stream Ingestion Connectors](/reference/plugin-reference/stream-ingestion-connectors.md) · [Version Matrix](/reference/plugin-reference/stream-connector-matrix.md) | [Stream Ingestion Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/write-your-stream.md)               |
| **Input Format**         | Read records from files or streams during ingestion (Avro, JSON, Parquet, ORC, CSV, …) | [Input Formats](/build-with-pinot/ingestion/formats-filesystems/pinot-input-formats.md)                                                                              | [Input Format Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/record-reader.md)                       |
| **Filesystem**           | Store and fetch segments from pluggable storage backends (S3, GCS, HDFS, ADLS)         | [Filesystem Plugins](/build-with-pinot/ingestion/formats-filesystems/file-systems.md)                                                                                | [Filesystem Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/pluggable-storage.md)                     |
| **Batch Ingestion**      | Run data ingestion jobs on different execution frameworks (Standalone, Hadoop, Spark)  | [Batch Ingestion](/build-with-pinot/ingestion/batch-ingestion/batch-ingestion.md)                                                                                    | —                                                                                                                              |
| **Metrics**              | Collect and expose internal JMX metrics via Dropwizard, Yammer, or a compound backend  | [Metrics Plugins](/reference/plugin-reference/metrics-plugins.md)                                                                                                    | [Metrics Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/metrics-plugin.md)                           |
| **Segment Writer**       | Programmatically build Pinot segments without a full batch ingestion job               | —                                                                                                                                                                    | [Segment Writer Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/segment-writer-plugin.md)             |
| **Segment Uploader**     | Upload completed segment tar files to the Pinot cluster                                | —                                                                                                                                                                    | [Segment Uploader Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/segment-uploader-plugin.md)         |
| **Minion Tasks**         | Run background processing tasks on Pinot Minion nodes (merge, purge, compaction, …)    | [Minion](/architecture-and-concepts/components/cluster/minion.md) · [Merge/Rollup Task](/operate-pinot/segment-management/minion-merge-rollup-task.md)               | [Minion Task Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/minion-task-plugin.md)                   |
| **Environment**          | Discover cloud-specific instance metadata for failure-domain–aware placement           | [Environment Provider](/reference/plugin-reference/environment-provider.md)                                                                                          | —                                                                                                                              |
| **Time Series Language** | Support custom time series query languages (M3QL, PromQL)                              | —                                                                                                                                                                    | [Time Series Language Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/time-series-language-plugin.md) |

***

### Stream Ingestion Connectors

Pinot ships connectors for Apache Kafka (3.x and 4.x), Amazon Kinesis, and Apache Pulsar. Each connector supplies a `StreamConsumerFactory` implementation.
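A connector is wired up through the `streamConfigs` block of a real-time table config. A minimal sketch for the Kafka connector follows — the topic name and broker address are placeholders, and the factory class should match your Kafka major version per the version matrix page:

```json
"streamConfigs": {
  "streamType": "kafka",
  "stream.kafka.consumer.type": "lowlevel",
  "stream.kafka.topic.name": "events",
  "stream.kafka.broker.list": "localhost:9092",
  "stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
  "stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
}
```

Swapping the factory and decoder class names (plus the matching `stream.<type>.*` keys) is what selects a different connector.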

{% content-ref url="/pages/wp7ngUjpYKKfkYsgaIma" %}
[Stream Ingestion Connectors](/reference/plugin-reference/stream-ingestion-connectors.md)
{% endcontent-ref %}

{% content-ref url="/pages/wMhWSeQdcXnTpckZiIEQ" %}
[Stream Connector Version Matrix](/reference/plugin-reference/stream-connector-matrix.md)
{% endcontent-ref %}

### Input Format

Input format plugins read data from files or streams during ingestion. Batch ingestion uses `RecordReader` implementations; real-time ingestion uses `StreamMessageDecoder` implementations. Pinot ships with readers for Avro, CSV, JSON, ORC, Parquet, Thrift, Protobuf, Arrow, CLP-Log, and Confluent Schema Registry variants.
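For batch ingestion, the reader is selected in the job spec's `recordReaderSpec`. A sketch using the built-in Parquet reader — verify the class name against the input formats page before use:

```yaml
# Fragment of a batch ingestion job spec: pick the reader for the source files.
recordReaderSpec:
  dataFormat: 'parquet'
  className: 'org.apache.pinot.plugin.inputformat.parquet.ParquetRecordReader'
```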

{% content-ref url="/pages/-M8oyQalmSX4AfVP-\_Fq" %}
[Supported Data Formats](/build-with-pinot/ingestion/formats-filesystems/pinot-input-formats.md)
{% endcontent-ref %}

### Filesystem

Filesystem plugins provide a `PinotFS` storage abstraction so that segments can live on different backends — S3, GCS, HDFS, or ADLS.
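As an illustration, enabling the S3 backend on a controller takes this shape; the region value is a placeholder, and the full key set per component is on the filesystem page:

```properties
# controller.conf fragment: register S3PinotFS for the s3:// scheme.
pinot.controller.storage.factory.class.s3=org.apache.pinot.plugin.filesystem.S3PinotFS
pinot.controller.storage.factory.s3.region=us-west-2
pinot.controller.segment.fetcher.protocols=file,http,s3
pinot.controller.segment.fetcher.s3.class=org.apache.pinot.common.utils.fetcher.PinotFSSegmentFetcher
```

Servers and minions take the analogous `pinot.server.*` and `pinot.minion.*` keys.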

{% content-ref url="/pages/-M8oyo30JfLVfInxdnwH" %}
[File Systems](/build-with-pinot/ingestion/formats-filesystems/file-systems.md)
{% endcontent-ref %}

### Batch Ingestion

Batch ingestion plugins run ingestion jobs on different execution frameworks: Standalone, Hadoop, and Spark 3.
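The framework is chosen in the ingestion job spec's `executionFrameworkSpec`. A minimal standalone sketch — the runner class names for Hadoop and Spark differ, so check the batch ingestion guide for those variants:

```yaml
# Ingestion job spec fragment: run the job on the standalone framework.
executionFrameworkSpec:
  name: 'standalone'
  segmentGenerationJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner'
  segmentTarPushJobRunnerClassName: 'org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner'
jobType: SegmentCreationAndTarPush
```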

{% content-ref url="/pages/yTkz28tBnlQKDSrYYwjt" %}
[Batch Ingestion Guide](/build-with-pinot/ingestion/batch-ingestion/batch-ingestion.md)
{% endcontent-ref %}

### Metrics

Metrics plugins control which metrics library Pinot uses for internal JMX metrics. Pinot ships with Yammer (default), Dropwizard, and a Compound implementation that fans out to multiple registries.
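Switching the backend is a per-component configuration property. A sketch for a broker — treat the exact key name as illustrative and confirm it on the metrics plugins page:

```properties
# broker.conf fragment: swap the default Yammer backend for Dropwizard.
pinot.broker.metrics.factory.className=org.apache.pinot.plugin.metrics.dropwizard.DropwizardMetricsFactory
```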

{% content-ref url="/pages/HidFQDCYr31yJqq1l3hI" %}
[Metrics Plugins](/reference/plugin-reference/metrics-plugins.md)
{% endcontent-ref %}

### Segment Writer

The Segment Writer plugin provides an API for programmatically collecting `GenericRow` records and building Pinot segments without going through a full batch ingestion job. The built-in file-based implementation buffers rows as Avro records on local disk.

{% content-ref url="/pages/8UT3DZSMuMLCk5ljHwLv" %}
[Segment Writer Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/segment-writer-plugin.md)
{% endcontent-ref %}

### Segment Uploader

The Segment Uploader plugin handles uploading completed segment tar files to the Pinot cluster. The default implementation supports all push modes configured via `batchConfigMaps` in the table config.

{% content-ref url="/pages/jxfM2hMiF443YjKNY7p9" %}
[Segment Uploader Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/segment-uploader-plugin.md)
{% endcontent-ref %}

### Minion Tasks

Minion task plugins define background processing tasks that run on Pinot Minion nodes. Built-in tasks include MergeRollup, Purge, RealtimeToOfflineSegments, SegmentGenerationAndPush, UpsertCompaction, UpsertCompactMerge, and RefreshSegment.
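Tasks are enabled per table through the `taskTypeConfigsMap` in the table config. A sketch enabling MergeRollup with illustrative bucket settings — the real knobs are documented on the merge/rollup task page:

```json
"task": {
  "taskTypeConfigsMap": {
    "MergeRollupTask": {
      "1day.mergeType": "concat",
      "1day.bucketTimePeriod": "1d",
      "1day.bufferTimePeriod": "1d"
    }
  }
}
```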

{% content-ref url="/pages/-M1Swkf2kSXi8fRpwfqz" %}
[Minion](/architecture-and-concepts/components/cluster/minion.md)
{% endcontent-ref %}

{% content-ref url="/pages/jL9ol2AhsYHk6nK6xPf9" %}
[Minion Merge Rollup Task](/operate-pinot/segment-management/minion-merge-rollup-task.md)
{% endcontent-ref %}

### Environment Provider

Environment plugins allow Pinot to discover cloud-specific instance metadata at startup for failure-domain–aware data placement. The Azure provider is the only built-in implementation.

{% content-ref url="/pages/27CbbMvdmMnPt4vJstyL" %}
[Environment Provider](/reference/plugin-reference/environment-provider.md)
{% endcontent-ref %}

### Time Series Language

Time series language plugins let Pinot support custom time series query languages like M3QL and PromQL.

{% content-ref url="/pages/BSdITD6CNryOBlCk7Y7D" %}
[Time Series Language Plugin](/develop-and-contribute/plugin-architecture/write-custom-plugins/time-series-language-plugin.md)
{% endcontent-ref %}

***

### Developing Custom Plugins

Plugins implement interfaces from [pinot-spi](https://github.com/apache/pinot/tree/master/pinot-spi/src/main/java/org/apache/pinot/spi). See the developer guide for the full plugin authoring workflow:

{% content-ref url="/pages/-ME3TJrrDVeegp12y1v3" %}
[Write Custom Plugins](/develop-and-contribute/plugin-architecture/write-custom-plugins.md)
{% endcontent-ref %}
