compat-settings.xml and set it in an environment variable before running the checkoutAndBuild.sh command; a sketch of this is shown below.

You can identify the two builds to compare either by tag (e.g. release-0.7.1) or by commit hash (see the --help option for the full set of arguments). You can obtain the commit hash from the controller URI /version.
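For illustration, a run might look like the sketch below. The environment variable name COMPAT_MAVEN_SETTINGS and the -o/-n/-w flags are assumptions made for this example; check the --help output of checkoutAndBuild.sh for the exact variable and argument names.

```bash
# Sketch only: the variable name and flags below are assumptions; verify them
# against the --help output of checkoutAndBuild.sh.
export COMPAT_MAVEN_SETTINGS=compat-settings.xml   # hypothetical variable name

# Check out and build the two versions to compare (an old release tag and a newer
# commit hash), placing the builds in a fresh working directory.
checkoutAndBuild.sh -o release-0.7.1 -n <newer-commit-hash> -w /tmp/compat-wd

# The commit a running cluster was built from can be read from the controller's
# /version URI (host and port are placeholders for your deployment).
curl http://localhost:9000/version
```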
You can run the compCheck.sh command multiple times against the same build; you just need to provide a new working directory name each time.

If a test fails and you want to investigate, pass the -k option to the compCheck.sh command to keep the cluster (Kafka, Pinot components) running. You can then attempt the operation (e.g. a query) that failed.
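As a sketch, repeated runs and a debugging run might look like the following; the -w (working directory) and -t (test suite directory) flags are assumptions for illustration, while -k is the keep-cluster option described above. Check the script's usage output for the actual argument names.

```bash
# Two runs against the same build, each given its own working directory name
# (-w and -t are assumed flag names; verify against the script's usage output).
compCheck.sh -w /tmp/compat-run-1 -t my-test-suite
compCheck.sh -w /tmp/compat-run-2 -t my-test-suite

# -k keeps Kafka and the Pinot components running after the run, so a failing
# operation (e.g. a query) can be retried by hand against the live cluster.
compCheck.sh -w /tmp/compat-run-3 -t my-test-suite -k
```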
The test suite maintains a generationNumber. Each time data is uploaded, the values written as __GENERATION_NUMBER__ in your input data files (or in the query files) are substituted with a new integer value: __GENERATION_NUMBER__ will be replaced with an integer (each yaml file is one generation, starting with 0).

If your expected query results include this value, construct them by replacing __GENERATION_NUMBER__ with the actual generation number, as shown above. (The harness substitutes the string __GENERATION_NUMBER__, but will simply not find it if your input files do not have the string in them.) Another way is to ensure that the set of queries you provide for each phase also includes results from the previous phases. That will make sure that all previously loaded data are also considered in the results when the queries are issued.
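To illustrate the substitution, the snippet below shows the placeholder in a hypothetical input data file and query file; the file names, table name, and column layout are made up for this example and are not prescribed by the test suite.

```bash
# Hypothetical files; names, table, and columns are made up for illustration.
$ cat data/rows.csv
1001,__GENERATION_NUMBER__,widget-a
1002,__GENERATION_NUMBER__,widget-b

$ cat queries/suite.queries
SELECT COUNT(*) FROM myTable WHERE generation = __GENERATION_NUMBER__

# On the first upload the placeholder becomes 0, on the next 1, and so on, so the
# query (and its expected result) tracks only the rows loaded in that generation.
```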
If ORDER BY is not specified and there are more than 5 results, there is no guarantee that Pinot will return the same five rows every time. In such a case, you can include in your result file all possible values of foo that the filter where x = 7 matches, and indicate this by specifying isSuperset: true. An example of this feature is shown below: