[SUPPORT] --hoodie-conf not overriding value in --props file - deployment with kubernetes operator - org.apache.hudi.utilities.deltastreamer.HoodieMultiTableDeltaStreamer #11085
Labels
hudistreamer
issues related to Hudi streamer (formerly deltastreamer)
priority:critical
production down; pipelines stalled; need help ASAP.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
We are trying to avoid hardcoding secrets in the props.properties file in Kubernetes. The file is mounted from a ConfigMap, and environment variables are not expanded there, even though we can reference env vars from Secrets in the "arguments" section of the SparkApplication deployed through the Spark Operator. However, the --hoodie-conf parameter does not seem to take effect, so the issue persists.
According to the code and docs, --hoodie-conf is supposed to override configurations defined in the properties file passed via the --props argument.
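A minimal sketch of the expected precedence (all key names and values below are hypothetical, for illustration only): when the same key appears both in the --props file and in a --hoodie-conf argument, the command-line value should be the one that takes effect.

```yaml
# Illustration only - hypothetical key/values, not the actual table config.
# props.properties (mounted from the ConfigMap) contains, for example:
#   hoodie.deltastreamer.source.kafka.topic=placeholder_topic
# The argument list then repeats the key via --hoodie-conf; the command-line
# value is expected to override the file value:
arguments:
  - "--props"
  - "file:///table_configs/props.properties"
  - "--hoodie-conf"
  - "hoodie.deltastreamer.source.kafka.topic=real_topic"   # expected effective value
```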
Environment Description
Hudi version : 0.13.1
Spark version : 2.1.3
Running on Docker? (yes/no) : Yes, deployed in Kubernetes
Additional context
In the Spark Operator, this is the part of the SparkApplication where the arguments for the spark-submit job are passed; the idea is to substitute the Kafka user and password via --hoodie-conf.
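A minimal sketch of what that arguments section could look like, assuming hypothetical names (a `kafka-credentials` Secret exposing KAFKA_USER / KAFKA_PASSWORD env vars, and a placeholder broker address); the exact env-var substitution mechanism depends on how the Spark Operator and the container runtime expand them:

```yaml
spec:
  arguments:
    - "--props"
    - "file:///table_configs/props.properties"
    - "--hoodie-conf"
    - "bootstrap.servers=my-kafka:9092"        # hypothetical broker address
    - "--hoodie-conf"
    # credentials come from env vars injected via a Secret, so they never
    # appear in the ConfigMap-mounted props.properties
    - "sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username=\"${KAFKA_USER}\" password=\"${KAFKA_PASSWORD}\";"
  driver:
    envFrom:
      - secretRef:
          name: kafka-credentials              # hypothetical Secret name
```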
Stacktrace
The issue is that this value is not being substituted. I tried it both ways, with the property set to a dummy value in props.properties and with the property omitted entirely, and it does not work in either case.
Here is the spark-submit configuration:
As shown above, the information is correct in the spark-submit invocation, yet the property passed with --hoodie-conf is not taking effect.
The props.properties at file:///table_configs/props.properties is mounted from a ConfigMap into both the Spark driver and executors, like this:
The config map contains:
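The original contents are not reproduced here; the following is a minimal sketch of what such a ConfigMap could look like, assuming hypothetical names and values, with the credential key left out (or set to a dummy value) so that --hoodie-conf is expected to supply it at submit time:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: table-configs                     # hypothetical name
data:
  props.properties: |
    hoodie.datasource.write.recordkey.field=id
    hoodie.datasource.write.partitionpath.field=dt
    hoodie.deltastreamer.source.kafka.topic=my_topic
    bootstrap.servers=my-kafka:9092
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    # sasl.jaas.config is intentionally left out (or given a dummy value);
    # the expectation is that --hoodie-conf overrides/supplies it at submit time
```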