How do I install packages that are not available by default when a Spark script is run on an on-demand HDInsight cluster via Azure Data Factory (ADF)?
There is an older question about this, but it was never answered: Custom script action in Azure Data Factory HDInsight Cluster.
Can I do a pip install from inside my PySpark script, or is there some other way?
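Something along these lines is what I had in mind for the in-script install (just a sketch; pymysql is only an example of a package I need, and I suspect this would only install it on the driver node, not on the workers):

```python
import subprocess
import sys

# Try to install the missing package at runtime, before importing it.
# This uses the same Python interpreter that runs the PySpark driver.
subprocess.check_call([sys.executable, "-m", "pip", "install", "pymysql"])

import pymysql  # only importable if the install above succeeded
```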
My PySpark script runs on the on-demand HDInsight cluster via ADF and loads data from a CSV blob into Azure MySQL [this is a proof-of-concept scenario, so I have to stick with HDInsight for now; Databricks is not an option].
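For context, this is roughly what the script does (storage account, container, server, table, and credentials below are all placeholders, and it assumes the MySQL JDBC driver jar is available on the cluster):

```python
from pyspark.sql import SparkSession

# Minimal sketch of the CSV-blob-to-Azure-MySQL load.
spark = SparkSession.builder.appName("csv-to-mysql-poc").getOrCreate()

# Read the CSV from Azure Blob Storage attached to the HDInsight cluster.
df = spark.read.csv(
    "wasbs://mycontainer@mystorageaccount.blob.core.windows.net/input/data.csv",
    header=True,
    inferSchema=True,
)

# Write to Azure Database for MySQL over JDBC.
(df.write
   .format("jdbc")
   .option("url", "jdbc:mysql://myserver.mysql.database.azure.com:3306/mydb?useSSL=true")
   .option("dbtable", "staging_table")
   .option("user", "myadmin@myserver")
   .option("password", "<password>")
   .option("driver", "com.mysql.cj.jdbc.Driver")
   .mode("append")
   .save())
```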