I am reading from a Delta Table and performing some column selection and filtering using pyspark.
```python
from pyspark.sql import functions as f

columns = ['a', 'b', 'c']
data = spark.read.load(PATH).select(*columns).where(f.col('a').like('%test%'))
```
What I would like to obtain is a SQL query representing the above operation, like:
```sql
select {*columns}
from PATH
where a like '%test%'
```
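To make it concrete, here is a plain-Python sketch of the string I'm hoping to get back (the table name and columns are placeholders), which today I would have to build by hand:

```python
# Hypothetical sketch: assembling the SQL text manually from the same inputs,
# which is what I'd like Spark to generate for me instead.
columns = ['a', 'b', 'c']
table = 'delta_table'  # placeholder for PATH

# Expand the column list and plug in the filter condition.
query = f"select {', '.join(columns)} from {table} where a like '%test%'"
print(query)  # → select a, b, c from delta_table where a like '%test%'
```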
Is there any Spark function that can do this?