
I'm trying to call the like function on a Column, passing another Column as the pattern. Is it possible to use a Column inside the like function?

Sample code:

df['col1'].like(concat('%',df2['col2'], '%'))

Error log:

py4j.Py4JException: Method like([class org.apache.spark.sql.Column]) does not exist
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
    at py4j.Gateway.invoke(Gateway.java:274)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)


1 Answer


You can do it using a SQL expression instead; the Python Column API's like doesn't accept another Column directly (it only takes a string literal, which is why the JVM reports that no matching like method exists). For example:

from pyspark.sql.functions import expr

data = [
    ("aaaa", "aa"),
    ("bbbb", "cc")
]

# assumes an active SparkContext (sc); build a two-column DataFrame
df = sc.parallelize(data).toDF(["value", "pattern"])

# inside a SQL expression, like can reference the pattern column
df = df.withColumn("match", expr("value like concat('%', pattern, '%')"))
df.show()

Outputs this:

+-----+-------+-----+
|value|pattern|match|
+-----+-------+-----+
| aaaa|     aa| true|
| bbbb|     cc|false|
+-----+-------+-----+
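
As an alternative, if all you need is the '%' + pattern + '%' substring match shown above, Column.contains does accept another Column, so the same result can be expressed without a SQL string. A minimal sketch, assuming the same value/pattern DataFrame as above:

from pyspark.sql.functions import col

# substring match equivalent to like('%' || pattern || '%'),
# using Column.contains, which accepts another Column
df = df.withColumn("match_alt", col("value").contains(col("pattern")))
df.show()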
