
In my PySpark DataFrame I have two columns, price1 and price2. I want to create a new column result based on the formula (price1 - price2) / price1. However, I also want to check that neither price1 nor price2 is null, and that price1 is not 0.
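On plain numbers the intended formula looks like this (the values are illustrative only):

```python
price1, price2 = 10.0, 7.5

# (price1 - price2) / price1 — the relative difference between the two prices.
result = (price1 - price2) / price1
print(result)  # 0.25
```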

How can I correctly create a new column using these conditions?

Now I have this:

df = df.withColumn("result", df["price1"]-df["price2"]/df["price1"])
  • What result do you expect if price1 == 0, or if one of the prices is null? Commented Oct 25, 2017 at 11:07
  • @MaxU: Sorry, I have not specified it. If price1 == 0 or either price is null, then I expect result to be 0. Commented Oct 25, 2017 at 11:09

3 Answers


I think you can do it this way:

df = df.withColumn("result", (df["price1"] - df["price2"]) / df["price1"]).fillna(0)

Note that in Spark SQL (with the default, non-ANSI mode) dividing by zero yields null rather than raising an error, so fillna(0) covers the price1 == 0 case as well as null inputs.



If you can use a udf:

from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType

# Return 0 when either price is null or price1 is 0; otherwise (price1 - price2) / price1.
ratio_udf = F.udf(
    lambda x, y: 0.0 if x is None or y is None or x == 0 else (x - y) / x,
    DoubleType(),
)
df = df.withColumn("result", ratio_udf(df["price1"], df["price2"]))
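The per-row logic such a udf wraps can be checked without a Spark session; safe_ratio below is just an illustrative name for the same guard-then-divide rule:

```python
def safe_ratio(price1, price2):
    """Return (price1 - price2) / price1, or 0.0 when a price is null or price1 is 0."""
    if price1 is None or price2 is None or price1 == 0:
        return 0.0
    return (price1 - price2) / price1

print(safe_ratio(10.0, 5.0))  # 0.5
print(safe_ratio(0.0, 3.0))   # 0.0
print(safe_ratio(None, 3.0))  # 0.0
```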


import org.apache.spark.sql.functions.{col, when}

df = df.withColumn("result",
  when(col("price1").isNull || col("price2").isNull || col("price1") === 0, 0)
    .otherwise((col("price1") - col("price2")) / col("price1")))

This is how it can be done using Scala.

