I have a dataframe dfmmmIncOther:
dfmmmIncOther = dfmmmIncOther
  .agg(max("time_res"), min("time_res"), avg("time_res"))
  .withColumn("typestat", lit("IQ_SU5"))
  .withColumnRenamed("max(time_res)", "delay max")
  .withColumnRenamed("min(time_res)", "delay min")
  .withColumnRenamed("avg(time_res)", "delay moy")
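For reference, the renames can be folded into the aggregation with alias; a minimal sketch of the same step (assuming the usual import of org.apache.spark.sql.functions._, and the same column names as above):

```scala
import org.apache.spark.sql.functions._

// same aggregation, naming the result columns directly instead of renaming afterwards
val agged = dfmmmIncOther
  .agg(
    max("time_res").alias("delay max"),
    min("time_res").alias("delay min"),
    avg("time_res").alias("delay moy"))
  .withColumn("typestat", lit("IQ_SU5"))
```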
time_res holds a duration in minutes (an integer).
I wrote a function to convert minutes to an hours/minutes string, then wrapped it in a UDF so I can use it afterwards:
// convert a number of minutes to an "Xh:Ym" string
val convertHours: Int => String = (input: Int) => {
  val minutes = input % 60
  val hours = input / 60
  "%sh:%sm".format(hours, minutes)
}
val udfconvertHours = udf(convertHours)
Then I reassigned dfmmmIncOther, applying the UDF to convert the minute columns to hours:
dfmmmIncOther = dfmmmIncOther
  .withColumn("delaymax", udfconvertHours(col("delay max")))
  .withColumn("delaymin", udfconvertHours(col("delay min")))
  .withColumn("delaymoy", udfconvertHours(col("delay moy")))
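One thing worth noting, independent of syntax: avg("time_res") produces a DoubleType column, while convertHours takes an Int, and that mismatch typically fails when the UDF runs on "delay moy". A minimal sketch of a variant that accepts Double instead (convertHoursD and udfconvertHoursD are hypothetical names, not from the code above):

```scala
// hypothetical variant: avg() returns DoubleType, so take a Double
// and round to whole minutes before formatting
val convertHoursD: Double => String = (input: Double) => {
  val total = input.round.toInt // whole minutes
  "%sh:%sm".format(total / 60, total % 60)
}
val udfconvertHoursD = udf(convertHoursD)
// usage: .withColumn("delaymoy", udfconvertHoursD(col("delay moy")))
```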
The Spark interpreter throws a big exception. I think I have a syntax mistake somewhere, but I can't see where exactly.
Any remarks from you would be appreciated.