
I have a numpy matrix:

arr = np.array([[2,3], [2,8], [2,3],[4,5]])

I need to create a PySpark DataFrame from arr. I cannot enter the values manually because the length/values of arr change dynamically, so I need to convert arr into a DataFrame.

I tried the following code, without success.

df = sqlContext.createDataFrame(arr, ["A", "B"])

However, I get the following error.

TypeError: Can not infer schema for type: <type 'numpy.ndarray'>
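
The error happens because Spark's schema inference does not handle numpy arrays or numpy scalar types; the data first has to become plain Python objects or a pandas DataFrame. A minimal sketch of the pandas route, assuming pandas is installed and a SparkSession named spark is available (with the older API, sqlContext.createDataFrame(pdf) behaves the same way):

import numpy as np
import pandas as pd

arr = np.array([[2, 3], [2, 8], [2, 3], [4, 5]])

# wrap the ndarray in a pandas DataFrame; createDataFrame accepts pandas
# DataFrames and maps the numpy dtypes to Spark types
pdf = pd.DataFrame(arr, columns=["A", "B"])
df = spark.createDataFrame(pdf)
df.show()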

3 Answers

import numpy as np

# sample data
arr = np.array([[2, 3], [2, 8], [2, 3], [4, 5]])

# distribute the rows, convert the numpy int64 values to plain Python ints
# so Spark can infer the schema, then name the columns
rdd1 = sc.parallelize(arr)
rdd2 = rdd1.map(lambda x: [int(i) for i in x])
df = rdd2.toDF(["A", "B"])
df.show()

Output is:

+---+---+
|  A|  B|
+---+---+
|  2|  3|
|  2|  8|
|  2|  3|
|  4|  5|
+---+---+
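
If you want control over the column types instead of relying on inference, you can pass an explicit schema. A minimal sketch building on rdd2 above, assuming a SparkSession named spark is also available:

from pyspark.sql.types import StructType, StructField, IntegerType

# explicit schema: both columns are 32-bit integers, not nullable
schema = StructType([
    StructField("A", IntegerType(), False),
    StructField("B", IntegerType(), False),
])

df = spark.createDataFrame(rdd2, schema)
df.printSchema()
df.show()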



No need to use the RDD API. Simply:

mat = np.random.random((10, 3))
cols = ["ColA", "ColB", "ColC"]

# tolist() turns the ndarray into nested Python lists of native floats,
# which createDataFrame can infer a schema from
df = spark.createDataFrame(mat.tolist(), cols)
df.show()
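
Since the question says the shape of arr changes dynamically, the column names can also be derived from the array itself. A small sketch under the same assumption of a SparkSession named spark; the Col0, Col1, ... names are just placeholders:

import numpy as np

arr = np.array([[2, 3], [2, 8], [2, 3], [4, 5]])

# generate one column name per column of the array
cols = ["Col{}".format(i) for i in range(arr.shape[1])]
df = spark.createDataFrame(arr.tolist(), cols)
df.show()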


import numpy as np
from pyspark.ml.linalg import Vectors
arr = np.array([[2, 3], [2, 8], [2, 3], [4, 5]])
df = np.concatenate(arr).reshape(1000, -1)
dff = map(lambda x: (int(x[0]), Vectors.dense(x[1:])), df)
mydf = spark.createDataFrame(dff, schema=["label", "features"])
mydf.show(5)

1 Comment

I get ValueError: cannot reshape array of size 8 into shape (1000,newaxis) on line 4
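
The hard-coded 1000 is the problem: this arr only has 8 values, so the reshape fails. A sketch of the same label/features idea with the reshape tied to the array's own length, assuming a SparkSession named spark:

import numpy as np
from pyspark.ml.linalg import Vectors

arr = np.array([[2, 3], [2, 8], [2, 3], [4, 5]])

# reshape using the array's own row count instead of a hard-coded 1000
rows = np.concatenate(arr).reshape(len(arr), -1)

# first column becomes the label, the remaining columns a dense feature vector
data = [(int(r[0]), Vectors.dense(r[1:])) for r in rows]
mydf = spark.createDataFrame(data, schema=["label", "features"])
mydf.show()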
