I was wondering if anyone has run into this issue when running Spark and trying to import NumPy. NumPy imports fine in a standard notebook, but when I try to import it in a notebook running Spark, I get the error below. I have the most recent version of NumPy and am running the most recent Anaconda with Python 3.6.
Thanks!
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
in <module>()
----> 1 import numpy

/Users/michaelthomas/anaconda/lib/python3.6/site-packages/numpy/__init__.py in <module>()
    144             return loader(*packages, **options)
    145
--> 146         from . import add_newdocs
    147         __all__ = ['add_newdocs',
    148                    'ModuleDeprecationWarning',

/Users/michaelthomas/anaconda/lib/python3.6/site-packages/numpy/add_newdocs.py in <module>()
     11 from __future__ import division, absolute_import, print_function
     12
---> 13 from numpy.lib import add_newdoc
     14
     15 ###############################################################################

/Users/michaelthomas/anaconda/lib/python3.6/site-packages/numpy/lib/__init__.py in <module>()
      6 from numpy.version import version as __version__
      7
----> 8 from .type_check import *
      9 from .index_tricks import *
     10 from .function_base import *

/Users/michaelthomas/anaconda/lib/python3.6/site-packages/numpy/lib/type_check.py in <module>()
      9            'common_type']
     10
---> 11 import numpy.core.numeric as _nx
     12 from numpy.core.numeric import asarray, asanyarray, array, isnan, \
     13         obj2sctype, zeros

AttributeError: module 'numpy' has no attribute 'core'
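In case it helps narrow things down, this is the kind of check that could be run in both the plain notebook and the Spark notebook to compare environments (just a diagnostic sketch, nothing Spark-specific; a mismatch in sys.executable would mean the two kernels use different Python installs, e.g. PYSPARK_PYTHON pointing somewhere other than the Anaconda interpreter):

import sys, os

print(sys.executable)   # which interpreter this kernel runs
print(sys.path[:5])     # where imports are resolved from first
print(os.getcwd())      # the notebook's working directory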
Do you have a file named numpy.py in your working directory?
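An easy way to check (a minimal sketch; find_spec simply reports what import numpy would load, and the AttributeError above is the classic symptom of a stray local numpy.py shadowing the installed package):

import os, importlib.util

spec = importlib.util.find_spec("numpy")
print(spec.origin)                  # should be .../site-packages/numpy/__init__.py
print(os.path.exists("numpy.py"))   # True means a local file shadows the package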