I have a dictionary defined as follows:
>>> mydict = {0:obj0,5:obj1,4:obj3,7:obj4}
The dictionary has integers as keys. I am trying to convert this dict to a numpy array, so that:
>>> nparray[[4,0]] = [obj3,obj0]
>>> nparray[[7,4]] = [obj4,obj3]
I am aware of numpy structured arrays, but unfortunately it seems that integer indexes must correspond to positions in the array rather than to the keys. Is there a way to change this behavior?
I was considering a way to "trick" the numpy array so that, for example, instead of reading positions [4, 0] it reads the rows corresponding to those keys.
My end goal is to have some sort of custom class that inherits from np.ndarray, if there isn't another alternative.
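One way to sketch that "trick", assuming the keys are small non-negative integers: keep the objects in an object array and build a dense key-to-position lookup table, so a multi-key read is pure fancy indexing. (The `Obj` class and the name `lookup` are illustrative placeholders, not from the original.)

```python
import numpy as np

# Hypothetical stand-ins for obj0, obj1, obj3, obj4.
class Obj:
    def __init__(self, name):
        self.name = name

mydict = {0: Obj("obj0"), 5: Obj("obj1"), 4: Obj("obj3"), 7: Obj("obj4")}

# Object array of values, plus a dense key -> position table.
keys = np.array(list(mydict))
values = np.array(list(mydict.values()), dtype=object)
lookup = np.full(keys.max() + 1, -1)   # -1 marks keys that are absent
lookup[keys] = np.arange(len(keys))

# Index by key instead of by position:
rows = values[lookup[[4, 0]]]          # -> [obj3, obj0]
```

This only pays off when the key range is not much larger than the number of keys; a sparse key space wastes memory on the `lookup` table.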
UPDATE
A bit more background, I originally solved this problem by using the class below, which stores the objects:
class MyArray(dict):
    def __init__(self, *args):
        dict.__init__(self, *args)

    def __getitem__(self, key):
        if not hasattr(key, '__iter__'):
            return dict.__getitem__(self, key)
        return [dict.__getitem__(self, k) for k in key]
This allows multi-key indexing. However, the key array can be very large (1,000,000+ entries), so the for k in key loop can take a long time and/or be expensive. I wanted to use numpy arrays to take advantage of their speed, lower memory usage, etc., and avoid that Python-level loop. Is that still warranted?
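For arbitrary (not necessarily dense) integer keys, one vectorized alternative to the Python loop is to sort the keys once and resolve each batch of queries with np.searchsorted. This is a sketch, not the original code; the helper name getitems is made up, and it assumes every queried key is actually present:

```python
import numpy as np

# Strings stand in for the original objects.
mydict = {0: "obj0", 5: "obj1", 4: "obj3", 7: "obj4"}

# Sort the keys once; each multi-key lookup is then one vectorized call.
keys = np.array(sorted(mydict))
values = np.array([mydict[k] for k in keys], dtype=object)

def getitems(query):
    # Binary-search each query key's position in the sorted key array,
    # then gather the corresponding values with fancy indexing.
    return values[np.searchsorted(keys, query)]

result = getitems([7, 4])   # -> ['obj4', 'obj3']
```

Each lookup is O(m log n) for m query keys over n stored keys, with no per-key Python overhead, which is where the speedup over the for k in key loop comes from.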
Comments:

- Consider pandas (pandas.pydata.org/pandas-docs/stable). In particular, see the pandas DataFrame: pandas.pydata.org/pandas-docs/stable/dsintro.html#dataframe
- array[[0, 4]] is going to return an array, so when you do array[[0, 4]].someattr you're going to get an attribute error. You'll end up doing something like [i.someattr for i in array[[0, 4]]]...