
I want to split a DataFrame into chunks with uneven numbers of rows, using the row index.

The code below:

groups = df.groupby((np.arange(len(df.index))/l[1]).astype(int))

only works for a uniform number of rows per chunk (a runnable reproduction follows the example below).

df

a b c  
1 1 1  
2 2 2  
3 3 3  
4 4 4  
5 5 5  
6 6 6  
7 7 7  
8 8 8  

l = [2, 5, 7]

df1  
1 1 1  
2 2 2  

df2  
3 3 3  
4 4 4  
5 5 5  

df3  
6 6 6  
7 7 7  

df4  
8 8 8
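
For context, here is a minimal runnable sketch of the attempt above (the 8-row frame is an assumption taken from the expected output); it shows why dividing the row position by a single chunk size can only produce fixed-size blocks:

import numpy as np
import pandas as pd

# Assumed 8-row frame matching the expected output above.
df = pd.DataFrame({'a': range(1, 9), 'b': range(1, 9), 'c': range(1, 9)})
l = [2, 5, 7]

# The original attempt: integer-divide the row position by one chunk size (l[1] == 5).
groups = df.groupby((np.arange(len(df.index)) / l[1]).astype(int))
for key, chunk in groups:
    print(key, len(chunk))   # 0 5, then 1 3 -- not the wanted sizes [2, 3, 2, 1]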
  • Have you tried df.loc? Commented Nov 20, 2018 at 11:10
  • Do you want to split randomly or do you have some set of indexes you'd like to split with? Commented Nov 20, 2018 at 11:12
  • Not random; I would like to split based on the array l: the first 2 rows, then the 3rd to the 5th row, and so on. Commented Nov 21, 2018 at 7:37

4 Answers


You could use a list comprehension, after a small modification to your list l first.

print(df)

   a  b  c
0  1  1  1
1  2  2  2
2  3  3  3
3  4  4  4
4  5  5  5
5  6  6  6
6  7  7  7
7  8  8  8


l = [2,5,7]
l_mod = [0] + l + [max(l)+1]

list_of_dfs = [df.iloc[l_mod[n]:l_mod[n+1]] for n in range(len(l_mod)-1)]

Output:

list_of_dfs[0]

   a  b  c
0  1  1  1
1  2  2  2

list_of_dfs[1]

   a  b  c
2  3  3  3
3  4  4  4
4  5  5  5

list_of_dfs[2]

   a  b  c
5  6  6  6
6  7  7  7

list_of_dfs[3]

   a  b  c
7  8  8  8

2 Comments

Correct me if I'm wrong, but I think the modified list should be: l_mod = [0] + l + [len(df)]. In this instance max(l)+1 and len(df) coincide, but if generalised you might lose rows. As a second note, it could be worth passing the list through set to ensure that no duplicate indices exist (like having 0 twice). Great solution btw, you got my upvote :)
@N1h1l1sT Thanks. Yes, I think you are correct about the generalization. You could perhaps also use the original list to filter the dataframe, but I agree with your points here.
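
Building on the comment above, here is a hedged sketch of the generalised version: it bounds the last chunk with len(df) and de-duplicates the cut points. The helper name split_at is hypothetical, not part of the original answer.

def split_at(df, idx):
    # Bounds are 0, each sorted and de-duplicated cut point, then len(df),
    # so a trailing chunk after the last index is kept as well.
    bounds = [0] + sorted(set(idx)) + [len(df)]
    return [df.iloc[lo:hi] for lo, hi in zip(bounds, bounds[1:]) if lo < hi]

list_of_dfs = split_at(df, [2, 5, 7])   # four chunks: rows 0-1, 2-4, 5-6, 7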

I think this is what you need:

import numpy as np
import pandas as pd

df = pd.DataFrame({'a': np.arange(1, 8),
                   'b': np.arange(1, 8),
                   'c': np.arange(1, 8)})
df
    a   b   c
0   1   1   1
1   2   2   2
2   3   3   3
3   4   4   4
4   5   5   5
5   6   6   6
6   7   7   7

last_check = 0
dfs = []
for ind in [2, 5, 7]:
    dfs.append(df.loc[last_check:ind-1])
    last_check = ind

Although a list comprehension is generally more efficient than a for loop, the last_check variable is needed when your list of indices follows no fixed pattern.

dfs[0]

    a   b   c
0   1   1   1
1   2   2   2

dfs[2]

    a   b   c
5   6   6   6
6   7   7   7
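
If you would rather avoid the running last_check variable, the same pairing can be expressed with zip. This is only a sketch; like the loop above, it drops any rows after the last index, and iloc matches loc here only because the index is the default RangeIndex.

l = [2, 5, 7]

# Pairs (0, 2), (2, 5), (5, 7) -> the same three chunks as the loop above.
dfs = [df.iloc[start:stop] for start, stop in zip([0] + l, l)]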



I think this is what you are looking for:

l = [2, 5, 7]
dfs = []
for i, val in enumerate(l):
    if i == 0:
        # first chunk: all rows before the first cut point
        temp = df.iloc[:val]
    else:
        # later chunks: rows between consecutive cut points
        temp = df.iloc[l[i - 1]:val]
    dfs.append(temp)

Output:

   a  b  c
0  1  1  1
1  2  2  2
   a  b  c
2  3  3  3
3  4  4  4
4  5  5  5
   a  b  c
5  6  6  6
6  7  7  7

Another Solution:

l = [2, 5, 7]
t = np.arange(l[-1])
l.reverse()
for val in l:
    t[:val] = val            # label every row before each cut point
temp = pd.DataFrame(t)       # t is now [2, 2, 5, 5, 5, 7, 7]
temp = pd.concat([df, temp], axis=1)
for u, v in temp.groupby(0):
    print(v)

Output:

   a  b  c  0
0  1  1  1  2
1  2  2  2  2
   a  b  c  0
2  3  3  3  5
3  4  4  4  5
4  5  5  5  5
   a  b  c  0
5  6  6  6  7
6  7  7  7  7
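
The label column used in the second solution can also be built without the reversed in-place loop, for example with np.diff and np.repeat. This is a sketch under the assumption that l is sorted and its last value equals len(df):

l = [2, 5, 7]

lengths = np.diff([0] + l)        # chunk sizes: [2, 3, 2]
labels = np.repeat(l, lengths)    # [2, 2, 5, 5, 5, 7, 7]

for _, chunk in df.groupby(labels):
    print(chunk)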



You can create an array to use for indexing via NumPy:

import pandas as pd, numpy as np

df = pd.DataFrame(np.arange(24).reshape((8, 3)), columns=list('abc'))

L = [2, 5, 7]
idx = np.cumsum(np.in1d(np.arange(len(df.index)), L))

for _, chunk in df.groupby(idx):
    print(chunk, '\n')

   a  b  c
0  0  1  2
1  3  4  5 

    a   b   c
2   6   7   8
3   9  10  11
4  12  13  14 

    a   b   c
5  15  16  17
6  18  19  20 

    a   b   c
7  21  22  23 

Instead of defining a new variable for each dataframe, you can use a dictionary:

d = dict(tuple(df.groupby(idx)))

print(d[1])  # print second groupby value

    a   b   c
2   6   7   8
3   9  10  11
4  12  13  14

1 Comment

Avoid groupby if the splitting indices are known a priori.
