I have successfully connected to the Twitter API using Python and printed out some data to test it: 4 tweets, with fields such as retweet_count and favorite_count. My goal is to fetch 100 tweets (on a random topic that I chose) and store fields such as retweet_count in a database. My database is set up and connected, and my table and columns are already created, but I am unsure how to insert the data.
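For reference, the table I created looks roughly like this (a sketch from memory; the column names match my real table, but the types here are approximate):

mycursor.execute("""
    CREATE TABLE tweet_data (
        id BIGINT PRIMARY KEY,   -- tweet IDs are too big for a plain INT
        text TEXT,
        created_at VARCHAR(50),  -- I store Twitter's date string as-is
        screen_name VARCHAR(50),
        retweet_count INT,
        favorite_count INT,
        lang VARCHAR(10)
    )
""")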
If I already had the data, and it weren't coming from such a large number of tweets, I could do something like this (an example from my previous work, NOT using the API):

import csv

f = open("Credit_Info.csv")
for row in csv.reader(f):
    x = row[0]
    y = row[1]
    z = row[2]
    sql = "INSERT INTO customers (ID, EDUCATION, SEX) VALUES (%s, %s, %s)"
    val = (x, y, z)
    mycursor.execute(sql, val)
mydb.commit()  # mydb is my connection object; without commit() the inserts aren't saved
f.close()
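I have also seen cursor.executemany() mentioned for inserting many rows in one call, but it has the same problem for me: I would still need to build the whole list of value tuples first. Something like this (the rows list here is made up just to show the shape):

rows = [
    ("1", "graduate school", "female"),
    ("2", "university", "male"),
]  # made-up values just to show the shape
sql = "INSERT INTO customers (ID, EDUCATION, SEX) VALUES (%s, %s, %s)"
mycursor.executemany(sql, rows)
mydb.commit()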
Since I am getting the data from an API, what do I put in val? I can't type out variables for all 100 tweets by hand. Here is what I have so far, minus the parts where I connect to the database, do some imports, and authenticate with Twitter. I am using 4 tweets instead of 100 to test it out. (My best guess at what val should be is after the code and question below.)
n = 4
query = 'bieber'

# grab "count" tweets for "query" using Twitter's built-in search
tweets = twitter.search(q=query, count=n)

# how many tweets were actually fetched
u = len(tweets['statuses'])
print('\n\nTweets retrieved:', u, 'of the total:', n)

for tweet in tweets['statuses']:
    print(tweet)
    print(tweet['id'])
    print(tweet['text'].translate(non_bmp_map))  # non_bmp_map comes from my omitted setup code
    print(tweet['created_at'])
    print(tweet['user']['screen_name'].translate(non_bmp_map))
    print(tweet['retweet_count'])
    print(tweet['favorite_count'])
    print(tweet['lang'])

    # the column names below match my table; they are the same fields printed above
    sql = "INSERT INTO tweet_data (id, text, created_at, screen_name, retweet_count, favorite_count, lang) VALUES (%s, %s, %s, %s, %s, %s, %s)"
    val = ...  # <-- this is the part I don't know how to fill in
    mycursor.execute(sql, val)
For each tweet, I want to parse these values: id, text, created_at, the user's screen_name, retweet_count, favorite_count, and lang, and then insert the parsed values into my database.
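My best guess is that val should be a tuple built from each tweet inside the loop, something like this (untested, so I am not sure it is correct or the best way):

for tweet in tweets['statuses']:
    sql = "INSERT INTO tweet_data (id, text, created_at, screen_name, retweet_count, favorite_count, lang) VALUES (%s, %s, %s, %s, %s, %s, %s)"
    val = (
        tweet['id'],
        tweet['text'],
        tweet['created_at'],
        tweet['user']['screen_name'],
        tweet['retweet_count'],
        tweet['favorite_count'],
        tweet['lang'],
    )
    mycursor.execute(sql, val)
mydb.commit()  # again assuming mydb is my connection object

Is that the right approach, or is there a better way to insert all 100 rows at once?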