
I have a list of URLs and I want my code to loop through multiple pages of each of these URLs.

import requests
from bs4 import BeautifulSoup

urls = ['https://www.f150forum.com/f118/2019-adding-adaptive-cruise-454662/','https://www.f150forum.com/f118/adaptive-cruise-control-sensor-blockage-446041/']

comments = []

for url in urls:
    with requests.Session() as req:
        for item in range(1):
            response = req.get(url+"index{item}/")
            soup = BeautifulSoup(response.content, "html.parser")
            for item in soup.findAll('div',attrs={"class":"ism-true"}):
                result = [item.get_text(strip=True, separator=" ")]
                comments.append(result)


The above code throws an error. Can you let me know how to loop through multiple pages? The error I am getting is "'NoneType' object has no attribute 'findAll'".

  • Try scrapy.org Commented Jan 13, 2020 at 20:23
  • "The above code through an error" It would help a lot if you showed us the error. Commented Jan 13, 2020 at 20:23
  • I think r.content should be response.content Commented Jan 13, 2020 at 20:25
  • I have updated my question Commented Jan 13, 2020 at 20:32
  • @anonymous13: how many records are you expecting per page? I am getting 10 records per page for the first url and 7 records per page for the second url. Commented Jan 13, 2020 at 20:40

1 Answer


soup can return None. Only continue if soup has a value.

    soup = BeautifulSoup(response.content, "html.parser")
    if soup:  
        for item in soup.findAll('div',attrs={"class":"ism-true"}):
            result = [item.get_text(strip=True, separator=" ")]
            comments.append(result)

Note that response.content is the response body as bytes, while response.text is the decoded string. If matching consistently fails, try the string form.
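A minimal sketch of that bytes/str distinction; the literal below stands in for an actual HTTP response body, so no network call is made:

```python
# The literal here plays the role of a raw HTTP response body.
raw_body = '<div class="ism-true">Hello</div>'.encode("utf-8")

content = raw_body                # what response.content would hold: bytes
text = raw_body.decode("utf-8")   # what response.text would hold: str

print(type(content).__name__)  # bytes
print(type(text).__name__)     # str
```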

It also looks like you want an f-string for the URL, since item is a number:

    for item in range(1):
        response = req.get(f"{url}index{item}/")
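Putting both fixes together, here is a sketch of the corrected loop. The page count of 3 and the index{n}/ paging scheme are assumptions about how this forum paginates threads, so adjust them to the real page counts:

```python
# Sketch combining both fixes above: the f-string URL and reusing one
# session across requests. Assumes pages beyond the first live at
# .../index2/, .../index3/, and so on (an assumption about this forum).
import requests
from bs4 import BeautifulSoup

def page_url(base_url, page):
    """First page is the thread URL itself; later pages append indexN/."""
    return base_url if page == 1 else f"{base_url}index{page}/"

def scrape_comments(urls, pages=3):
    comments = []
    with requests.Session() as session:
        for url in urls:
            for page in range(1, pages + 1):
                response = session.get(page_url(url, page))
                soup = BeautifulSoup(response.content, "html.parser")
                for post in soup.find_all("div", attrs={"class": "ism-true"}):
                    comments.append(post.get_text(strip=True, separator=" "))
    return comments

if __name__ == "__main__":
    urls = [
        "https://www.f150forum.com/f118/2019-adding-adaptive-cruise-454662/",
        "https://www.f150forum.com/f118/adaptive-cruise-control-sensor-blockage-446041/",
    ]
    print(len(scrape_comments(urls)))
```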

1 Comment

Thank you so much, f"{url}index{item}/" worked for me.
