Re: trying to retrieve comments with activated API key

2019-03-08 Thread Drake Gossi
Does this mean I can add on to the loop? That is, to get all 32,000? And is this in JSON format? It has to be, right? Eventually I'd like it to be in CSV, but that's because I assume I have to manipulate it with R later... D On Fri, Mar 8, 2019 at 12:54 PM Chris Angelico wrote:
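The JSON-to-CSV step asked about above can be sketched as follows. This is a minimal sketch assuming the API's JSON parses into a list of flat comment dicts; the field names (`documentId`, `commentText`) are hypothetical stand-ins for whatever the real response contains:

```python
import csv

def comments_to_csv(comments, path):
    """Write a list of flat comment dicts (parsed from API JSON) to a CSV
    file that R can later read with read.csv()."""
    if not comments:
        return
    # Take the union of keys across all records so no field is silently dropped;
    # DictWriter fills missing values with '' by default.
    fieldnames = sorted({key for comment in comments for key in comment})
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(comments)

# Hypothetical records standing in for parsed API output:
sample = [
    {"documentId": "ED-2018-OCR-0064-0001", "commentText": "First comment"},
    {"documentId": "ED-2018-OCR-0064-0002", "commentText": "Second comment"},
]
comments_to_csv(sample, "comments.csv")
```

Accumulating all pages into one list before writing keeps the CSV header consistent across the whole docket.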

Re: trying to retrieve comments with activated API key

2019-03-08 Thread Drake Gossi
popped up in what I assume is the environment--I'm running this in Spyder--so I assume it didn't work. Drake On Fri, Mar 8, 2019 at 12:29 PM Chris Angelico wrote: > On Sat, Mar 9, 2019 at 7:26 AM Drake Gossi wrote: > > Yeah, it looks like the request URL is:

trying to retrieve comments with activated API key

2019-03-08 Thread Drake Gossi
Hi everyone, I'm further along than I was last time. I've installed Python and am running this in Spyder. This is the code I'm working with:

import requests
import csv
import time
import sys

api_key = 'my api key'
docket_id = 'ED-2018-OCR-0064'
total_docs = 32068
docs_per_page = 1000

Where the "
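Given those variables, the retrieval loop just steps an offset parameter until all 32,068 documents are covered. The sketch below assumes the regulations.gov v3 endpoint and its `dktid`/`rpp`/`po` parameter names as they were documented at the time; verify against the current API docs before relying on them:

```python
import requests  # third-party: pip install requests

api_key = "my api key"  # replace with your activated key
docket_id = "ED-2018-OCR-0064"
total_docs = 32068
docs_per_page = 1000

def page_offsets(total, per_page):
    """Start offsets of each page: 0, 1000, 2000, ... covering `total` docs."""
    return list(range(0, total, per_page))

def fetch_page(offset):
    """Fetch one page of documents. Endpoint and parameter names are
    assumptions based on the v3 API; check the official docs."""
    resp = requests.get(
        "https://api.data.gov/regulations/v3/documents.json",
        params={
            "api_key": api_key,
            "dktid": docket_id,    # restrict results to this docket
            "rpp": docs_per_page,  # results per page
            "po": offset,          # page offset into the result set
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

offsets = page_offsets(total_docs, docs_per_page)
# for offset in offsets:
#     data = fetch_page(offset)  # network call -- not run here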

trying to begin a code for web scraping

2019-02-18 Thread Drake Gossi
Hi everyone, I'm trying to write code to scrape this website (regulations.gov) for its comments, but I'm having trouble figuring out what to latch onto in the Inspect view (as in when I right-click and choose Inspect). Although I
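Once Inspect reveals a class or tag that wraps each comment, pulling the text out is a CSS-selector call. A minimal sketch with BeautifulSoup on an inline HTML sample; the `comment`/`comment-text` class names are hypothetical, and note that pages which load their comments with JavaScript won't show them in the raw HTML that `requests` fetches, which is one reason the API route is more reliable for regulations.gov:

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# Stand-in for page HTML fetched with requests.get(url).text;
# the class names here are made up for illustration.
html = """
<div class="comment-list">
  <div class="comment"><span class="comment-text">Great rule.</span></div>
  <div class="comment"><span class="comment-text">Please reconsider.</span></div>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
# The selector string is what you'd copy out of the Inspect view.
texts = [span.get_text() for span in soup.select("div.comment span.comment-text")]
```

If `soup.select(...)` comes back empty on the live page, that usually means the content is injected by JavaScript after load.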

Scraping multiple web pages help

2019-02-18 Thread Drake Gossi
Hello everyone, For a research project, I need to scrape a lot of comments from regulations.gov https://www.regulations.gov/docketBrowser?rpp=25&so=DESC&sb=commentDueDate&po=0&dct=PS&D=ED-2018-OCR-0064 But partly what's throwing me is the URLs of the comments. They aren't consistent. I
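The docketBrowser URL above does follow a pattern: `rpp` looks like results per page and `po` like the page offset, so consecutive listing pages should differ only in `po` (0, 25, 50, ...). A sketch of generating those page URLs, with the parameter meanings inferred from the URL itself rather than any documentation:

```python
from urllib.parse import urlencode

BASE = "https://www.regulations.gov/docketBrowser"

def browser_url(docket_id, page, per_page=25):
    """Build the docketBrowser listing URL for a given page by varying
    the `po` offset. Parameter meanings are inferred from the example URL."""
    params = {
        "rpp": per_page,            # results per page (25 in the example)
        "so": "DESC",               # sort order
        "sb": "commentDueDate",     # sort by
        "po": page * per_page,      # page offset: 0, 25, 50, ...
        "dct": "PS",                # document type, as in the example URL
        "D": docket_id,             # the docket ID
    }
    return f"{BASE}?{urlencode(params)}"

url = browser_url("ED-2018-OCR-0064", 2)
```

This only enumerates the listing pages; the individual comment URLs still have to be harvested from each page (or, more robustly, from the API).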