From 3dfb9dc757ca182b514d076483bd71cb0b9f41bf Mon Sep 17 00:00:00 2001
From: Ivan Kozik
Date: Tue, 20 Oct 2015 10:50:18 +0000
Subject: [PATCH] README: Remove instructions for grabbing a site that
 requires a cookie because the --header solution results in the cookie being
 sent to all domains instead of just the intended domain

---
 README.md | 20 --------------------
 1 file changed, 20 deletions(-)

diff --git a/README.md b/README.md
index e5c37b5..3b7ed83 100644
--- a/README.md
+++ b/README.md
@@ -245,26 +245,6 @@ You may want to pipe the output to `sort` and `less`:
 
 
 
-Grabbing a site that requires a cookie
----
-1. Log into the site in Chrome.
-2. Open the developer tools with F12.
-3. Switch to the **Network** tab of the developer tools.
-4. Hit F5 to reload the page. The developer tools will stay open and capture the HTTP requests.
-5. Scroll up in the list of network events and click on the first request.
-6. In the right pane, click the **Headers** tab.
-7. Scroll down to the **Request Headers** section.
-8. Copy the **Cookie:** value.
-9. Start grab-site with:
-
-```
-grab-site --wpull-args="--header=\"Cookie: COOKIE_VALUE\"" URL
-```
-
-Note: do **not** use `document.cookie` in the developer tools **Console** because it does not include `HttpOnly` cookies.
-
-
-
 Stopping a crawl
 ---
 You can `touch DIR/stop` or press ctrl-c, which will do the same. You will
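
For context, a domain-scoped alternative to the removed `--header` approach, as a sketch only: it assumes wpull accepts the wget-style `--load-cookies` option with a Netscape-format cookies.txt, whose entries each carry a domain field, so the cookie is only sent to the matching domain rather than to every host the crawl touches. The file path, domain, and cookie values below are illustrative and not part of this patch.

```
# Sketch, not part of the patch: assumes wpull supports wget-style --load-cookies.
# cookies.txt uses the Netscape format; each entry is scoped to a domain, e.g.:
#   example.com	FALSE	/	TRUE	1500000000	session_id	COOKIE_VALUE
# (fields: domain, include-subdomains, path, secure, expiry, name, value)
grab-site --wpull-args="--load-cookies=/path/to/cookies.txt" URL
```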