From: Mike Jackson
Subject: Re: advanced HTTP query (cookies)
Date: Fri, 26 Aug 2005 07:52:45 -0700
Hi all, I am interested in monitoring a web site in a little bit more depth. By testing certain functionality through the front end, I can infer that the service supporting the front end is running successfully. I'm also monitoring the back end directly, but I like to get it 'as the customer sees it.' In order to complete this HTTP query, I must include cookies in my request. The following wget line will successfully get as deep as I would like to go:

wget --cookies=off --header "Cookie: sh3=id%3D1970583178430e5e1b0a3b72.85050652%3Bcv%3D2%3Brv%3D3d3b01f6%3B; sh2=db%3De8744%3Bcso%3D430e5fcd%3Bslu%3D430e5e1b%3Bref%3Dsh%3B" http://www.simplyhired.com/index.php?ds=sr\&mj=1

I must then parse the returned page for "Jobs 1 - 1" to verify success.
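Putting the fetch and the check together, a minimal sketch as a shell script (the URL, cookie header, and success string are taken from the wget line above; the zero/non-zero exit codes are my assumption for how a monitoring tool would consume the result):

  #!/bin/sh
  # Fetch the page with the session cookies and test for the success marker.
  # URL and Cookie header are copied verbatim from the wget line above.
  URL='http://www.simplyhired.com/index.php?ds=sr&mj=1'
  COOKIE='sh3=id%3D1970583178430e5e1b0a3b72.85050652%3Bcv%3D2%3Brv%3D3d3b01f6%3B; sh2=db%3De8744%3Bcso%3D430e5fcd%3Bslu%3D430e5e1b%3Bref%3Dsh%3B'

  if wget -q -O - --cookies=off --header "Cookie: $COOKIE" "$URL" \
       | grep -q 'Jobs 1 - 1'; then
      exit 0    # marker found: the service behind the front end is up
  else
      exit 1    # marker missing: treat the service as down
  fi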
FWIW, both to you and the monit developers, cURL would be ideally suited to this task:
http://curl.haxx.se/

If you have to exec an external command, cURL makes it very easy to save cookies in files and send GET or POST data. I've used multiple calls to the command-line tool from a perl wrapper to log into several different sites and do whatever I needed to automate, all as if coming from a "real" browser; Amazon's associates site for one, which uses cookies, URL-based session IDs, and redirects to stymie that sort of automation. And there's libcurl for implementing these functions within monit itself. I'd offer to help, but uhh, perl's about as advanced as my programming skills get :)
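As a concrete sketch of what I mean (the login URL and form fields below are hypothetical placeholders, not any real site's; -c saves received cookies to a jar file, -b sends them back, -L follows redirects, -d turns the request into a POST):

  # Log in once and save any cookies the site sets.
  # (URL and form field names here are hypothetical placeholders.)
  curl -s -L -c cookies.txt \
       -d 'user=me' -d 'pass=secret' \
       'http://example.com/login'

  # Reuse the saved jar on later requests, updating it as the site
  # re-sets cookies, so the session looks like a "real" browser's.
  curl -s -L -b cookies.txt -c cookies.txt \
       'http://example.com/account' | grep -q 'expected text'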