Reputation: 4084
Sorry for the broad question, but I simply don't know how to better describe this, or even what the action is called. It is akin to a web scraper, but it also interacts with the website.
I'm not trying to do anything nefarious; I simply have 200 queries to run on a public website, and I really don't want to type them in one after another. Basically, I just want to use Unix tools (lynx, maybe? something like that?) or perhaps a Perl script to go to a website, enter text in the "search" field, press "go", and then save the entire page of results.
Thanks!
Upvotes: 0
Views: 436
Reputation:
perl -MWWW::Mechanize::Shell -e shell
and then use the commands get, fillout, open, and submit. When you are done with the web interaction, run script and save its output to whatever.pl.
Example session:
$ perl -MWWW::Mechanize::Shell -e shell
(no url)>get http://google.com/
http://www.google.pl/>fillout
(text)q> [] depesz
http://www.google.pl/>submit
200
http://www.google.pl/search?ie=ISO-8859-2&hl=pl&source=hp&q=depesz&gbv=1>script
#!/opt/perlbrew/perls/perl-5.18.0/bin/perl -w
use strict;
use WWW::Mechanize;
use WWW::Mechanize::FormFiller;
use URI::URL;
my $agent = WWW::Mechanize->new( autocheck => 1 );
my $formfiller = WWW::Mechanize::FormFiller->new();
$agent->env_proxy();
$agent->get('http://google.com/');
$agent->form_number(1) if $agent->forms and scalar @{$agent->forms};
$formfiller->add_filler( 'q' => Fixed => 'depesz' );
$formfiller->fill_form( $agent->current_form );
$agent->submit();
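To run all 200 queries, you can wrap the generated code in a loop. A minimal sketch, assuming one query per line in a file called queries.txt (the filename and the result-file naming scheme are my own choices, and the 'q' field name comes from the recorded session above):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Read one query per line from queries.txt (filename is an assumption).
open my $fh, '<', 'queries.txt' or die "Cannot open queries.txt: $!";
chomp( my @queries = <$fh> );
close $fh;

my $agent = WWW::Mechanize->new( autocheck => 1 );

my $n = 0;
for my $query (@queries) {
    $n++;
    $agent->get('http://google.com/');

    # Fill the first form's "q" field and submit it,
    # as in the recorded session above.
    $agent->submit_form(
        form_number => 1,
        fields      => { q => $query },
    );

    # Save the full result page, one file per query.
    open my $out, '>', sprintf( 'result-%03d.html', $n )
        or die "Cannot write result file: $!";
    print {$out} $agent->content;
    close $out;

    sleep 1;    # be polite to the server between requests
}
```

Each result page ends up in its own file (result-001.html, result-002.html, ...), so you can grep or post-process them afterwards.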
Upvotes: 4