Mateusz Urbański

Reputation: 7862

Web scraping with Capybara

I'm trying to fill in and submit a form with Capybara. I have code that looks like this:

session = Capybara::Session.new(:webkit)
session.visit "https://login.microsoftonline.com/login.srf?wa=wsignin1.0&rpsnv=4&ct=1463412429&rver=6.1.6206.0&wp=MCMBI&wreply=https:%2F%2Fwebpooldb41e06.infra.lync.com%2FPassiveAuth%2FPassiveAuth.aspx%3FredirectUrl%3Dhttps%253a%252f%252fwebpooldb41e06.infra.lync.com%252fScheduler%252f&lc=1033&id=266537"
session.fill_in('cred_userid_inputtext', :with => '[email protected]')
session.fill_in('cred_password_inputtext', :with => '12341234')
session.save_page

When I save the page to a file and open it, I see that the form wasn't filled in or submitted...

Any ideas?

Upvotes: 0

Views: 917

Answers (2)

Thomas Walpole

Reputation: 49890

#save_page saves the HTML of the page; it doesn't capture updated element properties, such as the values that get set when you fill in text fields. #save_screenshot should show the fields filled in, though.
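For example (a minimal sketch, assuming the same session as in the question; the filename and credentials are placeholders):

session.fill_in('cred_userid_inputtext', :with => 'user@example.com')
session.fill_in('cred_password_inputtext', :with => 'secret')
session.save_screenshot('filled_form.png')  # renders the page, so the typed-in values are visible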

Upvotes: 1

bkunzi01

Reputation: 4561

You aren't calling submit anywhere in that code; you're simply filling in field values and then calling save_page. After filling in the form, you need to click the submit element (for example with trigger('click')) to actually submit it. You can then call current_url on the session to check whether it reached the next page. In Ruby that's simply:

puts session.current_url

If everything worked, your console will show the URL of the page reached after the form submission.
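Putting it together (a minimal sketch, assuming the same session as in the question; it uses Capybara's standard click instead of trigger, and the '#cred_sign_in_button' selector is a guess that may differ on the actual login page):

session.fill_in('cred_userid_inputtext', :with => 'user@example.com')
session.fill_in('cred_password_inputtext', :with => 'secret')
session.find('#cred_sign_in_button').click  # submit the form
puts session.current_url                    # should print the post-login URL if sign-in succeeded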

Upvotes: 2
