Reputation: 391
My specs pass on the master branch. If I create a new branch and modify code completely unrelated to subscriptions, they fail with the error below. The only way I can get them to pass is to change my vcr.rb to use :record => :new_episodes.
If I leave that option on, then almost every time my specs run I end up with modified cassette data files that get committed, which really clutters the Git history.
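For reference, my vcr.rb is roughly along these lines (a simplified sketch; hook_into :webmock stands in for whatever stubbing adapter is actually configured, and the cassette directory and record mode match the error output below):

require 'vcr'

VCR.configure do |c|
  c.cassette_library_dir = 'spec/data'   # cassettes end up under spec/data, as in the error below
  c.hook_into :webmock                   # assumption: WebMock is the HTTP stubbing library
  c.configure_rspec_metadata!            # enables the vcr: metadata used on the describe blocks
  c.default_cassette_options = { :record => :once }   # switching this to :new_episodes "fixes" the failures
end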
Any suggestions on how to handle this? A lot of the specs that break use this matcher:
describe "#change_plan_to", vcr: {match_requests_on: [:method, :uri, :body]} do
Is this matcher too sensitive to changes? I wasn't able to get the specs to pass any other way with the Stripe API calls.
Failure/Error: @subscription.create_stripe_customer
VCR::Errors::UnhandledHTTPRequestError:
================================================================================
An HTTP request has been made that VCR does not know how to handle:
POST https://api.stripe.com/v1/customers
VCR is currently using the following cassette:
- /Users/app/spec/data/Subscription/_change_plan_to/stripe_customer_subscription_plan_/name/.json
- :record => :once
- :match_requests_on => [:method, :uri, :body]
Under the current configuration VCR can not find a suitable HTTP interaction
to replay and is prevented from recording new requests. There are a few ways
you can deal with this:
* If you're surprised VCR is raising this error
and want insight about how VCR attempted to handle the request,
you can use the debug_logger configuration option to log more details [1].
* You can use the :new_episodes record mode to allow VCR to
record this new request to the existing cassette [2].
* If you want VCR to ignore this request (and others like it), you can
set an `ignore_request` callback [3].
* The current record mode (:once) does not allow new requests to be recorded
to a previously recorded cassette. You can delete the cassette file and re-run
your tests to allow the cassette to be recorded with this request [4].
* The cassette contains 109 HTTP interactions that have not been
played back. If your request is non-deterministic, you may need to
change your :match_requests_on cassette option to be more lenient
or use a custom request matcher to allow it to match [5].
[1] https://www.relishapp.com/vcr/vcr/v/2-5-0/docs/configuration/debug-logging
[2] https://www.relishapp.com/vcr/vcr/v/2-5-0/docs/record-modes/new-episodes
[3] https://www.relishapp.com/vcr/vcr/v/2-5-0/docs/configuration/ignore-request
[4] https://www.relishapp.com/vcr/vcr/v/2-5-0/docs/record-modes/once
[5] https://www.relishapp.com/vcr/vcr/v/2-5-0/docs/request-matching
================================================================================
# ./app/models/subscription.rb:83:in `create_stripe_customer'
# ./spec/models/subscription_spec.rb:68:in `block (3 levels) in <top (required)>'
# -e:1:in `<main>'
I've figured out a bit more: the specs only break when I add a new spec to the suite. Why would adding more specs cause the existing ones to fail?
Upvotes: 4
Views: 7625
Reputation: 21810
The behavior you are seeing suggests that one of the attributes used to match requests is non-deterministic and changes each time you run your tests. You mention using the match_requests_on: [:method, :uri, :body] option -- I'm guessing the culprit is the body. Bear in mind that VCR's built-in body matcher does a direct body_string == body_string comparison, and it's easy to have situations where the bodies are semantically equivalent but are not the same string. For example, a JSON string like {"a": 1, "b": 2} vs. {"b": 2, "a": 1} will not match.
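If you do need body matching for some requests, one option is to register a custom matcher that compares parsed JSON rather than raw strings, so key ordering no longer matters. A sketch (the :json_body name is arbitrary):

require 'json'

VCR.configure do |c|
  # Compare request bodies as parsed JSON so key order and whitespace
  # differences don't cause a mismatch; fall back to a plain string
  # comparison for bodies that aren't valid JSON.
  c.register_request_matcher :json_body do |request_1, request_2|
    begin
      JSON.parse(request_1.body) == JSON.parse(request_2.body)
    rescue JSON::ParserError
      request_1.body == request_2.body
    end
  end
end

You would then use vcr: { match_requests_on: [:method, :uri, :json_body] } on the example group in place of :body.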
My suggestion, though, is not to match on the body at all. It's there if you need it in certain situations, but more lenient matching usually works fine: VCR records the HTTP interactions in the order they originally occur, and during playback it plays back the first unused matching interaction, so as long as your test makes its requests in the same order as before, each request gets the correct response.
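Concretely, that means trimming the cassette options on your example group to something like this:

# Match on method and URI only; playback order takes care of the rest.
describe "#change_plan_to", vcr: { match_requests_on: [:method, :uri] } do
  # examples unchanged
end

Since :method and :uri are VCR's default matchers, you could also drop the match_requests_on option entirely and get the same behavior.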
To get more insight into exactly what's happening, you can use the debug logger option, which prints detailed output about how VCR attempts to match each request and why it makes the decisions it does.
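Enabling it is a one-line addition to the VCR configuration (the log path here is just an example):

VCR.configure do |c|
  # Log VCR's request-matching decisions so you can see which recorded
  # interactions were considered for each request and why they didn't match.
  c.debug_logger = File.open('log/vcr_debug.log', 'w')
end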
Upvotes: 5