Reputation: 4875
I have a data versioning system implemented as a point-in-time architecture in an RDBMS. I am writing a servlet-based API to expose some functionality regarding this data. The API will return datapoints to the user, allow the user to tag data for removal, and allow a superuser to accept or reject these data modification requests, also via API calls.
Here's the question. I have dealt with some large and noteworthy APIs with very diverse feature sets in which all API calls are HTTPS GETs, and that is how I plan to do this. I know, I know: in a perfect world, if you are developing a resource-oriented product you should design an ROA implemented as a REST interface. However, the client really wants a more hybrid, RPC-style interface for readability and a low learning curve. If I do everything in terms of GET to get the API calls into the format the client wants, is this a bad thing? Is something going to come back and bite me in the rear end later? Are there bad implications for future API additions or maintenance? If there are flagrant pitfalls to this approach, the client can be swayed in another direction without too much fuss.
One of the reasons I don't just want to do REST and use the GET/POST/etc. HTTP verbs is that lower-privilege users can only make change requests. These linger until a higher-privilege user okays or rejects them.
Example call: somehost/?Method=GetOutlierData&SiteId=112-1&TimeInterval=2011-01-01_00:00:00--2011-01-01_23:59:59&ValidDate=2011-02-01_12:00:00&ReturnType=RecordId&Requester=13&Password=secret&ApiKey=19483
Response: Return=0&RecordsIds=
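Here's a rough sketch, in servlet terms, of the dispatch style I have in mind. GetOutlierData is from the example above; TagForRemoval and the response building are just placeholders:

```java
import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Everything arrives as a GET and is dispatched on the Method parameter.
public class ApiServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String method = req.getParameter("Method");
        if ("GetOutlierData".equals(method)) {
            // read-only query: harmless as a GET
            resp.getWriter().write("Return=0&RecordsIds=" /* + the matching record ids */);
        } else if ("TagForRemoval".equals(method)) {
            // state-changing call tunneled over the same GET endpoint
            resp.getWriter().write("Return=0");
        } else {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST, "Unknown Method");
        }
    }
}
```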
Another area where I'm not sure my proposal is a good idea is authentication. Every call includes the calling user's credentials (to enforce roles -- only some API features are available to certain users). The API will only process calls from hosts on a whitelist, so the design implies that the client will set up a single endpoint to route all of their organization's requests through, and that this endpoint will supply the secret API key along with every API call. This prevents users from sending their own unapproved calls directly to the API. There will be some rate-limiting and banning features implemented internally to prevent intentional and accidental DoS. Since we are operating over SSL, is this an adequate way to do things?
Example call: somehost/?Method=blah&...&Requester=13&Password=secret&ApiKey=1298593
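The whitelist and API-key check would sit in a filter in front of the API servlet, roughly like this (the host and key values are placeholders, and the per-user credentials would still be checked against the role tables afterwards):

```java
import java.io.IOException;
import java.util.Collections;
import java.util.Set;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Rejects any call that does not come from a whitelisted host
// or does not carry the expected ApiKey parameter.
public class ApiAccessFilter implements Filter {
    private static final Set<String> ALLOWED_HOSTS = Collections.singleton("203.0.113.10");
    private static final String EXPECTED_API_KEY = "19483"; // the organization's secret key

    @Override
    public void init(FilterConfig config) throws ServletException {
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest req = (HttpServletRequest) request;
        HttpServletResponse resp = (HttpServletResponse) response;

        boolean hostAllowed = ALLOWED_HOSTS.contains(req.getRemoteAddr());
        boolean keyValid = EXPECTED_API_KEY.equals(req.getParameter("ApiKey"));

        if (hostAllowed && keyValid) {
            chain.doFilter(request, response); // hand the call on to the API servlet
        } else {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN);
        }
    }

    @Override
    public void destroy() {
    }
}
```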
Upvotes: 1
Views: 301
Reputation: 169373
GET requests are defined by HTTP to be safe and idempotent.
Browsers, bots, and crawlers will all assume that sending a GET request is a safe action that does not delete or alter data.
Using GET for calls that change data goes against the design of HTTP.
In an ideal world you convince your client that they want a REST service.
Failing that, require any non-idempotent calls to be POST rather than GET.
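A rough sketch of that compromise, keeping your Method parameter but refusing to run state-changing methods over GET (the write method names are placeholders based on your description):

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Keep the RPC-style Method parameter, but only accept methods that
// change data when they arrive as a POST.
public class HybridApiServlet extends HttpServlet {
    // Placeholder names for the non-idempotent methods.
    private static final Set<String> WRITE_METHODS = new HashSet<String>(
            Arrays.asList("TagForRemoval", "ApproveRequest", "RejectRequest"));

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String method = req.getParameter("Method");
        if (WRITE_METHODS.contains(method)) {
            // 405 tells the caller to re-send this call as a POST
            resp.sendError(HttpServletResponse.SC_METHOD_NOT_ALLOWED,
                    method + " must be sent as a POST");
            return;
        }
        handle(method, req, resp); // safe, read-only methods stay on GET
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        handle(req.getParameter("Method"), req, resp);
    }

    private void handle(String method, HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // dispatch on the method name exactly as before
        resp.getWriter().write("Return=0");
    }
}
```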
Upvotes: 1
Reputation: 54790
There's nothing inherently 'bad' about using GET requests for non-idempotent operations; SOAP, for example, tunnels all of its operations through a single verb (POST). Some web browsers, however, will treat a link that uses GET as idempotent and try to pre-fetch it for you. That is of course a bad thing if following that link deletes a row in your back-end database.
Upvotes: 2