G. Bunino

Reputation: 13

Number of dummy points in spatstat

I'm fitting an inhomogeneous Poisson model to a spatial point pattern dataset with the function ppm (spatstat package), which uses the Berman-Turner quadrature approximation to estimate the parameters by maximum likelihood. By default, for computational reasons, the dummy points are arranged in a rectangular grid, and the quadrature weights are determined by dividing the observation window into a grid of rectangular tiles (the number of tiles being equal to the number of dummy points).
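
For reference, this is a minimal sketch of what I'm doing (the intensity function and the model formula are just placeholders):

    library(spatstat)

    # simulate an inhomogeneous Poisson pattern on the unit square
    # (the intensity function here is only a placeholder)
    X <- rpoispp(function(x, y) 200 * exp(-2 * x), win = square(1))

    # fit a log-linear inhomogeneous Poisson model;
    # ppm builds a default Berman-Turner quadrature scheme internally
    fit <- ppm(X ~ x + y)

    # extract and summarise the quadrature scheme actually used
    # (data points, dummy points and quadrature weights)
    Q <- quad.ppm(fit)
    summary(Q)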

Considering a square window, the number of dummy points generated automatically appears to be a piecewise constant function that depends only on the number of data points (and not on the size of the window); more precisely:

  number of   │  number of
 data points  │ dummy points
 (intervals)  │  generated
──────────────────────────────
    0 -  225  │ 1028
  226 -  400  │ 1604   (4*401)
  401 -  625  │ 2504   (4*626)
  626 -  900  │ 3604   (4*901)
  901 - 1225  │ 4904  (4*1226)
     etc.     │     etc.
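
A quick way to reproduce these counts (assuming the default dummy pattern is the one produced by default.dummy, the helper that quadscheme calls when no dummy points are supplied) is to call it directly for different numbers of data points:

    library(spatstat)

    # count the dummy points generated by the default rule for a pattern
    # of n points in the unit square (exact counts may differ slightly
    # between spatstat versions)
    n.dummy <- function(n) {
      X <- runifpoint(n, win = square(1))
      npoints(default.dummy(X))
    }

    sapply(c(100, 225, 226, 400, 401), n.dummy)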

My questions

Upvotes: 1

Views: 201

Answers (1)

Adrian Baddeley

Reputation: 2973

I am the author of this code. The code is extensively documented in help(ppm) and help(quadscheme). These help files give references to papers and books that discuss the design of quadrature schemes, including the papers by Berman and Turner (1992) and Baddeley and Turner (2000), and the book by Baddeley, Rubak and Turner (2015, see chapter 9).

The main reasons for choosing these default rules are:

  • For all contributed R packages, the default settings must allow the code to be tested by CRAN in a reasonable time (at most 5 seconds per help page, and on average 1 second per help page) on a variety of hardware. This means that the default rules must be simple.
  • It is desirable that fitting ppm to slightly different point patterns gives similar results. This is easier to achieve if the quadrature schemes are similar, so the default rules should be stable.

In the default rule, the number of tiles is not equal to the number of dummy points. The rule is designed so that, as the number of data points increases, the number of dummy points per tile will slowly increase.
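
For example, a quadrature scheme in which the dummy grid and the tile grid are specified separately can be constructed along the following lines (the point pattern and the particular values of nd and ntile are arbitrary, chosen only for illustration):

    library(spatstat)
    X <- runifpoint(300, win = square(1))   # placeholder point pattern

    # dummy points on a 64 x 64 grid; quadrature weights from a 32 x 32 tile grid
    # (the two grids need not agree)
    D <- default.dummy(X, nd = 64)
    Q <- quadscheme(X, D, method = "grid", ntile = 32)

    # ppm accepts a quadrature scheme in place of a point pattern
    fit.custom <- ppm(Q ~ x + y)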

These are only the default rules; for a definitive analysis, we encourage users to develop their own quadrature schemes, or at least to increase the density of dummy points.
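
For instance, one simple sensitivity check is to refit the model with a denser dummy grid, using the nd argument of ppm, and compare the estimates (the point pattern and the grid size 128 below are arbitrary):

    library(spatstat)
    X <- runifpoint(300, win = square(1))   # placeholder point pattern

    # refit with a denser grid of dummy points and compare coefficients
    fit.default <- ppm(X ~ x + y)
    fit.dense   <- ppm(X ~ x + y, nd = 128)

    coef(fit.default)
    coef(fit.dense)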

Upvotes: 1
