user1971707

Reputation: 13

do_gettimeofday() in Beaglebone giving wrong time

I am trying to measure the period of a square wave on a Beaglebone running Angstrom OS. I have written a kernel driver that registers an ISR in which I time the pulses. Everything works, but the interval being measured is completely wrong. I'm using the do_gettimeofday() function to measure the time. When I do the same in a userspace program using poll(), I get correct values (around 1007 us for a 1000 us wave), but when I measure the pulse in the driver, I get an interval of 1923 us. I have no idea why the interval measured in the kernel is higher than the one measured in user space. I have attached my code below and would be grateful if someone could spot the mistake.

kernel ISR:

/* c and prev are static struct timeval globals */
static struct timeval c, prev;

static irqreturn_t ISR(int irq, void *dev_id)
{
    prev = c;
    do_gettimeofday(&c);

    /* prints only the microsecond part of the difference */
    printk(KERN_ALERT "%ld", (c.tv_usec - prev.tv_usec));
    return IRQ_HANDLED;
}
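As an aside, here is a minimal sketch of how the same delta could be taken with ktime_get(), which reads a monotonic clock and avoids the wrap that happens when only tv_usec is subtracted across a second boundary. The function and variable names are my own, not from the original driver:

#include <linux/interrupt.h>
#include <linux/ktime.h>
#include <linux/printk.h>

static ktime_t last_edge;   /* timestamp of the previous edge */

static irqreturn_t edge_isr(int irq, void *dev_id)
{
    ktime_t now = ktime_get();                          /* monotonic clock */
    s64 delta_us = ktime_to_us(ktime_sub(now, last_edge));

    last_edge = now;
    printk(KERN_INFO "delta: %lld us\n", (long long)delta_us);
    return IRQ_HANDLED;
}

Note also that printk() itself is relatively slow (especially with a serial console), so printing from inside the ISR can distort the very interval being measured; storing the deltas and reading them out later is a common alternative.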

userspace prog:

/* start and prev are struct timeval; fdset is a struct pollfd set up
   beforehand on the GPIO value file with POLLPRI; buf, len and rc as usual */
while (1) {
    prev = start;
    gettimeofday(&start, NULL);
    rc = poll(&fdset, 1, 20000);
    if (rc < 0) {
        printf("Error in rc\n");
        return -1;
    }

    if (rc == 0) {
        printf("Timed out\n");
        return -1;
    }

    if (fdset.revents & POLLPRI) {
        len = read(fdset.fd, buf, 2);
        printf("%ld\n", (start.tv_usec - prev.tv_usec));
    }
}
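For comparison, a sketch of a delta computation that includes tv_sec, so a second rollover does not corrupt the result. The helper name is hypothetical, not part of the original program:

#include <sys/time.h>

/* Hypothetical helper: full microsecond difference between two timevals,
 * including the seconds part, so a second boundary does not wrap the value. */
static long timeval_diff_us(const struct timeval *a, const struct timeval *b)
{
    return (a->tv_sec - b->tv_sec) * 1000000L + (a->tv_usec - b->tv_usec);
}

/* usage inside the poll loop: */
/* printf("%ld\n", timeval_diff_us(&start, &prev)); */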

Upvotes: 0

Views: 1615

Answers (1)

Jadon

Reputation: 51

For profiling interrupt latency, I find it quite useful to be lazy: toggle a GPIO pin in the handler and measure the timing with an oscilloscope. Probably not the answer you want, but it might help you over this hurdle quickly.
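A minimal sketch of that idea, assuming the legacy integer GPIO API and a made-up pin number (DEBUG_GPIO) that is free on your board:

#include <linux/gpio.h>
#include <linux/interrupt.h>

#define DEBUG_GPIO 60   /* hypothetical pin; pick one unused on your board */

/* in module init, after gpio_request(DEBUG_GPIO, "isr-probe"): */
/* gpio_direction_output(DEBUG_GPIO, 0); */

static irqreturn_t probed_isr(int irq, void *dev_id)
{
    gpio_set_value(DEBUG_GPIO, 1);   /* scope sees a rising edge on entry */
    /* ... existing measurement work ... */
    gpio_set_value(DEBUG_GPIO, 0);   /* falling edge on exit */
    return IRQ_HANDLED;
}

The pulse width on the scope then shows the handler's execution time, and the spacing between pulses gives the true edge-to-edge period, independent of any clock read inside the kernel.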

Upvotes: 1
