randall Z

Reputation: 123

How do PHP 'daemons' work?

I'm learning PHP and I'd like to write a simple forum monitor, but I've run into a problem. How do I write a script that downloads a file regularly? When the page is loaded, the PHP is executed just once, and if I put it into a loop, it would all have to run before the page finished loading. But I want to, say, download a file every minute and show a notification on the page when the file changes. How do I do this?

Upvotes: 1

Views: 233

Answers (3)

Rob

Reputation: 48369

Others have already suggested using a periodic cron script, which I'd say is probably the better option, though as Paul mentions, it depends upon your use case.

However, I just wanted to address your question directly, which is to say, how does a daemon in PHP work? The answer is that it works in the same way as a daemon in any other language - you start a process which doesn't end immediately, and put it into the background. That process then polls files or accepts socket connections or somesuch, and in so doing, accepts some work to do.
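
To make that concrete, here's a minimal sketch of such a polling daemon (the URL and file path are placeholders, and it assumes the pcntl extension, which PHP CLI builds usually include):

    <?php
    // monitor.php -- a minimal daemon sketch: a long-running loop that
    // is started from the shell and pushed into the background.
    // Assumes the pcntl extension; URL and paths are placeholders.

    $running  = true;
    $lastHash = null;

    // Ask the loop to stop cleanly on SIGTERM or SIGINT.
    $stop = function () use (&$running) { $running = false; };
    pcntl_signal(SIGTERM, $stop);
    pcntl_signal(SIGINT, $stop);

    while ($running) {
        // Poll for work: download the file and check whether it changed.
        $contents = @file_get_contents('https://example.com/forum/latest.html');

        if ($contents !== false && md5($contents) !== $lastHash) {
            $lastHash = md5($contents);
            // Record the change where a web page can pick it up later.
            file_put_contents('/tmp/forum-changed.txt', date('c') . "\n");
        }

        sleep(60);               // wait a minute before the next poll
        pcntl_signal_dispatch(); // run any pending signal handlers
    }

You'd start it from a shell and put it into the background with something like 'php monitor.php &'; sending it SIGTERM lets the loop finish cleanly.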

(This is obviously a somewhat simplified overview, and of course you'd typically need to have mechanisms in place for process management, signalling the service to shut down gracefully, and perhaps integration with the operating system's daemon management, etc., but the basics are pretty much the same.)

Upvotes: 1

Luuk

Reputation: 14929

How do I write a script that downloads a file regularly?

There are schedulers to do that, like 'cron' on Linux (or Unix).

When the page is loaded, the php is executed just once,

Just once, just like the index.php of your site...

If you want to update a page which is shown in a browser, then you should use some form of AJAX; if you want something else, then your question is not clear to me...
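
For the AJAX route, the page would periodically poll a small PHP endpoint. A minimal sketch, assuming a background job (like the one in Rob's answer) writes its state to a file -- the filename and endpoint are hypothetical:

    <?php
    // check.php -- hypothetical endpoint for the page to poll via AJAX.
    // It reports when the monitored state file (placeholder path,
    // written by the background job) last changed.
    header('Content-Type: application/json');

    $stateFile = '/tmp/forum-changed.txt';
    $mtime     = is_file($stateFile) ? filemtime($stateFile) : null;

    echo json_encode([
        'changed'     => $mtime !== null,
        'last_change' => $mtime !== null ? date('c', $mtime) : null,
    ]);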

Upvotes: 0

Pascal MARTIN
Pascal MARTIN

Reputation: 401022

Typically, you'll act in two steps:

  • First, you'll have a PHP script that will run every minute -- using the crontab (see the sketch after this list)
    • This script will do the heavy job: downloading and parsing the page
    • And storing some information in a shared location -- a database, typically
  • Then, your webpages will only have to check in that shared location (database) if the information is there.
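
A minimal sketch of that cron-side script -- the URL, database file, and table are placeholders, and SQLite via PDO stands in for whatever database you use:

    <?php
    // fetch.php -- sketch of the cron-side script. URL, database file
    // and table are placeholders; SQLite via PDO stands in for any DB.

    $pdo = new PDO('sqlite:/tmp/monitor.db');
    $pdo->exec('CREATE TABLE IF NOT EXISTS pages (
        url        TEXT PRIMARY KEY,
        hash       TEXT,
        fetched_at TEXT
    )');

    $url      = 'https://example.com/forum/latest.html';
    $contents = file_get_contents($url); // the heavy job: download the page

    if ($contents !== false) {
        // Store a fingerprint of the page plus the fetch time.
        $stmt = $pdo->prepare(
            'REPLACE INTO pages (url, hash, fetched_at) VALUES (?, ?, ?)'
        );
        $stmt->execute([$url, md5($contents), date('c')]);
    }

A crontab entry such as '* * * * * php /path/to/fetch.php' (path hypothetical) would run it every minute.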


This way, your webpages will always work:

  • Even if there are many users, only the cronjob will download the page
  • And even if the cronjob doesn't work for a while, the webpage will still work (see the sketch below); the worst that can happen is some information being outdated.
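
On the webpage side, the check then becomes a cheap read from that shared location. A sketch under the same assumptions as above (hypothetical SQLite file and table):

    <?php
    // page.php -- sketch of the webpage side: it only reads the shared
    // location the cron job writes to (same hypothetical SQLite file).
    $pdo = new PDO('sqlite:/tmp/monitor.db');

    $stmt = $pdo->prepare('SELECT hash, fetched_at FROM pages WHERE url = ?');
    $stmt->execute(['https://example.com/forum/latest.html']);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row) {
        echo "Last fetched: {$row['fetched_at']} (hash {$row['hash']})";
    } else {
        echo "No data yet -- the cron job hasn't run."; // worst case: stale or empty
    }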

Upvotes: 1
