AnUser1

Reputation: 79

Alternative to MySQL Events (not cron job)

Five minutes after a specific user action occurs, I need to change some data in my database. Until now I have developed on my home server and used MySQL Events for this, which worked perfectly. But the provider that now hosts my website restricts the use of MySQL Events.

Now I have to come up with an alternative to MySQL Events.

I thought about executing a PHP script that changes the data in the background using exec("php script.php &> /dev/null &"), which actually works (at least on my home server), but that approach is also restricted by my provider.

Now I'm really out of ideas and need your help: how can I make this work?

Thank you!

Christof

EDIT: Cronjobs don't do the trick in my case, since I have to execute the script only if the specific user action has occurred.

Upvotes: 1

Views: 516

Answers (2)

O. Jones

Reputation: 108641

Ordinarily, the best way to handle this sort of suspense-file database item is to use an effective-time strategy. The best thing about this strategy is this: it doesn't depend on the timing accuracy of your events, or cronjobs, or whatever.

You didn't say what needs to happen to your data five minutes after the user action, so I'll have to guess.

Suppose the operation is to remove a row from table_a and insert a row into table_b. In that case, you'd add a DATETIME or TIMESTAMP column effective_until to table_a and another column effective_after to table_b.

When the user action occurs, you do something like

 UPDATE table_a 
    SET effective_until = NOW() + INTERVAL 5 MINUTE
  WHERE id='whatever';

and

 INSERT INTO table_b  (id, data, effective_after)
               VALUES (whatever, whatelse, NOW() + INTERVAL 5 MINUTE);

Then, later, whenever you use table_a you say

 SELECT ... FROM table_a WHERE (effective_until > NOW() OR effective_until IS NULL) ;

and when you use table_b you say

 SELECT ... FROM table_b WHERE effective_after <= NOW();

Once a day, or once a week, or whatever, you can purge the dead stuff from table_a with something like this.

  DELETE FROM table_a WHERE effective_until <= NOW();

The trick is to mark your rows with timestamps instead of relying on precisely timed operations. Much more robust and reliable. Resilient when the cron jobs fail for a whole weekend. Also, compatible with cheap hosting services and uncooperative DBAs, both of which the world has in abundance.
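
To make that concrete in PHP, here is a minimal sketch assuming a PDO connection and the hypothetical table and column names above ($id and $data are assumed to come from your user-action handler); it illustrates the idea, it is not a drop-in implementation.

    <?php
    // Sketch only: $pdo is an existing PDO connection; table_a, table_b,
    // effective_until, and effective_after follow the hypothetical schema
    // described above; $id and $data come from the user action.

    // When the user action occurs, schedule the change for 5 minutes later.
    $pdo->beginTransaction();

    $stmt = $pdo->prepare(
        'UPDATE table_a
            SET effective_until = NOW() + INTERVAL 5 MINUTE
          WHERE id = :id'
    );
    $stmt->execute(['id' => $id]);

    $stmt = $pdo->prepare(
        'INSERT INTO table_b (id, data, effective_after)
              VALUES (:id, :data, NOW() + INTERVAL 5 MINUTE)'
    );
    $stmt->execute(['id' => $id, 'data' => $data]);

    $pdo->commit();

    // Whenever you read, let the timestamps decide what counts as "live".
    $liveA = $pdo->query(
        'SELECT * FROM table_a
          WHERE effective_until > NOW() OR effective_until IS NULL'
    )->fetchAll();

    $liveB = $pdo->query(
        'SELECT * FROM table_b WHERE effective_after <= NOW()'
    )->fetchAll();

Note that nothing ever fires at the five-minute mark; the rows simply start or stop counting as "live" once the clock passes their timestamps.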

Upvotes: 1

Drew

Reputation: 24949

I would consider the following, as you have PHP tagged here.

Create an account with AWS EC2 and start with their Free Tier for one year. This will let you create an AMI (Amazon Machine Image) of your choosing, typically Linux, though it can be a Windows variant. Tweak it and save it. Linux is the normal choice due to the limited resources (memory) granted. It will be a micro instance type, free for one year.

Note that I have no affiliation with Amazon; it could just as easily be something from Azure/Microsoft or Google. However, I have used EC2 since 2008, and a reasonable person would conclude that they are several years ahead of their competition, which might suggest why OpenShift and others punted and simply layer on top of them anyway. One should read up on IaaS vs. PaaS to understand that there are complexities in setting up an IaaS solution; it is not something that a non-technical person ought to have recommended to them.

On the PaaS side, I would go with OpenShift. There is a lot to be said for it.

I shoot for IaaS because I want to be able to install anything on the virtual server, and I want my code to listen on any port with my choice of protocols.

If one then has an EC2 or equivalent virtual server, one can use the programming language of one's choice (an agent running in Java, C#, Python, etc.) with a timer that calls your shared-host PHP (with authentication over HTTPS) to trigger the event-like mechanism on the shared host. Or, better yet, have your MySQL instance running on EC2 or RDS.

Reasons to consider the above include:

  • Cutting your teeth on a platform you are likely to use later
  • It beats the power failures of the old laptop in your closet that you could use otherwise
  • You might end up on the cloud anyway and want these agents to perform Other Tasks
  • By getting off of shared hosts, you no longer suffer from cron or Event restrictions

As for Other Tasks, you will find that Events have limitations. One of the biggest is the inability to call LOAD DATA INFILE from stored programs; I use agents for that. Your agents can start at OS startup.
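
To illustrate, here is a hedged PHP sketch of such an agent-side import; the DSN, credentials, CSV path, and staging_table name are placeholders, and local_infile must be enabled on both the client and the server.

    <?php
    // Hypothetical agent-side import. An Event or other stored program cannot
    // run LOAD DATA INFILE itself, but an ordinary client connection can.
    // DSN, credentials, file path, and table name are placeholders.
    $pdo = new PDO(
        'mysql:host=db.example.com;dbname=mydb;charset=utf8mb4',
        'agent_user',
        'agent_password',
        [PDO::MYSQL_ATTR_LOCAL_INFILE => true]  // needed for LOCAL INFILE
    );

    $pdo->exec("
        LOAD DATA LOCAL INFILE '/var/data/import.csv'
        INTO TABLE staging_table
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
    ");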

After the trial ends, one can get by with micro instances while still on the shared host, integrating it with the more capable virtual server. I estimate it at $12/month, perhaps cheaper, for agents running in the cloud 24/7, pointing wherever and doing whatever is needed. Less powerful instance types are possible; I cannot speak to the power, or lack thereof, of nano instance types.

Note that an agent can embed the MySQL client libraries (the program does a using or import and acts as a MySQL client), in which case it connects directly to port 3306. That might work fine for an AWS-only solution, but exposing the database server to the internet is a security risk; if in doubt, look at the general query log to see the attacks that mount. Best practice is to shield your DB server behind the firewall, which is why an API such as one exposed through PHP is highly preferred. So there is nothing keeping you from writing a 100-line agent that does an authenticated POST over HTTPS to your shared-host PHP, and you take it from there.
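
For instance, a minimal PHP sketch of such an agent; the endpoint URL, the script name run-pending.php, and the token scheme are illustrative assumptions, not a prescribed API.

    <?php
    // Hypothetical agent loop: every minute, ping the shared-host endpoint so
    // it can apply any changes whose effective time has passed. The URL and
    // token are placeholders; use a real secret and HTTPS in practice.
    const ENDPOINT = 'https://www.example.com/run-pending.php';
    const TOKEN    = 'replace-with-a-long-random-secret';

    while (true) {
        $ch = curl_init(ENDPOINT);
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => http_build_query(['token' => TOKEN]),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 30,
        ]);
        $response = curl_exec($ch);
        if ($response === false) {
            error_log('Agent call failed: ' . curl_error($ch));
        }
        curl_close($ch);

        sleep(60);  // the agent plays the role the MySQL Event used to play
    }

On the shared host, the receiving script would check the token and then run whatever UPDATE/DELETE the Event used to run, or simply rely on the effective-time columns from the other answer.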

Other options include creating SSH Tunnels. But I digress.

Upvotes: 1
