Reputation: 6449
Is it possible to get the content of a URL with PHP (using some sort of function like file_get_contents or header) but only after the execution of some JavaScript code?
Example:
mysite.com has a script that does loadUrlAfterJavascriptExec('http://exampletogetcontent.com/') and prints/echoes the content. Imagine that some jQuery runs on http://exampletogetcontent.com/ that changes the DOM, and loadUrlAfterJavascriptExec will get the resulting HTML.
Can we do that?
Just to be clear, what I want is to get the content of a page through a URL, but only after the JavaScript on the target page (the one whose content PHP is fetching) has run.
I am aware that PHP runs before the page is sent to the client and JS only after that, but I thought there might be an expert workaround.
Upvotes: 21
Views: 41427
Reputation: 1734
I think the easiest and best way is to use the spatie/browsershot package (https://github.com/spatie/browsershot). Install it along with its dependencies, then use the code below:
Browsershot::url('https://example.com')->bodyHtml();
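For reference, a slightly fuller sketch of the same idea. This assumes the package is installed via Composer with a working headless Chrome/Chromium; the one-second delay (an addition, not part of the answer above) gives the page's JavaScript time to modify the DOM before the HTML is captured:

```php
<?php
require 'vendor/autoload.php';

use Spatie\Browsershot\Browsershot;

// Render the page in headless Chrome, wait 1 second so client-side
// JavaScript can run, then return the resulting body HTML.
$html = Browsershot::url('https://example.com')
    ->setDelay(1000)   // milliseconds to wait before capturing
    ->bodyHtml();

echo $html;
```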
Upvotes: 2
Reputation: 821
I found a fantastic page on this: an entire tutorial on how to process, in PHP, the DOM of a page that is entirely created using JavaScript.
https://www.jacobward.co.uk/using-php-to-scrape-javascript-jquery-json-websites/
Note, however, that "PhantomJS development is suspended until further notice", so that option is no longer a good one.
Upvotes: 3
Reputation: 5208
Update 2: adds more details on how to use phantomjs from PHP.
Update 1 (after clarification that the JavaScript on the target page needs to run first):
1. Download phantomjs and place the executable in a path that your PHP binary can reach.
2. Place the following 2 files in the same directory:
get-website.php
<?php
$phantom_script = dirname(__FILE__) . '/get-website.js';
// shell_exec() returns the script's full output; exec() would return only its last line.
$response = shell_exec('phantomjs ' . escapeshellarg($phantom_script));
echo htmlspecialchars($response);
?>
get-website.js
var webPage = require('webpage');
var page = webPage.create();
page.open('http://google.com/', function(status) {
    console.log(page.content);
    phantom.exit();
});
3. Browse to get-website.php and the contents of the target site, http://google.com, will be returned after its inline JavaScript has executed. You can also call this from the command line using php /path/to/get-website.php.
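The same idea can be parameterized so the URL is not hard-coded in the PhantomJS script. A sketch, under the assumptions that phantomjs is on your PATH and that the JS script reads the URL via require('system').args[1] (the helper function name here is illustrative):

```php
<?php
// Sketch: pass the target URL from PHP to the PhantomJS script as an argument.
function buildPhantomCommand(string $script, string $url): string
{
    // escapeshellarg() guards against shell injection from user-supplied URLs.
    return 'phantomjs ' . escapeshellarg($script) . ' ' . escapeshellarg($url);
}

$script   = dirname(__FILE__) . '/get-website.js';
$url      = 'http://google.com/';
$response = shell_exec(buildPhantomCommand($script, $url)); // full stdout of the script

echo htmlspecialchars($response);
?>
```

On the JS side, get-website.js would then call page.open(require('system').args[1], ...) instead of opening a hard-coded URL.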
Original answer (note: file_get_contents fetches only the raw HTML; no JavaScript runs):
/get-website.php
<?php
$html = file_get_contents('http://google.com');
echo $html;
?>
test.html
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>on demo</title>
<style>
p {
color: red;
}
span {
color: blue;
}
</style>
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
</head>
<body>
<button id='click_me'>Click me</button>
<span style="display:none;"></span>
<script>
$( "#click_me" ).click(function () {
$.get("/get-website.php", function(data) {
var json = {
html: JSON.stringify(data),
delay: 1
};
alert(json.html);
});
});
</script>
</body>
</html>
Upvotes: 18
Reputation: 747
All the PHP runs before the information is sent to the client. All the JavaScript runs after the information is sent to the client.
To do something with PHP after the page loads, the page will need to either reload or make an asynchronous (Ajax) request.
Since the data appears to be in a different file than your PHP anyway, this is a pretty good solution. Since you tagged the question jQuery, I assume you're using it.
jQuery has a set of pages about how it implements Ajax, but the easiest way to use jQuery for this is $.post(). Example:
$.post( "http://example.com/myDataFile.txt", function( data ) {
//do more JavaScript stuff with the data you just retrieved
});
$.post(), as the name implies, can send data along with the request for the data file, so if that request is to, say, a PHP file, the PHP file can use that data. Example:
$.post( "http://example.com/myDataFile.txt",
{ foo: "bar", yabba: "dabba" },
function( data ) {
//do more JavaScript stuff with the data you just retrieved
});
The data is sent as key/value pairs (a plain object literal, similar to JSON).
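On the PHP side, those posted fields arrive in the $_POST superglobal. A minimal sketch of what the receiving PHP file might do with the pairs sent by the $.post() call above (the function name and JSON response shape are illustrative, not from the answer):

```php
<?php
// Hypothetical handler for the $.post() request above.
// With { foo: "bar", yabba: "dabba" } posted, $_POST contains those pairs.
function buildResponse(array $post): string
{
    return json_encode([
        'foo'   => $post['foo'] ?? null,   // null if the field was not sent
        'yabba' => $post['yabba'] ?? null,
    ]);
}

// In a real request handler you would do:
// header('Content-Type: application/json');
// echo buildResponse($_POST);

// Simulated here with the same pairs the example sends:
echo buildResponse(['foo' => 'bar', 'yabba' => 'dabba']); // prints {"foo":"bar","yabba":"dabba"}
?>
```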
Upvotes: -1