file_get_contents

I want to fetch and list 10,000 pages on screen using the PHP function file_get_contents,

to collect information for a database.

This works until about page 500; then the script stops (the page finishes loading) without displaying any error.

<?php

for ($nr = 1; $nr <= 10000; $nr++) {

    $url = "http://site.com/u$nr";
    $string = file_get_contents($url);

    echo '<textarea>' . $string . '</textarea>';

}

?>

Edit:

I want to process this information with JavaScript on my computer; JavaScript does a great job of manipulating the HTML that PHP fetches. Writing these pages to the database is a good idea.

+3




4 answers


I would suggest calling set_time_limit() inside the loop. Each call restarts the timeout counter, giving the script the specified number of seconds from that point onward:

<?php
for ($nr = 1; $nr <= 10000; $nr++) {
    $url = "http://site.com/u$nr";
    $string = file_get_contents($url);
    echo '<textarea>' . $string . '</textarea>';

    // restart the timeout counter: each iteration now gets
    // this many seconds to complete
    set_time_limit(5);
}
?>



This way you won't need to compute a "total" time limit for 10,000 records, and no single request is allowed to take (for example) an hour to complete.
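A related, complementary idea (not from this answer, just a sketch): give each request its own socket timeout via a stream context, so one slow page fails fast instead of eating the whole time budget. The helper name and the 5-second default are assumptions.

```php
<?php
// Sketch: wrap file_get_contents with a per-request socket timeout
// so a single hung page fails quickly. Tune the timeout to your pages.
function fetch_with_timeout(string $url, int $seconds = 5)
{
    $context = stream_context_create([
        'http' => ['timeout' => $seconds],
    ]);

    // Returns the page body, or false on failure/timeout.
    return @file_get_contents($url, false, $context);
}
```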

[update] New information posted by the OP states that he wants to handle the fetched data with JavaScript. In that case the best solution would be to fetch the pages with jQuery in the browser, process the information there, and send the results to the backend with an Ajax request so it can store them in the database.
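A minimal sketch of the storage side of that approach, assuming a SQLite PDO connection and a `pages` table (both hypothetical): the browser would POST the page number and HTML via Ajax, and the backend endpoint would call something like this.

```php
<?php
// Sketch: persist one fetched page. `pages` and the SQLite-specific
// INSERT OR REPLACE are assumptions -- adapt to your schema/driver.
function store_page(PDO $db, int $nr, string $html): void
{
    $db->exec('CREATE TABLE IF NOT EXISTS pages (nr INTEGER PRIMARY KEY, html TEXT)');
    $stmt = $db->prepare('INSERT OR REPLACE INTO pages (nr, html) VALUES (?, ?)');
    $stmt->execute([$nr, $html]);
}
```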

+2




Check your error logs. The script probably timed out (the default limit is 30 seconds). Try setting a higher timeout using http://php.net/manual/en/function.set-time-limit.php



set_time_limit(120); // Script may run for 120 seconds before timing out

+1




Your PHP script was terminated. The default maximum execution time is 30 seconds. Use this function:

set_time_limit()

By the way, loading 10,000 pages will take quite a long time (time + bandwidth + resources) and may fail partway through.
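One way to limit the damage from a partial failure (a sketch, not part of this answer) is to process the pages in smaller batches per run and remember where the last run stopped. The helper name, batch size, and totals below are assumptions.

```php
<?php
// Sketch: compute the next range of pages to fetch, given the last
// page already finished. Pure logic -- persistence is up to you.
function next_batch(int $last, int $batchSize, int $total): array
{
    $start = $last + 1;
    $end = min($start + $batchSize - 1, $total);
    return [$start, $end];
}

// Usage: read $last from a progress file or the DB, then
//   [$start, $end] = next_batch($last, 500, 10000);
//   for ($nr = $start; $nr <= $end; $nr++) { /* fetch and store u$nr */ }
// and persist $end afterwards so the next run resumes there.
```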

0




You are probably running out of memory or execution time.

Check phpinfo() and verify the relevant settings: memory_limit should be higher than 256M, and max_execution_time may need to be more than 30 minutes if you intend to use your server for this kind of operation.

Also make sure that you log your errors and, in a development environment, display them: display_errors = 1, error_reporting(E_ALL), log_errors = 1.

Fetching 10,000 pages is a resource-intensive process; you may want to store the information in a database instead of echoing it.
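Those error settings can be applied from the script itself during development, so a failed fetch surfaces instead of dying silently. A sketch (the missing-file path stands in for a failed HTTP request):

```php
<?php
// Sketch: enable full error reporting (development only!) and check
// file_get_contents' return value on every fetch.
error_reporting(E_ALL);
ini_set('display_errors', '1');
ini_set('log_errors', '1');

// A failed fetch returns false; '/no/such/file' stands in for a
// request that failed or timed out.
$string = @file_get_contents('/no/such/file');
if ($string === false) {
    error_log('fetch failed; see the error log for the PHP warning');
}
```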

Edit: As others said, set_time_limit() is the per-script counterpart of the max_execution_time setting; use it if you don't want to change the server-wide configuration.

0








