PHP - Refresh Or Timeout On Curl
I asked some questions here earlier, and the answers confirmed a few things, but it all comes back to my original confusion.
The problem seems to start when I hit "refresh" (when the code changes I usually hit refresh) or when there is no activity for a while (meaning no login check and no other requests to site B).
When it happens, I am always prompted to log in again, which means the check is failing. In the second case it may be caused by a timeout setting, but I have no idea why it happens on a simple refresh.
Does anyone have an idea of what to check in this situation?
Thanks in advance,
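
One thing worth checking first, as a hedged sketch: if the login check calls site B with cURL, site B's session cookie has to be persisted between calls, otherwise every new request (including one triggered by a refresh) looks like a fresh, logged-out visitor. The URL and cookie path below are placeholders.
Code:
<?php
// A minimal sketch, assuming the check hits site B via cURL. Without a
// cookie jar, every call starts a brand-new session on site B, so the
// "am I logged in?" check fails. Paths and URL are hypothetical.
$cookieJar = '/tmp/siteb_cookies.txt'; // must persist between requests

$ch = curl_init('http://siteb.example/check_login.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieJar); // send stored cookies
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieJar);  // save any new ones on close
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
$response = curl_exec($ch);
curl_close($ch);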
Similar Tutorials

Hi, I'm currently writing a script that basically downloads videos from a specific page. I am downloading with cURL; however, with some files, they're so large that cURL times out. This causes either a) PHP to time out, b) PHP to run out of memory, or c) cURL to stop once the defined timeout limit is reached. This means that some files are only partially downloaded, as some files are over 100mb and some are only 20mb. I have
Code:
set_time_limit(0);
and
Code:
ini_set("memory_limit","500M");
set, but is there a way to make it so that PHP will not time out and the cURL session will not time out until the file is downloaded?
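
A possible approach for the download question above, hedged as a sketch: disable both time limits and stream the response straight to disk with CURLOPT_FILE, so the whole video never has to fit in PHP's memory (which also makes the 500M memory limit unnecessary). The source URL and target path are placeholders.
Code:
<?php
set_time_limit(0); // no PHP execution limit for this script

$source = 'http://example.com/videos/big-video.flv'; // hypothetical
$target = fopen('/tmp/big-video.flv', 'w');

$ch = curl_init($source);
curl_setopt($ch, CURLOPT_FILE, $target);       // write body directly to the file handle
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // still fail fast if the server is unreachable
curl_setopt($ch, CURLOPT_TIMEOUT, 0);          // 0 = no overall transfer time limit
curl_exec($ch);
curl_close($ch);
fclose($target);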
name="description"></textarea></td> </tr> <tr> <td class="field field_top" style="color:#f1ca63; font-family: Arial; font-size: 14px" > <br/> Your e-mail <font color="red">*</font>: </td> <td class="input field_top" style="color: #f1ca63; font-family: Arial; font-size: 14px" > <br/> <input type="text" name="uploader_email" id="email_input"></input> </td> </tr> <tr> <td colspan="2" class="submit" align="center"> <br /> <input type="submit" name="upload" value="Upload" /> </td> </tr> </table> </form> <hr width=600 color="grey"> </div> good day dear community, i am workin on a Curl loop to fetch multiple pages: i have some examples - and a question: Example: If we want to get information from 3 sites with CURL we can do it like so: $list[1] = "http://www.example1.com"; $list[2] = "ftp://example.com"; $list[3] = "http://www.example2.com"; After creating the list of links we should initialize the cURL multi handle and adding the cURL handles. $curlHandle = curl_multi_init(); for ($i = 1;$i <= 3; $i++) $curl[$i] = addHandle($curlHandle,$list[$i]); Now we should execute the cURL multi handle retrive the content from the sub handles that we added to the cURL multi handle. ExecHandle($curlHandle); for ($i = 1;$i <= 3; $i++) { $text[$i] = curl_multi_getcontent ($curl[$i]); echo $text[$i]; } In the end we should release the handles from the cURL multi handle by calling curl_multi_remove_handle and close the cURL multi handle! If we want to another Fetch of sites with cURL-Multi - since this is the most pretty way to do it! Well I am not sure bout the string concatenation. How to do it - Note I want to fetch several hundred pages: see the some details for this target-server sites - /(I have to create a loop over several hundred sites). * siteone.example/?show_subsite=9009 * siteone.example/?show_subsite=9742 * siteone.example/?show_subsite=9871 .... and so on and so forth Question: How to appy this loop into the array of the curl-multi? <?php /************************************\ * Multi interface in PHP with curl * * Requires PHP 5.0, Apache 2.0 and * * Curl * ************************************* * Writen By Cyborg 19671897 * * Bugfixed by Jeremy Ellman * \***********************************/ $urls = array( "siteone", "sitetwo", "sitethree" ); $mh = curl_multi_init(); foreach ($urls as $i => $url) { $conn[$i]=curl_init($url); curl_setopt($conn[$i],CURLOPT_RETURNTRANSFER,1);//return data as string curl_setopt($conn[$i],CURLOPT_FOLLOWLOCATION,1);//follow redirects curl_setopt($conn[$i],CURLOPT_MAXREDIRS,2);//maximum redirects curl_setopt($conn[$i],CURLOPT_CONNECTTIMEOUT,10);//timeout curl_multi_add_handle ($mh,$conn[$i]); } do { $n=curl_multi_exec($mh,$active); } while ($active); foreach ($urls as $i => $url) { $res[$i]=curl_multi_getcontent($conn[$i]); curl_multi_remove_handle($mh,$conn[$i]); curl_close($conn[$i]); } curl_multi_close($mh); print_r($res); ?> I look forward to your ideas. Hi all, Thanks for reading. I'm running a script using jQuery that auto-refreshes a <div> on the index page from an external PHP script to get all the rows in a database and display them on the index page. 
Good day, dear community. I am working on a cURL loop to fetch multiple pages. I have some examples and a question.

Example: if we want to get information from 3 sites with cURL, we can do it like so:
Code:
$list[1] = "http://www.example1.com";
$list[2] = "ftp://example.com";
$list[3] = "http://www.example2.com";

After creating the list of links, we should initialize the cURL multi handle and add the cURL handles:
Code:
$curlHandle = curl_multi_init();
for ($i = 1; $i <= 3; $i++)
    $curl[$i] = addHandle($curlHandle, $list[$i]);

Now we should execute the cURL multi handle and retrieve the content from the sub-handles that we added to it:
Code:
ExecHandle($curlHandle);
for ($i = 1; $i <= 3; $i++) {
    $text[$i] = curl_multi_getcontent($curl[$i]);
    echo $text[$i];
}

In the end we should release the handles from the cURL multi handle by calling curl_multi_remove_handle, and close the cURL multi handle. Now I want to do another fetch of sites with cURL multi, since this is the nicest way to do it. But I am not sure about the string concatenation - how do I do it? Note that I want to fetch several hundred pages from one target server, so I have to create a loop over several hundred sites like these:

* siteone.example/?show_subsite=9009
* siteone.example/?show_subsite=9742
* siteone.example/?show_subsite=9871
...and so on and so forth

Question: how do I apply this loop to the URL array of the curl-multi script?
Code:
<?php
/************************************\
 * Multi interface in PHP with curl  *
 * Requires PHP 5.0, Apache 2.0 and  *
 * Curl                              *
 *************************************
 * Written by Cyborg 19671897        *
 * Bugfixed by Jeremy Ellman         *
\************************************/
$urls = array(
    "siteone",
    "sitetwo",
    "sitethree"
);

$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, 1);  // return data as string
    curl_setopt($conn[$i], CURLOPT_FOLLOWLOCATION, 1);  // follow redirects
    curl_setopt($conn[$i], CURLOPT_MAXREDIRS, 2);       // maximum redirects
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 10); // timeout
    curl_multi_add_handle($mh, $conn[$i]);
}

do {
    $n = curl_multi_exec($mh, $active);
} while ($active);

foreach ($urls as $i => $url) {
    $res[$i] = curl_multi_getcontent($conn[$i]);
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}

curl_multi_close($mh);
print_r($res);
?>

I look forward to your ideas.
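
A sketch answering the concatenation question: build the $urls array for the multi script by concatenating each subsite ID onto the base URL. The IDs below are the ones from the example; in practice they would come from wherever the several hundred IDs are stored.
Code:
<?php
$subsiteIds = array(9009, 9742, 9871 /* ... several hundred more ... */);

$urls = array();
foreach ($subsiteIds as $id) {
    $urls[] = 'http://siteone.example/?show_subsite=' . $id;
}
// $urls can now be fed straight into the foreach loop of the
// curl_multi script above, in place of the hard-coded array.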
Hi all, thanks for reading. I'm running a script that uses jQuery to auto-refresh a <div> on the index page from an external PHP script, which gets all the rows in a database and displays them on the index page. The script works great - here it is:
Code:
<script type="text/javascript">
$(document).ready(function() {
    $("#responsecontainer").fadeOut("fast").load("getrows.php").fadeIn("slow");
    var refreshId = setInterval(function() {
        $("#responsecontainer").fadeOut("fast").load('getrows.php').fadeIn("slow");
    }, 5000);
    $.ajaxSetup({ cache: false });
});
</script>

The getrows.php script I'm working with looks like this:
Code:
<?php
$rowsQuery = mysql_query("SELECT * FROM Happenings WHERE HappeningDate='$today'");
if (mysql_num_rows($rowsQuery) == 0) {
    $happeningsToday = "There are no happenings today.";
} else {
    $allHappeningsToday = 1;
    while ($getHappeningsToday = mysql_fetch_array($rowsQuery)) {
        $happeningName = stripslashes($getHappeningsToday['HappeningName']);
        $happeningDate = $getHappeningsToday['HappeningDate'];
        $happeningDescription = $getHappeningsToday['HappeningDescription'];
        if ($allHappeningsToday == 1) {
            $happeningsToday .= "
                <div class=\"box\">
                <p>".$happeningName." | ".$happeningDate." | ".$happeningDescription."
                </div>";
            $allHappeningsToday = 2;
        } else {
            $happeningsToday .= "
                <div class=\"box\">
                <p>".$happeningName." | ".$happeningDate." | ".$happeningDescription."
                </div>";
            $allHappeningsToday = 1;
        }
    }
}
echo $happeningsToday;
?>

This script works great as well. Currently the auto-refresh jQuery script, as you can see, fetches and fades in/out all of the rows. Building on the getrows.php script above, is there a way to fetch only the rows created since the last refresh, and fade only those in and out, while leaving the rows that were already loaded in place? Any ideas, thoughts or suggestions would be unbelievably helpful. Thank you very much.

I am trying to transfer a large amount of data from a text file exported from a Microsoft Access database into a MySQL database. I have written a script in PHP to open the text file, read a line at a time, sort out the data for insertion into a new table, and then insert the data into the table in the MySQL database. This works perfectly for 3 or 100+ records, so the fundamentals appear to be solid. The problem starts when I try to do all the records - about 33,500. I get:
Code:
Fatal error: Maximum execution time of 30 seconds exceeded in C:\DATA\HTMLServedDocs\MusicIndex\MusicImport.php on line 40.
Line 40 is the database write:
Code:
$result = mysqli_query($dbc, $query);
I'm not sure if the root of the error is the web server, the database, or something else. I am using the Abyss server, which, while a bit clunky, is appropriate for the application. Does anyone have any ideas how to get around this problem? I could split the data up into smaller files, but that is not ideal.

So I have this script that connects to a game database; it must select some columns from the database and build a top list from them. The thing is, if the database is offline, the script pretty much blocks the website because it has no timeout option. How can I add something like a $timeout = 2 (in seconds) setting, so that if it gets no connection after 2 seconds the script is killed? Thanks in advance.
Code:
<?php
$database_info = array(
    'host'     => '___',
    'user'     => '___',
    'password' => '___',
    'l2jdb'    => '___'
);

@mysql_connect($database_info['host'], $database_info['user'], $database_info['password']) or die(mysql_error());
@mysql_select_db($database_info['l2jdb']) or die(mysql_error());

echo "<table class=\"pvp\">";

$query = "SELECT * FROM characters WHERE accesslevel = 0 ORDER BY pvpkills DESC limit 10"; // change 10 to whatever amount you want to show on your website
$result = mysql_query($query) or die(mysql_error());

while ($row = mysql_fetch_array($result)) {
    $name = $row['char_name'];
    $pvpscore = $row['pvpkills'];
    // Inserting data into the table
    echo "
        <tr>
        <td width=\"110\"><strong> $name</strong></td>
        <td width=\"50\"><div align=\"center\">$pvpscore</div></td>
        </tr>";
}
echo "</table>";

// closing mysql connection
mysql_close();
?>
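
A hedged sketch of the timeout idea: the old mysql_connect() has no per-call timeout parameter, but mysqli exposes one, so switching the connection code over gives exactly the "give up after 2 seconds" behaviour. Credentials below are placeholders.
Code:
<?php
// Sketch using mysqli's connect-timeout option. If the game database is
// offline, the attempt fails after $timeout seconds instead of hanging
// the whole page.
$timeout = 2; // seconds

$db = mysqli_init();
mysqli_options($db, MYSQLI_OPT_CONNECT_TIMEOUT, $timeout);

if (!@mysqli_real_connect($db, '___', '___', '___', '___')) {
    die('Database is offline - skipping the PvP top list.');
}
// ...run the top-10 query here as before, then mysqli_close($db);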
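
Going back to the Access-import question above: one hedged workaround, where the host allows it, is to reset the execution clock as the import progresses, because set_time_limit() restarts the 30-second counter each time it is called. The file name and the parsing step are placeholders for the existing script's logic.
Code:
<?php
// Sketch: re-arm the execution timer every 500 rows so a 33,500-row
// import never trips the 30-second limit in one stretch.
set_time_limit(30);
$handle = fopen('MusicExport.txt', 'r'); // hypothetical export file
$count = 0;

while (($line = fgets($handle)) !== false) {
    // ...sort out $line and build $query as the script already does...
    // $result = mysqli_query($dbc, $query);

    if (++$count % 500 == 0) {
        set_time_limit(30); // restart the 30-second counter
    }
}
fclose($handle);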
Hi, I am trying to create a "user logged" function that tells the database when the user was logged in. I have heard about creating a timeout for this, but I don't have any idea how to do it or what it does! Anyone have any ideas? Thanks, Rhys

I've been getting this error: Fatal error: Maximum execution time of 30 seconds exceeded in... (gives file and line number). I've cut out most of the code in the for loop to narrow it down specifically to the $i variable messing it up somehow. The 2nd part of the if statement, where I try to append $i to the new $row_ variable, is causing it to hang like this. I want to be able to make 5 new variables - $row_1, $row_2, etc. - but for some reason when I try to append $i to the name, it hangs and eventually throws that error. Does anyone know why it does this? Any help is appreciated. Thank you!
Code:
for ($i = 1; $i < 6; $i++) {
    if ($row['bef_remarks' . $i] != "Description") {
        $row_.$i = 1;

I am using this code to check if a server is running. Everything is fine when it is running, but when it's not, it takes ages for fsockopen to fail. Is there a way to make it time out faster?
Code:
if (fsockopen($settings->survival_server, $settings->survival_port, $timeout = 0.1)) {
    $survival['status'] = "Up";
} else {
    $survival['status'] = "Down";
}
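
A sketch of a fix for the fsockopen question: the third parameter is $errno, not a timeout, so the "$timeout = 0.1" above is silently being passed as the error-number argument and the default timeout still applies. The timeout (in seconds, floats allowed) is the fifth parameter:
Code:
<?php
// fsockopen($hostname, $port, &$errno, &$errstr, $timeout)
$conn = @fsockopen($settings->survival_server, $settings->survival_port,
                   $errno, $errstr, 0.5); // give up after half a second

if ($conn) {
    $survival['status'] = "Up";
    fclose($conn);
} else {
    $survival['status'] = "Down";
}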
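
And for the $row_.$i question above: a likely explanation is that PHP parses "$row_.$i = 1;" as "$row_ . ($i = 1)", so $i is reset to 1 on every pass, the for loop never finishes, and the 30-second limit is hit. A variable variable (or, better, an array) does what was intended. $row here is stub data standing in for the poster's real row:
Code:
<?php
$row = array('bef_remarks1' => 'foo', 'bef_remarks2' => 'Description',
             'bef_remarks3' => 'bar', 'bef_remarks4' => 'baz',
             'bef_remarks5' => 'qux');

for ($i = 1; $i < 6; $i++) {
    if ($row['bef_remarks' . $i] != "Description") {
        ${'row_' . $i} = 1;  // creates $row_1 ... $row_5
        // or, more idiomatically: $rows[$i] = 1;
    }
}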
Hey, I was wondering: is there a way I can set my sessions to time out / end a session after a certain amount of time? Here is what I am dealing with...
Code:
<?php
session_start();
$username = $_SESSION['username'];
$userid = $_SESSION['userid'];
?>

Okay, I'm not quite sure how to state this, so I'll try my best. Is it possible to set up some kind of query or function where, after so long, it changes the state of something from 'fighting' to 'active'? What is happening is that people are fighting in my game and then choosing to just close their browser rather than quit the fight, which would change the monster's state back to 'active' so other people can fight it. Does that make sense? Any thoughts, or could someone point me in the right direction? Thanks!!

Ok, I know the default session timeout is 25 minutes if the user is idle, but will the session still time out if they are on a constantly refreshing page? For example, if the page they are on refreshes every minute, will they still time out after 25 minutes?

I have a site up where my client can enter some text in a box and save it to a database, but the page keeps timing out. I can't get it to happen in Firefox or Chrome, so that leads me to believe it's a setting in IE? This is what I have tried:
Code:
set_time_limit(0)
which I understand sets the time limit for the script to unlimited.
Code:
ini_set("session.use_cookies", 1);
ini_set("session.cookie_lifetime", 0);
Turns cookies on, and makes them last forever.
Code:
session_set_cookie_params(3600);
Makes cookies last for an hour. I have tried all of these, and I have tried them in pieces. [EDIT] I do place all of them before session_start. I have read that some of this can go into the php.ini file, but I cannot figure out exactly what I am supposed to add. Our current php.ini is the default:
Code:
register_globals = off
allow_url_fopen = off
expose_php = Off
max_input_time = 60
variables_order = "EGPCS"
extension_dir = ./
upload_tmp_dir = /tmp
precision = 12
SMTP = relay-hosting.secureserver.net
url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=,fieldset="

; Only uncomment zend optimizer lines if your application requires Zend Optimizer support
;[Zend]
;zend_optimizer.optimization_level=15
;zend_extension_manager.optimizer=/usr/local/Zend/lib/Optimizer-3.3.3
;zend_extension_manager.optimizer_ts=/usr/local/Zend/lib/Optimizer_TS-3.3.3
;zend_extension=/usr/local/Zend/lib/Optimizer-3.3.3/ZendExtensionManager.so
;zend_extension_ts=/usr/local/Zend/lib/Optimizer_TS-3.3.3/ZendExtensionManager_TS.so

; -- Be very careful to not to disable a function which might be needed!
; -- Uncomment the following lines to increase the security of your PHP site.
;disable_functions = "highlight_file,ini_alter,ini_restore,openlog,passthru,
;                     phpinfo, exec, system, dl, fsockopen, set_time_limit,
;                     popen, proc_open, proc_nice,shell_exec,show_source,symlink"
Where do I go from here? What is my next step? Thanks for taking the time to read my post, and thanks for the help! Nick

I was wanting to make some kind of session timeout ability, purely out of curiosity. I have tried going off my own theory before looking at any tutorials - well, I have looked a bit, but I tried getting the gist and having a go myself the next day. Anyway, this code here:
Code:
<?php
ini_set('display_errors', 1);
session_name("jeremysmith_test_session");
session_start();

$_SESSION['start_time'] = time();

if ($_SESSION['start_time'] < $_SESSION['start_time'] + 500) {
    printf("Time out be out dated!");
}
just does not seem to want to work. Is there any reason for this that you could think of? I appreciate any replies, Jeremy.
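
For both session-timeout questions above, a hedged sketch of a working expiry check: the snippet just shown overwrites start_time on every request and then tests it against itself plus 500, which is always true. Storing the timestamp of the last request and comparing it to the current time does work:
Code:
<?php
session_start();

$timeout = 500; // seconds of allowed inactivity

if (isset($_SESSION['last_activity'])
        && (time() - $_SESSION['last_activity']) > $timeout) {
    session_unset();    // clear the session data
    session_destroy();  // and the session itself
    die("Session timed out - please log in again.");
}
$_SESSION['last_activity'] = time(); // re-arm the clock on every request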
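
And going back to the 'fighting'-to-'active' question: a hedged sketch of the usual approach is to record when a fight last saw activity and let a query flip stale fights back, so a closed browser simply stops refreshing the timestamp. Table and column names here are hypothetical; the query could run from a cron job or at the top of any page that lists monsters.
Code:
<?php
// Sketch: any monster stuck in 'fighting' with no action for 10 minutes
// is returned to 'active'. Each attack in a fight should also run an
// "UPDATE monsters SET last_action = NOW() ..." to keep live fights alive.
$query = "UPDATE monsters
             SET state = 'active'
           WHERE state = 'fighting'
             AND last_action < NOW() - INTERVAL 10 MINUTE";
mysql_query($query) or die(mysql_error());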
Hi, I'm trying to keep a session open for a large amount of time (months) once a user logs in. I have set the following in a .htaccess file:
Code:
php_value session.cookie_lifetime 99999999
php_value session.gc_maxlifetime 99999999
But I still lose the session and get logged out after 10 minutes or so. I have created a phpinfo page and see the following values set:
Code:
session.cookie_lifetime   99999999 (local)   0 (global)
session.gc_maxlifetime    99999999 (local)   1440 (global)
What is the difference between the local and global values? Any other ideas why my session keeps timing out after 10 minutes or so of inactivity?

Hi, I have a heavy PHP page that is timing out at 30 seconds. I do not have access to change/set any ini values, max_execution_time, or the like, so I need to find some creative workaround. First I looked at async calls, but I'm using Yii 1.1 at the moment and it didn't look simple to implement when I had a quick glance at that. Is it possible to load the content via ajax instead?

Hi there. How do I stop a function from running if it's taking too long to process? For example, say that after 10 seconds the code is still working and I want to display an error message. As you can see in the following code, it's pulling the contents from a URL or file, which might take too long to load:
Code:
function Tester($url)
{
    return file_get_contents($url);
}
Thanks a mil

Hi, I have created a script that connects to an FTP server, downloads a movie file to the web server, then uploads it to a target FTP server. If I move a small file of, say, 10mb it works well! But if I move a larger movie file, such as a 700mb file, it doesn't... The only thing I can think is that there is some sort of timeout. Any pointers would be a massive help! I have tried to increase the FTP timeout and have also enabled passive mode:
Code:
ftp_set_option($conn2, FTP_TIMEOUT_SEC, 600);
Code:
<?php
$movefile = "movie 2.avi";

$server1 = array(
    "Name" => "Downloads",
    "Host" => "10.0.1.3",
    "User" => "Paulio",
    "Pass" => "lol",
    "Path" => "/Download/Completed");

$server2 = array(
    "Name" => "files",
    "Host" => "10.0.1.2",
    "User" => "Paulio",
    "Pass" => "lol",
    "Path" => "/Television");

////////////////////////////////// Download file to be moved. //////////////////////////////////

// Connect to server
$conn1 = ftp_connect($server1['Host']);

// Open a session to an external ftp site
$login1 = ftp_login($conn1, $server1['User'], $server1['Pass']);
ftp_pasv($conn1, true);

// Check open
if ((!$conn1) || (!$login1)) {
    echo "Ftp-connect failed!";
    die;
} else {
    echo "Connected to " . $server1['Name'] . " FTP server.<br><br>";
}

ftp_set_option($conn1, FTP_TIMEOUT_SEC, 600);

ftp_chdir($conn1, $server1["Path"]);

// Moves file to be moved to web NAS drive.
ftp_get($conn1, $movefile, $movefile, FTP_BINARY);

////////////////////////////////// Upload file to be moved. //////////////////////////////////

// Connect to server
$conn2 = ftp_connect($server2['Host']);

// Open a session to an external ftp site
$login2 = ftp_login($conn2, $server2['User'], $server2['Pass']);
ftp_pasv($conn2, true);

// Check open
if ((!$conn2) || (!$login2)) {
    echo "Ftp-connect failed!";
    die;
} else {
    echo "Connected to " . $server2['Name'] . " FTP server.<br><br>";
}

ftp_set_option($conn2, FTP_TIMEOUT_SEC, 600);

ftp_chdir($conn2, $server2["Path"]);

// Uploads moved file from web NAS drive to destination
ftp_put($conn2, $movefile, $movefile, FTP_BINARY);

// Deletes source file
ftp_delete($conn1, $movefile);

// Deletes temp file
unlink($movefile);

ftp_close($conn1);
ftp_close($conn2);

echo "Complete.";
?>
Thanks, Paul.

I have a page that you click through to from your email to validate your account. Whenever you click that link and it goes to this page, the server connection times out. What is in this code that would make it time out? (It does not give an error; it just says connecting... and then times out.)
Code:
session_start();
include "../incl/connectdb.php";

$key1 = $_GET['id'];
$key2 = $_GET['id2'];

$query = "select * from users where passkey = '$key1' and pass2 = '$key2' and activation = 'pending' LIMIT 1";
$result = mysql_query($query) or die(mysql_error());
$row = mysql_fetch_array($result);

if (mysql_num_rows($result) < 1) {
    $_SESSION['message'] = "Invalid link";
    header("Location: ../");
    exit();
}

$query = "update table-name set activation = 'active' where id = '".$row['id']."' ";
mysql_query($query);

$_SESSION['message'] = "Account validated.";

As the title says, I have a .txt file with about 30,000+ lines of data presented in a pipe-delimited list, which I'm parsing and inserting into my database. The problem is, my server seems to time out every time I try to parse the whole file at once. I'm sure it would naturally work without any timeout errors, but I'm ensuring the data is XSS-clean before it's inserted, and I'm doing that on about 15 items on each line, which means I'm calling the XSS-clean function over 450,000 times in one execution. So my friend suggested I break the file down into files of maybe 5,000 lines each, which would give me about 6 files (for 30,000 lines of data). I've managed to code a script that breaks the main file into several files. Now what I want to do is pass each of those files to my parser method, but I'd like to do them one by one rather than in one execution, as I want to avoid the timeout error. Any ideas?
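
A hedged sketch of the one-chunk-per-request idea for the import question above: let each request parse a single chunk file and then redirect to itself for the next one, so no single execution has to survive all 30,000 lines. The file naming, script name and parser call are assumptions standing in for the existing code.
Code:
<?php
// Sketch: process chunk files one per request. Each redirect starts a
// fresh execution with a fresh time limit.
$chunk = isset($_GET['chunk']) ? (int)$_GET['chunk'] : 1;
$file  = "chunks/data_part{$chunk}.txt"; // hypothetical chunk naming

if (!file_exists($file)) {
    die("All chunks imported.");
}

parseAndInsert($file); // stands in for the existing parser method

// Hand the next chunk to a brand-new request (and a new time budget).
header("Location: import.php?chunk=" . ($chunk + 1));
exit();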
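
And for the earlier question about stopping a slow file_get_contents: a hedged sketch using a stream context, which lets the call itself give up after a set number of seconds instead of running until the script dies:
Code:
<?php
function Tester($url)
{
    // The 'timeout' option caps how long file_get_contents() waits.
    $context = stream_context_create(array(
        'http' => array('timeout' => 10) // seconds
    ));
    $contents = @file_get_contents($url, false, $context);

    if ($contents === false) {
        return "Error: the request took too long or failed.";
    }
    return $contents;
}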