PHP - Code Not Executed When File Too Large
When I try to upload a file larger than the server's max limit, the following code is not executed. How am I supposed to inform the user that their file is too large? NOTE: I've stripped the code down for this post.
Code: [Select] <?php if(isset($_POST['submit'])) { echo "test.."; } ?> <html> <head> <title>Upload Test</title> </head> <body> <form action='' enctype='multipart/form-data' method='POST'> <input type='file' name='file_upload' /> <input type='submit' name='submit' value='upload' /> </form> </body> </html> Similar TutorialsHello, I create tracking code file named "get_t.php". I call it using javascript: <script language='javascript' type='text/javascript'> var r=document.referrer; var counter = new Object(); counter.src = 'http://MyWebSite.com/get_t.php?aid=55&lid=F29&r='+r; </script><script src=http://MyWebSite.com/get_t.php?aid=55&lid=F29> </script> In this file all I do is to print "Hello World". Now, everything works fine most of the time. but sometimes the file does not executed for some reason, and empty page is displayed without showing message "Hello World". Please advice. Thanks, Hi, we are trying to make a marketing wheel. This means we got a database with lots of persons in it each with a individual email adres. The simel example of how the basic database looks is seen in the picture below. This table is called "Clicks" As you might notice the field "Aan" is either a 0 or a 1. This is how it works, when someone fills in a form on the website asking for more information the form with all the personal information of this person is send to a random email adres found in this table. When this happens the field "Aan" changes from 0 to 1 this way we can make sure that the same person does not get a email again until everyone else has been used 1 time. So in other words until that the field "Aan" is value 1 for every person in the table. As you might have guessed this is sadly enough not working ... this is our code : Code: [Select] include_once('connection.php'); $data = array(); $i = 0; $result1 = mysql_query("SELECT * FROM `Clicks` WHERE Aan = 0"); while ($list1 = mysql_fetch_array($result1,MYSQL_ASSOC)) { if (empty($list1) ){ mysql_query("UPDATE `Clicks` SET `Aan`= 0 WHERE `Aan` = 1"); } $result = mysql_query("SELECT * FROM `Clicks` WHERE Aan = 0 ORDER BY RAND() LIMIT 1"); while ($list = mysql_fetch_array($result,MYSQL_ASSOC)) { foreach ($list as $key => $value) { // Met htmlentities() voorkom je dat html wordt uitgevoert. $value = htmlentities($value); $data[$i][$keyl] = $value; } $i++; } } mysql_free_result($result); $data = array_reverse($data); // Zet de nieuwste berichten bovenaan // in plaats van onderaan. $cnt = count($data); for($i = 0; $i < $cnt; ++$i) { $bericht = $data[$i]; echo(nl2br($bericht['bericht'])); // Met nl2br() worden alle enters in het // bericht omgezet naar <br/>. echo('</td></tr></table>'); $naarwie= $bericht['Email']; }mysql_query("UPDATE `Clicks` SET `Aan`= 1 WHERE `Email` = $naarwie"); mysql_close($Verbinding); // Sluit de verbinding. // We hebben hem nu niet meer nodig. So this code in short explained would be. It checks if all the "Aan" fields are on 0 if the list is empty so it means they are all on 1 he will reset all the fields of every user to 0. and then he will take a random person out and send the email to that person. And after the mail is send will change that person his value on "Aan" to value 1. Now if the list is not empty, that means there are still persons with value 0 in the list. Then he will take a random user out of all the persons who have "Aan" with value 0 send the mail to that user and change his "Aan" value to 1. This process will be repeated every time someone asks for information until all values of all users are on 1. And then again he will change them to 0 for everyone. I hope its clear what we are looking for. And I hope someone can help us out. We are clueless... Hi guys, I am currently receiving a large text file ( > 500mb), once per week which I have been manually splitting then processing to obtain the required CSV files. However, this is taking in the region of 2 to 3 hours. Very soon, these files will be sent daily and I really dont have the time to split and process this everyday I have been playing for a while to try and parse everything properly/automatically with fopen, feof and fgets ( and other 'f' options), but the script never seems to read the file all the way to the end - I assume this is due to memory usage. The data received in the file follows a strict pattern throughout the file which is: Code: [Select] BSNY990141112271112270100000 POO2C35 122354000 DMUS 075 O BX NTY LOLANCSTR 1132 11322 TB LIMORCMSJ 1135 00000000 LICRNFNJN 1140 00000000 H LICRNF 1141H1142H 11421142 T LISDAL 1147H1148H 11481148 T LIARNSIDE 1152H1153 11531153 T LIGOVS 1158 1159 11581159 T LIKTBK 1202 1202H 12021202 T LICARK 1206 1207 12061207 T LIULVRSTN 1214H1215H 12151215 T LIDALTON 1223 1223H 12231223 T LIDALTONJ 1225 00000000 LIROOSE 1229 1229H 12291229 T 2 LTBAROW 1237 12391 TF That is just one record of informaton (1 of around 140,000 records), each record has no fixed amount of lines but each line in each record is fixed to 80 characters and all lines in each record need to have the same unique 'id', at present, Im using an md5 hash of microtime. The first line of every record starts with 'BS' and the last line of each record starts with 'LT' terminating with 'TF'. All the other stuff between also follows a certain pattern of which I can break down effectively. The record above show one train service schedule, hence why each line in each record needs the same unique id. Anyone got any ideas on how I could process such a file effectively?? Many thanks Dave Hello, Im trying to find a way to check around 500-600 links to check if they are alive. It works fine for 5-6 links but once i add more links it just times out. Is there a way i could process this so it does 1 link at a time or somthing ? <?php include("config.php"); $query = "SELECT * FROM `games` WHERE `r_fileserve` <> \"\" LIMIT 500"; $result = mysql_query($query); while($row=mysql_fetch_assoc($result)) { $link_str = file_get_contents("$row[r_fileserve]"); $pattern = '<input type="hidden" name="download" value="normal"/>'; preg_match($pattern,$link_str,$match); if ($match[0] != null) { echo "Working <br />"; } else { echo "File Down <br />"; } } ?> Hello, I am working on a project that downloads large zip files from server, for small files the script works well and downlaod files successfully, but for larger files like currently we are trying to download a 922MB file it gives us this message (in firefox) and doesn't download any thing. " File not found Firefox can't find the file at http://www.domainname.com/abc.zip " Script to download the file is as below: " $filename = "xyz.mp3; header("Pragma: public"); header("Expires: 0"); header("Cache-Control: must-revalidate, post-check=0, pre-check=0"); header("Content-Type: application/force-download"); header("Content-Type: application/octet-stream"); header("Content-Type: application/download"); header("Content-Disposition: attachment; filename=".basename($filename).";"); header("Content-Transfer-Encoding: binary"); header("Content-Length: ".filesize($filename)); if( !ini_get('safe_mode') ) set_time_limit(360000000); readfile("$filename"); " Please advise what can be issue, if its file size issue then how and where can we increase the limit to solve this issue. pre-thanks, Hello. My script is set to upload files upto 5GB large. For that script I've currently set memory_limit to 5GB. Is it alright? I mean what is the ideal value (for large upload scripts) If you feel, 5GB is large. I can make script to upload 2GB files and set memory_limit accordingly. Also, max_execution_time has been set by me to 86400 currently. Assuming, on a 500Kbps broadband, it would require upto 24 hours to upload a 3-5GB file. Please suggest. Thank you. Hi I'm learning php and trying to write a script to extract registration information from a large text file. Sadly my meagre knowledge of php is letting me down a bit. It's a case of knowing what you want the script to do but not having the knowlege of how to 'say it'. So i was hoping that if I posted my code here someone could either give me a few pointers on where i am going wrong or suggest a better way. The text file data luckily has a recurring format as follows (for brevity i've only included one entry, which contains made up information): From: bella_done@yahoo.co.uk Sent: 02 February 2011 22:50 To: Jonny tum, patsy fells, dingly bongo Subject: Subject: Fun Run 2010 Categories: Fun Run Name: Bella Donna Address: 14 brondle avenue Postcode: cd83 1rg Phone: 0287343510 Email: bella_don@yahoo.co.uk DOB: 15/11/1945 Half or Full: Full fun run How did you hear: Took part in 2010 As you can see the data has a convenient boundary at the 'from' field and the colon (or so it occurred to me) so I created my script as follows: // the string being analysed $the_string = " From: bella_done@yahoo.co.uk Sent: 02 February 2011 22:50 To: Jonny tum, patsy fells, dingly bongo Subject: Subject: Fun Run 2010 Categories: Fun Run Name: Bella Donna Address: 14 brondle avenue Postcode: cd83 1rg Phone: 0287343510 Email: bella_don@yahoo.co.uk DOB: 15/11/1945 Half or Full: Full fun run How did you hear: Took part in 2010"; // remove all formatting to work with a clean string $clean_string = strip_tags($the_string); // remove form field entries from the data and replace with commas and a ZZZ boundary $remove_fields = array("Categories:" => "","Name:" => ",","Address:" => ",","Postcode:" => ",","Phone:" => ",","Email:" => ",","DOB:" => ",","Half or Full:" => ",","How did you hear:" => ",","From:" => "ZZZ","Sent:" => ",","To:" => ",", ); $new_string = strtr("$clean_string",$remove_fields); // split the data at the boundary ZZZ $string_to_array = explode("ZZZ", $new_string); $new_string2 = implode("</br>",$string_to_array); echo $new_string2; $myFile = "address_list.csv"; $fh = fopen($myFile, 'w') or die("can't open file"); $stringData = $new_string2; fwrite($fh, $stringData); fclose($fh); One major problem is when i write the new data to a csv file the csv contains spacings that cause it to be reproduced in a column form rather than as separate fields for each comma boundary. So can anyone suggest either a) a better way of extracting the data from the text file (doesn't need to be 100% clean and perfect) b) How can i stop the spaces in the csv (i thought i would have fixed this when i stripped the tags from the string at the start??). Any help would be greatly received by a newbie phper. It's my first shot at performing anything moderately taxing so if I've made some blaring oversites I would very much welcome your wisdom! Thank you Drongo Going to try and explain this the best I can but I don't really have the best idea on what's happening here. I have a submission form for users to fill out their information and upload an image. I've set the file limit size at 500000 which I assumed would be safe for images at 400k or below. When testing locally, any image that is below that file size gets uploaded successfully. However, when testing on my online host/server.. the submission form and data is successfully entered but the image isn't saved at all. It obviously isn't over the size of the file limit I set because it dooesn't return an error.. it successfuly submits but doesn't save or resize my image. I really have no clue what the problem could be. I went over the variables I set for folder locations to move the image to and everything works fine locally, but once on the host and online, it doesn't happen. Hi I'm currently writing a script that basically downloads videos from a specific page. I am downloading with cURL however with some files, they're so large cURL is timing out. This is causing either a) PHP to timeout b) PHP memory to run out c) cURL to stop once defined timeout limit is reached This means that some files are only partitially downloaded as some files are over 100mb and some are only 20mb I have Code: [Select] set_time_limit(0);and Code: [Select] ini_set("memory_limit","500M");set but is there a way to make it so PHP will not timeout and the cURL session will not timeout until the file is downloaded? Hello All, I have a simple upload form which I am using to upload files to Box.net using PHP Curl. It works fine for small files, but times out for larger files. Anyone have any suggestions for this? Thanks, Pete Here is the code: Code: [Select] <html> <body bgcolor="black"> <div align="center"> <img src="Homepage_02.jpg" border="0" /> <br> <br> <font color="#f1ca63"; font face="Arial"; font size="5">Upload</font> <br> <br> <?php if (isset($_POST['upload'])) { if (!empty($_FILES['new_file_1']['name'])) { $allowedExtensions = array("txt","csv","xml","css","doc", "docx","xls","xlsx","rtf","ppt","pdf","swf","flv","avi","wmv","mov","jpg","jpeg","gif","png"); foreach ($_FILES as $file) { if ($file['tmp_name'] > '') { if (!in_array(end(explode(".", strtolower($file['name']))),$allowedExtensions)) { echo $file['name'].' is an invalid file type!<br/>'; } else { $temp_name = $_FILES['new_file_1']['name']; $localfile = $_FILES['new_file_1']['tmp_name']; $file = fopen($localfile,'r'); $request_url = 'https://upload.box.net/api/1.0/upload/[Token Here]/[Folder ID]'; $post_params['check_name_conflict_folder_option'] = urlencode('1'); $post_params['new_file_1'] = "@$localfile"; $post_params['description'] = urlencode($_POST['description']); $post_params['uploader_email'] = urlencode($_POST['uploader_email']); $post_params['upload'] = urlencode('upload'); $ch = curl_init(); curl_setopt($ch, CURLOPT_URL, $request_url); curl_setopt($ch, CURLOPT_VERBOSE, 1); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_POST, true); curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST"); curl_setopt($ch, CURLOPT_POSTFIELDS, $post_params); curl_setopt($ch, CURLOPT_TIMEOUT, 300); curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); $result = curl_exec($ch); curl_close($ch); $resultArray = explode(' ',$result); if($resultArray[5] != '') { $fileID = substr($resultArray[5],4,-1); $shareName = $temp_name; $link = 'http://www.box.net/shared/'.$shareName; } $renameurl = addslashes("https://www.box.net/api/1.0/rest?action=rename&api_key=[API KEY]&auth_token=[TOKEN Here]&target=file&target_id=".$fileID."&new_name=".$shareName); $renameResult = file_get_contents($renameurl); echo '<font color="white">Upload Successful</font>'; } } } } else { echo '<font color="white">Please select a file</font>'; } } ?> <hr width=600 color=grey> <br> <div align="center"> <form action="box_upload_curl.php" enctype="multipart/form-data" method="post"> <input type="hidden" name="check_name_conflict_folder_option" value="1"/> <table> <tr> <td class="field" style="color: #f1ca63; font-family: Arial; font-size: 14px" width="50%">Choose File to Upload: </td> <td class="input"><input type="file" name="new_file_1" /></td> </tr> <tr> <td class="field field_top" style="color: #f1ca63; font-family: Arial; font-size: 14px" ><br/> Description (optional):</td> <td class="input"><br/><textarea name="description"></textarea></td> </tr> <tr> <td class="field field_top" style="color:#f1ca63; font-family: Arial; font-size: 14px" > <br/> Your e-mail <font color="red">*</font>: </td> <td class="input field_top" style="color: #f1ca63; font-family: Arial; font-size: 14px" > <br/> <input type="text" name="uploader_email" id="email_input"></input> </td> </tr> <tr> <td colspan="2" class="submit" align="center"> <br /> <input type="submit" name="upload" value="Upload" /> </td> </tr> </table> </form> <hr width=600 color="grey"> </div> I'm trying to utilize a PHP script to parse a large XML file (around 450 MB) to MYSQL database into certain structure and definitions of included XML elements. The problem is that the original script uses file_get_contents and SimpleXMLElement to get it done, but the corn job executed by the server halts due to the volume of the XML file. I'm no PHP expert, so I bought this XMLSplit software and divided the XML into 17 separated XML files each at size of 30 MB, parsed them one by one using the same script. However, the output database was missing a lot of input, and I have serious doubts whether this would be the same output of the original file if left not divided automatically and parsed one by one.
So, I've decided to use XMLReader with this exact PHP script to parse this big XML file, but so far I couldn't manage to simply replace the parsing code and keep other functionality intact.
I'm including the script below, I'd really appreciate if someone helps me to do so.
<?php set_time_limit(0); ini_set('memory_limit', '1024M'); include_once('../db.php'); include_once(DOC_ROOT.'/include/func.php'); mysql_query("TRUNCATE screenshots_list"); mysql_query("TRUNCATE pages"); mysql_query("TRUNCATE page_screenshots"); // This is the part I need help with to change into XMLReader instead of utilized function, to enable parsing of the large XML file correctly (while keeping rest of the script code as is if possible): $xmlstr = file_get_contents('t_info.xml'); $xml = new SimpleXMLElement($xmlstr); foreach ($xml->template as $item) { //print_r($item); $sql = sprintf("REPLACE INTO templates SET id = %d, state = %d, price = %d, exc_price = %d, inserted_date = '%s', update_date = '%s', downloads = %d, type_id = %d, type_name = '%s', is_flash = %d, is_adult = %d, width = '%s', author_id = %d, author_nick = '%s', package_id = %d, is_full_site = %d, is_real_size = %d, keywords = '%s', sources = '%s', description = '%s', software_required = '%s'", $item->id, $item->state, $item->price, $item->exc_price, $item->inserted_date, $item->update_date, $item->downloads, $item->template_type->type_id, $item->template_type->type_name, $item->is_flash, $item->is_adult, $item->width, $item->author->author_id, $item->author->author_nick, $item->package->package_id, $item->is_full_site, $item->is_real_size, $item->keywords, $item->sources, $item->description, $item->software_required); //echo '<br>'.$sql; mysql_query($sql); //print_r($item->screenshots_list->screenshot); foreach ($item->screenshots_list->screenshot as $scr) { $main = (!empty($scr->main_preview)) ? 1 : 0; $small = (!empty($scr->small_preview)) ? 1 : 0; insert_data($item->id, 'screenshots_list', 0, $scr->uri, $scr->filemtime, $main, $small); } foreach ($item->styles->style as $st) { insert_data($item->id, 'styles', $st->style_id, $st->style_name); } foreach ($item->categories->category as $cat) { insert_data($item->id, 'categories', $cat->category_id, $cat->category_name); } foreach ($item->sources_available_list->source as $so) { insert_data($item->id, 'sources_available_list', $so->source_id, ''); } foreach ($item->software_required_list->software as $soft) { insert_data($item->id, 'software_required_list', $soft->software_id, ''); } //print_r($item->pages->page); if (!empty($item->pages->page)) { foreach ($item->pages->page as $p) { mysql_query(sprintf("REPLACE INTO pages SET tpl_id = %d, name = '%s', id = NULL ", $item->id, $p->name)); $page_id = mysql_insert_id(); if (!empty($p->screenshots->scr)) { foreach ($p->screenshots->scr as $psc) { $href = (!empty($psc->href)) ? (string)$psc->href : ''; mysql_query(sprintf("REPLACE INTO page_screenshots SET page_id = %d, description = '%s', uri = '%s', scr_type_id = %d, width = %d, height = %d, href = '%s'", $page_id, $psc->description, $psc->uri, $psc->scr_type_id, $psc->width, $psc->height, $href)); } } } }}?>I'd appreciate your help with that... Hi guys, I did read allot of documentation on the internet about reading/writing/parsing an XML file. I ended up using the following code, because I really have large files (some about 200MB) and regular dom does not work: while ($xml->read()) { switch ($xml->nodeType) { case (XMLReader::ELEMENT): if ($xml->localName == "job") { $node = $xml->expand(); $dom = new DomDocument(); $n = $dom->importNode($node,true); $dom->appendChild($n); $job = simplexml_import_dom($n); The problem I have is a special character error in the xml file, error returned on this line: "$node = $xml->expand();" I am literally banging my head to the wall to find a simple solution to this. I already have a cleaning function, but that can be applied only after the code above. As the file is large, to clean it, I would have to use the same code above to work on partial content at once, so I would have the same special character problem when I would try to read and split the file. I bet I am not the first one to be in this situation, but after about 5 hours of searching on the internet, I cannot do it no more. And I am not a php expert to come up with a new idea. One other thing to do probably would be to split the file into multiple files, and read them after that, without using the XMLReader. But this would ask for a different application. If, for example, on a file where I have an error, I do the reading with simplexml, without using the XMLReader, I don't get the error. But I cannot use simplexml on the files, since file size is variable. I have to use a reliable method that works for all situations. Hopefully someone has an idea to this STUPID situation! Thanks. This topic has been moved to JavaScript Help. http://www.phpfreaks.com/forums/index.php?topic=355144.0 Hey guys, I'm a total newbie here, and just about as a new to php. My issue: I have a very large .html file that contain multiple articles (I actually have a few of these, but we'll start with one for practicality). The article titles are all wrapped in <h2> tags, there are 10 articles in one file. The articles are very simple, just a title wrapped with <h2> and then a few paragraphs wrapped in <p> tags. What I want to know how to do: I want to know if there's a way to open that file, and have each article saved as it's own .html or .txt document (the title & following paragraphs of each article). Ultimately taking my 1 large file, and creating the subsequent 10 smaller files from the articles inside of it. I am having trouble explaining this in text so I'll try to illustrate: I have "Articles.html" - which contains (article1,article2,article3.. ..article10) I want to split "Articles.html" and create "Article1.html", "Article2.html", "Article3.html", etc. Is that possible? Or am I looking at something far more complex than I can imagine at this point - perhaps something I'd be better off doing by hand? Ultimately I intend to stick all these articles into a database, but that's the 2nd part of what I want to do (and I think will be the easier of the tasks). Let me know if you need any additional information in the event my description above is unclear... I simply am having issues figuring out how to separate out the text into individual articles. I have a games website called GPStudios.com. In a previous topic (that remains unresolved) I needed help fixing a view counter on a specific page that was sometimes executing a mysql query twice. I've since noticed that it is happening on other pages, maybe all of them. At the top of the php "playgame.php", is the statement: $updatesql = "update games set timesplayed = timesplayed+1, last_played = now() where gameid = $gameid"; mysql_query($updatesql); However, when I reload the page or check on phpMyAdmin, it has sometimes incremented by 2 (possibly 3). I have confirmed that no where else is calling the same query. I created a new table called "FUCK" and editted the code stated above to: $updatesql = "update FUCK set timesplayed = timesplayed+1, last_played = now() where gameid = $gameid"; mysql_query($updatesql); Upon loading the page, it did exactly the same thing. So I confirmed that the PHP must be being run twice. I have tried it on other pages on my website such as I had no luck on my other topic of the same problem, but hopefully someone might be able to tell me why or how this might be happening. Just remember - it is an absolute certainty that the query is NOT being run elsewhere. Thanks. I have a function with a query and then a while loop and then an if file_exists with an image path. I am getting the image path displayed as text. Would this be because I am running the query through json encoding? so a user fills out a form, data gets sent to send.php then if successful redirects back to index.php, how on that index.php can i get it to say, sent successfully? Hi all I have this problem on a server using php5, unix based, safe_mode is On globally, i have turned it off locally through php.ini. Ok, this is testing example script i used: $cmd = ( "php -v" ); $out = shell_exec( $cmd ); print $out; On my own server this returns php version. On this mentioned server i'm using (commercial) this causes complete server breakdown, when logged in with SSH, i can't even issue "ls" command after that, nor find and kill the process. What could be so wrong with it? I don't think calling php-cli would make any difference. Hello all
I'm new to this forum. I've been struggling on this problem for a few days now.
I’m writing a script to automatically push code to a git server. exec(‘git config user.name "’ . $userName . ‘"’); exec(‘git config user.email "’ . $userEmail . ‘"’); exec('git checkout -b ’ . $branch); exec(‘git add --all’); exec(‘git commit -m "’ . $message . ‘"’); exec('git push origin ’ . $branch);
When running the last command, the script stops and asks for a user name, then a password. I tried other forums, searching the net. I'm frustrated...
Thanks in advance. |