PHP - Large File Download Issue
Hello,
I am working on a project that downloads large zip files from the server. For small files the script works well and downloads them successfully, but for larger files (currently we are trying to download a 922MB file) it gives us this message in Firefox and doesn't download anything:

"File not found. Firefox can't find the file at http://www.domainname.com/abc.zip"

The script to download the file is as below:

Code: [Select]
$filename = "xyz.mp3";
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=".basename($filename).";");
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($filename));
if (!ini_get('safe_mode')) set_time_limit(360000000);
readfile($filename);

Please advise what the issue can be. If it is a file size issue, how and where can we increase the limit to solve it? Thanks in advance.
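For what it's worth, readfile() can exhaust PHP's memory limit on very large files when output buffering is active, and exceeding max_execution_time mid-transfer can also surface as a failed download. Below is a minimal sketch of a chunked alternative; the file name is a placeholder and the 8KB chunk size is an arbitrary choice:

Code: [Select]
<?php
$filename = "abc.zip"; // placeholder: real path to the large file on the server

header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"" . basename($filename) . "\"");
header("Content-Length: " . filesize($filename));

set_time_limit(0); // a 922MB transfer can easily outlive max_execution_time

// Stream the file in small pieces so the whole thing never sits in memory
$fp = fopen($filename, "rb");
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush(); // push each chunk out to the client immediately
}
fclose($fp);
exit;
?>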
Similar Tutorials

Hi guys, I am currently receiving a large text file (> 500MB) once per week, which I have been manually splitting and then processing to obtain the required CSV files. However, this is taking in the region of 2 to 3 hours. Very soon these files will be sent daily, and I really don't have the time to split and process them every day. I have been playing for a while trying to parse everything properly/automatically with fopen, feof and fgets (and the other 'f' functions), but the script never seems to read the file all the way to the end; I assume this is due to memory usage. The data in the file follows a strict pattern throughout, which is:

Code: [Select]
BSNY990141112271112270100000 POO2C35 122354000 DMUS 075 O BX NTY
LOLANCSTR 1132 11322 TB
LIMORCMSJ 1135 00000000
LICRNFNJN 1140 00000000 H
LICRNF 1141H1142H 11421142 T
LISDAL 1147H1148H 11481148 T
LIARNSIDE 1152H1153 11531153 T
LIGOVS 1158 1159 11581159 T
LIKTBK 1202 1202H 12021202 T
LICARK 1206 1207 12061207 T
LIULVRSTN 1214H1215H 12151215 T
LIDALTON 1223 1223H 12231223 T
LIDALTONJ 1225 00000000
LIROOSE 1229 1229H 12291229 T 2
LTBAROW 1237 12391 TF

That is just one record of information (1 of around 140,000 records). Each record has no fixed number of lines, but each line in a record is fixed at 80 characters, and all lines in a record need to carry the same unique 'id'; at present I'm using an md5 hash of microtime. The first line of every record starts with 'BS' and the last line of each record starts with 'LT', terminating with 'TF'. All the other stuff in between also follows a certain pattern, which I can break down effectively. The record above shows one train service schedule, hence why each line in a record needs the same unique id. Anyone got any ideas on how I could process such a file effectively? Many thanks, Dave
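A rough sketch of one way to handle this: read line by line with fgets so memory use stays flat no matter how big the file gets, open a new id on each 'BS' line and close the record on 'LT'. The input and output file names are placeholders, and uniqid() stands in for the md5-of-microtime id:

Code: [Select]
<?php
$in  = fopen('schedule.txt', 'r');  // placeholder input file
$out = fopen('records.csv', 'w');   // placeholder output file
$uid = null;

while (($line = fgets($in)) !== false) {
    $line = rtrim($line, "\r\n");
    $code = substr($line, 0, 2);

    if ($code === 'BS') {
        // 'BS' opens a record: issue a fresh id for all of its lines
        $uid = uniqid('', true);
    }
    if ($uid !== null) {
        // fputcsv writes one CSV row per schedule line, tagged with the record id
        fputcsv($out, array($uid, $code, $line));
    }
    if ($code === 'LT') {
        // 'LT' closes the record
        $uid = null;
    }
}

fclose($in);
fclose($out);
?>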
Hello, I'm trying to find a way to check around 500-600 links to see if they are alive. It works fine for 5-6 links, but once I add more links it just times out. Is there a way I could process this so it does 1 link at a time or something?

Code: [Select]
<?php
include("config.php");

$query = "SELECT * FROM `games` WHERE `r_fileserve` <> \"\" LIMIT 500";
$result = mysql_query($query);

while ($row = mysql_fetch_assoc($result)) {
    $link_str = file_get_contents($row['r_fileserve']);
    $pattern = '<input type="hidden" name="download" value="normal"/>';
    preg_match($pattern, $link_str, $match);
    if ($match[0] != null) {
        echo "Working <br />";
    } else {
        echo "File Down <br />";
    }
}
?>

Hello, I already know this is possible with PHP:

Code: [Select]
<?php
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
?>

But I am wondering: is it possible to just send over a string of data stored in, say, $data and create a file client-side (nothing server-side)? I'm guessing readfile() has to be replaced with something.

Dear PHPFreaks members, I have been searching for a solution for serving a file download via my website. The file to be downloaded is actually on a remote server. What I need is code that serves as a download medium without actually downloading the file to my web server, something like masking the file URL. Can anyone help me with it? I have been trying this for days, but no solution to date.

When I try to upload a file larger than the server's max limit, the following code is not executed. How am I supposed to inform the user that their file is too large? NOTE: I've stripped the code down for this post.

Code: [Select]
<?php
if (isset($_POST['submit'])) {
    echo "test..";
}
?>
<html>
<head>
<title>Upload Test</title>
</head>
<body>
<form action='' enctype='multipart/form-data' method='POST'>
<input type='file' name='file_upload' />
<input type='submit' name='submit' value='upload' />
</form>
</body>
</html>
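One way to catch this, sketched under the assumption that the form posts back to the same script: when a POST body exceeds post_max_size, PHP discards $_POST and $_FILES entirely, but $_SERVER['CONTENT_LENGTH'] still reports the attempted size, so the two can be compared. The ini_bytes() helper is a hypothetical name, written here to convert php.ini shorthand such as "8M":

Code: [Select]
<?php
// Convert php.ini shorthand like "8M" or "2G" into bytes (hypothetical helper)
function ini_bytes($val) {
    $val = trim($val);
    $num = (int)$val;
    switch (strtolower(substr($val, -1))) {
        case 'g': $num *= 1024; // deliberate fall-through
        case 'm': $num *= 1024;
        case 'k': $num *= 1024;
    }
    return $num;
}

$max = ini_bytes(ini_get('post_max_size'));

if ($_SERVER['REQUEST_METHOD'] === 'POST'
        && empty($_POST) && empty($_FILES)
        && isset($_SERVER['CONTENT_LENGTH'])
        && (int)$_SERVER['CONTENT_LENGTH'] > $max) {
    echo 'Your upload is too large; the server accepts at most ' . ini_get('post_max_size');
}
?>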
Hello. My script is set to upload files up to 5GB in size, and for that script I've currently set memory_limit to 5GB. Is that alright? I mean, what is the ideal value for large upload scripts? If you feel 5GB is too large, I can make the script upload 2GB files and set memory_limit accordingly. Also, I have currently set max_execution_time to 86400, assuming that on a 500Kbps broadband connection it could take up to 24 hours to upload a 3-5GB file. Please suggest. Thank you.

Going to try and explain this the best I can, but I don't really have the best idea of what's happening here. I have a submission form for users to fill out their information and upload an image. I've set the file size limit at 500000, which I assumed would be safe for images at 400k or below. When testing locally, any image below that file size gets uploaded successfully. However, when testing on my online host/server, the form data is submitted successfully but the image isn't saved at all. It obviously isn't over the size limit I set, because it doesn't return an error: it successfully submits but doesn't save or resize my image. I really have no clue what the problem could be. I went over the variables I set for the folder locations to move the image to, and everything works fine locally, but once on the host and online, it doesn't happen.
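A small diagnostic sketch that may help narrow down a silent failure like the one above; the field name 'image' and the uploads/ folder are placeholders. PHP records a per-file error code even when the form itself submits cleanly, and hosts often differ from local setups in tmp-dir and folder permissions:

Code: [Select]
<?php
$field  = 'image';                 // placeholder form field name
$target = __DIR__ . '/uploads/';   // placeholder destination folder

if (isset($_FILES[$field])) {
    if ($_FILES[$field]['error'] !== UPLOAD_ERR_OK) {
        // 1 = over upload_max_filesize, 2 = over form MAX_FILE_SIZE, 3 = partial,
        // 4 = no file, 6 = missing tmp dir, 7 = cannot write to disk
        die('Upload failed with error code ' . $_FILES[$field]['error']);
    }
    if (!is_writable($target)) {
        die('Destination folder is not writable on this host');
    }
    $dest = $target . basename($_FILES[$field]['name']);
    if (!move_uploaded_file($_FILES[$field]['tmp_name'], $dest)) {
        die('move_uploaded_file() failed');
    }
    echo 'Saved to ' . $dest;
}
?>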
Hi, I'm currently writing a script that basically downloads videos from a specific page. I am downloading with cURL; however, some files are so large that cURL times out. This causes either a) PHP to time out, b) PHP memory to run out, or c) cURL to stop once the defined timeout limit is reached. The result is that some files are only partially downloaded, since some files are over 100MB and some are only 20MB. I have set_time_limit(0); and ini_set("memory_limit","500M"); set, but is there a way to make it so that PHP will not time out and the cURL session will not time out until the file is downloaded?
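A sketch of the usual low-memory approach, assuming the script may write to a local path: hand cURL an open file handle via CURLOPT_FILE so the download streams to disk instead of accumulating in a PHP string, and replace the fixed timeout with a stall detector so slow-but-alive transfers are never cut off. The URL and path are placeholders:

Code: [Select]
<?php
set_time_limit(0); // let PHP run for as long as the transfer takes

$url  = 'http://example.com/video.flv'; // placeholder source URL
$dest = fopen('/tmp/video.flv', 'w');   // placeholder local target

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $dest);           // stream straight to disk
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);            // no hard cap on total duration
curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1024); // abort only if under 1KB/s...
curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 60);    // ...for 60 seconds straight

curl_exec($ch);
curl_close($ch);
fclose($dest);
?>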
Hi, I'm learning PHP and trying to write a script to extract registration information from a large text file. Sadly my meagre knowledge of PHP is letting me down a bit. It's a case of knowing what you want the script to do but not having the knowledge of how to 'say it'. So I was hoping that if I posted my code here, someone could either give me a few pointers on where I am going wrong or suggest a better way. The text file data luckily has a recurring format as follows (for brevity I've only included one entry, which contains made-up information):

Code: [Select]
From: bella_done@yahoo.co.uk
Sent: 02 February 2011 22:50
To: Jonny tum, patsy fells, dingly bongo
Subject: Subject: Fun Run 2010
Categories: Fun Run
Name: Bella Donna
Address: 14 brondle avenue
Postcode: cd83 1rg
Phone: 0287343510
Email: bella_don@yahoo.co.uk
DOB: 15/11/1945
Half or Full: Full fun run
How did you hear: Took part in 2010

As you can see, the data has a convenient boundary at the 'From' field and the colon (or so it occurred to me), so I created my script as follows:

Code: [Select]
// the string being analysed
$the_string = "From: bella_done@yahoo.co.uk
Sent: 02 February 2011 22:50
To: Jonny tum, patsy fells, dingly bongo
Subject: Subject: Fun Run 2010
Categories: Fun Run
Name: Bella Donna
Address: 14 brondle avenue
Postcode: cd83 1rg
Phone: 0287343510
Email: bella_don@yahoo.co.uk
DOB: 15/11/1945
Half or Full: Full fun run
How did you hear: Took part in 2010";

// remove all formatting to work with a clean string
$clean_string = strip_tags($the_string);

// remove form field entries from the data and replace with commas and a ZZZ boundary
$remove_fields = array(
    "Categories:" => "",
    "Name:" => ",",
    "Address:" => ",",
    "Postcode:" => ",",
    "Phone:" => ",",
    "Email:" => ",",
    "DOB:" => ",",
    "Half or Full:" => ",",
    "How did you hear:" => ",",
    "From:" => "ZZZ",
    "Sent:" => ",",
    "To:" => ",",
);
$new_string = strtr($clean_string, $remove_fields);

// split the data at the boundary ZZZ
$string_to_array = explode("ZZZ", $new_string);
$new_string2 = implode("</br>", $string_to_array);
echo $new_string2;

$myFile = "address_list.csv";
$fh = fopen($myFile, 'w') or die("can't open file");
$stringData = $new_string2;
fwrite($fh, $stringData);
fclose($fh);

One major problem: when I write the new data to a CSV file, the CSV contains spacings that cause it to be reproduced in column form rather than as separate fields at each comma boundary. So can anyone suggest either a) a better way of extracting the data from the text file (it doesn't need to be 100% clean and perfect), or b) how I can stop the spaces in the CSV (I thought I would have fixed this when I stripped the tags from the string at the start)? Any help would be greatly received by a newbie PHPer. It's my first shot at performing anything moderately taxing, so if I've made some blaring oversights I would very much welcome your wisdom! Thank you, Drongo
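A sketch of a regex-based alternative, assuming the records really are separated by 'From:' lines as in the sample and that the file fits in memory (otherwise it can be read in chunks): pull each labelled field out with preg_match and let fputcsv handle the quoting, which stops stray spaces and line breaks from smearing the rows into columns:

Code: [Select]
<?php
$text    = file_get_contents('registrations.txt'); // placeholder input file
$records = preg_split('/^From:/m', $text, -1, PREG_SPLIT_NO_EMPTY);

// The fields to extract, in CSV column order
$fields = array('Name', 'Address', 'Postcode', 'Phone', 'Email',
                'DOB', 'Half or Full', 'How did you hear');

$fh = fopen('address_list.csv', 'w');
foreach ($records as $record) {
    $row = array();
    foreach ($fields as $field) {
        // Capture everything after "Field:" up to the end of that line
        if (preg_match('/^' . preg_quote($field, '/') . ':[ \t]*(.*)$/mi', $record, $m)) {
            $row[] = trim($m[1]);
        } else {
            $row[] = '';
        }
    }
    fputcsv($fh, $row); // fputcsv adds the commas and quoting for us
}
fclose($fh);
?>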
Hello All, I have a simple upload form which I am using to upload files to Box.net using PHP cURL. It works fine for small files, but times out for larger files. Anyone have any suggestions for this? Thanks, Pete. Here is the code:

Code: [Select]
<html>
<body bgcolor="black">
<div align="center">
<img src="Homepage_02.jpg" border="0" />
<br><br>
<font color="#f1ca63" face="Arial" size="5">Upload</font>
<br><br>
<?php
if (isset($_POST['upload'])) {
    if (!empty($_FILES['new_file_1']['name'])) {
        $allowedExtensions = array("txt","csv","xml","css","doc","docx","xls","xlsx","rtf",
                                   "ppt","pdf","swf","flv","avi","wmv","mov","jpg","jpeg","gif","png");
        foreach ($_FILES as $file) {
            if ($file['tmp_name'] > '') {
                if (!in_array(end(explode(".", strtolower($file['name']))), $allowedExtensions)) {
                    echo $file['name'].' is an invalid file type!<br/>';
                } else {
                    $temp_name = $_FILES['new_file_1']['name'];
                    $localfile = $_FILES['new_file_1']['tmp_name'];
                    $file = fopen($localfile, 'r');
                    $request_url = 'https://upload.box.net/api/1.0/upload/[Token Here]/[Folder ID]';
                    $post_params['check_name_conflict_folder_option'] = urlencode('1');
                    $post_params['new_file_1'] = "@$localfile";
                    $post_params['description'] = urlencode($_POST['description']);
                    $post_params['uploader_email'] = urlencode($_POST['uploader_email']);
                    $post_params['upload'] = urlencode('upload');

                    $ch = curl_init();
                    curl_setopt($ch, CURLOPT_URL, $request_url);
                    curl_setopt($ch, CURLOPT_VERBOSE, 1);
                    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
                    curl_setopt($ch, CURLOPT_POST, true);
                    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
                    curl_setopt($ch, CURLOPT_POSTFIELDS, $post_params);
                    curl_setopt($ch, CURLOPT_TIMEOUT, 300);
                    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
                    $result = curl_exec($ch);
                    curl_close($ch);

                    $resultArray = explode(' ', $result);
                    if ($resultArray[5] != '') {
                        $fileID = substr($resultArray[5], 4, -1);
                        $shareName = $temp_name;
                        $link = 'http://www.box.net/shared/'.$shareName;
                    }
                    $renameurl = addslashes("https://www.box.net/api/1.0/rest?action=rename&api_key=[API KEY]&auth_token=[TOKEN Here]&target=file&target_id=".$fileID."&new_name=".$shareName);
                    $renameResult = file_get_contents($renameurl);
                    echo '<font color="white">Upload Successful</font>';
                }
            }
        }
    } else {
        echo '<font color="white">Please select a file</font>';
    }
}
?>
<hr width=600 color=grey>
<br>
<div align="center">
<form action="box_upload_curl.php" enctype="multipart/form-data" method="post">
<input type="hidden" name="check_name_conflict_folder_option" value="1"/>
<table>
<tr>
<td class="field" style="color: #f1ca63; font-family: Arial; font-size: 14px" width="50%">Choose File to Upload: </td>
<td class="input"><input type="file" name="new_file_1" /></td>
</tr>
<tr>
<td class="field field_top" style="color: #f1ca63; font-family: Arial; font-size: 14px"><br/>Description (optional):</td>
<td class="input"><br/><textarea name="description"></textarea></td>
</tr>
<tr>
<td class="field field_top" style="color:#f1ca63; font-family: Arial; font-size: 14px"><br/>Your e-mail <font color="red">*</font>: </td>
<td class="input field_top" style="color: #f1ca63; font-family: Arial; font-size: 14px"><br/><input type="text" name="uploader_email" id="email_input" /></td>
</tr>
<tr>
<td colspan="2" class="submit" align="center"><br /><input type="submit" name="upload" value="Upload" /></td>
</tr>
</table>
</form>
<hr width=600 color="grey">
</div>
I'm trying to use a PHP script to parse a large XML file (around 450 MB) into a MySQL database, following a certain structure and set of definitions for the included XML elements. The problem is that the original script uses file_get_contents and SimpleXMLElement to get it done, but the cron job executed by the server halts due to the volume of the XML file. I'm no PHP expert, so I bought this XMLSplit software and divided the XML into 17 separate XML files of about 30 MB each, then parsed them one by one using the same script. However, the output database was missing a lot of input, and I have serious doubts whether the result would match the output of the original file if it were left undivided and parsed in one pass.

So I've decided to use XMLReader with this exact PHP script to parse the big XML file, but so far I couldn't manage to simply replace the parsing code while keeping the other functionality intact.

I'm including the script below. I'd really appreciate it if someone could help me do so.

Code: [Select]
<?php
set_time_limit(0);
ini_set('memory_limit', '1024M');
include_once('../db.php');
include_once(DOC_ROOT.'/include/func.php');

mysql_query("TRUNCATE screenshots_list");
mysql_query("TRUNCATE pages");
mysql_query("TRUNCATE page_screenshots");

// This is the part I need help with: change it to XMLReader instead of the current
// approach, to parse the large XML file correctly (keeping the rest of the script
// as is, if possible):
$xmlstr = file_get_contents('t_info.xml');
$xml = new SimpleXMLElement($xmlstr);

foreach ($xml->template as $item) {
    $sql = sprintf("REPLACE INTO templates SET id = %d, state = %d, price = %d, exc_price = %d,
        inserted_date = '%s', update_date = '%s', downloads = %d, type_id = %d, type_name = '%s',
        is_flash = %d, is_adult = %d, width = '%s', author_id = %d, author_nick = '%s',
        package_id = %d, is_full_site = %d, is_real_size = %d, keywords = '%s', sources = '%s',
        description = '%s', software_required = '%s'",
        $item->id, $item->state, $item->price, $item->exc_price, $item->inserted_date,
        $item->update_date, $item->downloads, $item->template_type->type_id,
        $item->template_type->type_name, $item->is_flash, $item->is_adult, $item->width,
        $item->author->author_id, $item->author->author_nick, $item->package->package_id,
        $item->is_full_site, $item->is_real_size, $item->keywords, $item->sources,
        $item->description, $item->software_required);
    mysql_query($sql);

    foreach ($item->screenshots_list->screenshot as $scr) {
        $main  = (!empty($scr->main_preview)) ? 1 : 0;
        $small = (!empty($scr->small_preview)) ? 1 : 0;
        insert_data($item->id, 'screenshots_list', 0, $scr->uri, $scr->filemtime, $main, $small);
    }
    foreach ($item->styles->style as $st) {
        insert_data($item->id, 'styles', $st->style_id, $st->style_name);
    }
    foreach ($item->categories->category as $cat) {
        insert_data($item->id, 'categories', $cat->category_id, $cat->category_name);
    }
    foreach ($item->sources_available_list->source as $so) {
        insert_data($item->id, 'sources_available_list', $so->source_id, '');
    }
    foreach ($item->software_required_list->software as $soft) {
        insert_data($item->id, 'software_required_list', $soft->software_id, '');
    }

    if (!empty($item->pages->page)) {
        foreach ($item->pages->page as $p) {
            mysql_query(sprintf("REPLACE INTO pages SET tpl_id = %d, name = '%s', id = NULL",
                $item->id, $p->name));
            $page_id = mysql_insert_id();
            if (!empty($p->screenshots->scr)) {
                foreach ($p->screenshots->scr as $psc) {
                    $href = (!empty($psc->href)) ? (string)$psc->href : '';
                    mysql_query(sprintf("REPLACE INTO page_screenshots SET page_id = %d,
                        description = '%s', uri = '%s', scr_type_id = %d, width = %d,
                        height = %d, href = '%s'",
                        $page_id, $psc->description, $psc->uri, $psc->scr_type_id,
                        $psc->width, $psc->height, $href));
                }
            }
        }
    }
}
?>

I'd appreciate your help with that...
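A sketch of the XMLReader swap being asked for, assuming the <template> elements sit directly under the document root as the script implies. Only one <template> is expanded into memory at a time, and each one is handed to SimpleXML, so the existing per-template code (the REPLACE INTO and insert_data calls above) can run unchanged on $item:

Code: [Select]
<?php
$reader = new XMLReader();
$reader->open('t_info.xml');

// Skip forward to the first <template> element
while ($reader->read() && $reader->name !== 'template') {
    // keep reading
}

while ($reader->name === 'template') {
    // Expand only this element into DOM, then wrap it for SimpleXML
    $dom  = new DOMDocument();
    $item = simplexml_import_dom($dom->importNode($reader->expand(), true));

    // $item now behaves like one $xml->template entry from the original loop,
    // so the per-template processing above can be pasted in here unchanged

    $reader->next('template'); // jump to the next sibling <template>
}

$reader->close();
?>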
I currently have a file management system using Zend Framework. Everything has been working great with .pdf files in the 2-4 MB range, both on upload and download. But there was a .pdf file of about 7MB, which uploaded fine; when selecting that file to download, the download completes in about 2 seconds and retrieves only about 300KB. I checked on the server and the file actually did upload fine and was not corrupted, but I cannot get past this file download issue.
All other files (in 2-4 MB range) seem to download fine.
Would it be a memory size issue? Would anyone know the cause, or how I can correct it?
Thanks so much in advance
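One pattern worth trying for a truncated download like this (a sketch, not Zend-specific, with a placeholder path): make sure no output buffering or on-the-fly compression sits between readfile() and the client, since either layer can clip the stream and leave the browser with a few hundred KB:

Code: [Select]
<?php
$file = '/path/to/document.pdf'; // placeholder path to the stored file

// Turn off on-the-fly compression for this response where possible
ini_set('zlib.output_compression', 'Off');
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}

// Discard every output buffer the framework may have opened
while (ob_get_level() > 0) {
    ob_end_clean();
}

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;
?>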
Hi all, I have trouble with a piece of code.

HTML:
Code: [Select]
<div class="psdBox">
  <a href="downloads/001_PSD.php"><img src="webBuild/psdIcon.png" alt="psd icon" /></a>
  <div class="phpBox">
    <?php
    $hit_count = @file_get_contents('http://example.com/downloads/001_PSD.txt');
    echo $hit_count;
    ?>
  </div>
</div>

PHP:
Code: [Select]
<?php
$url = 'http://example.com/downloads/001_PSD.txt';
$hit_count = @file_get_contents($url);
$hit_count++;
@file_put_contents($url, $hit_count);
header('Location: http://example.com/downloads/001_PSD.zip');
?>

What I'm trying to do is count the downloads on my site using PHP, without SQL. This code works from the root [www.example.com/here], but if I move it into a folder [www.example.com/downloads/here] it does not work: 001_PSD.txt won't count up from a folder. In the root, where it works, the chmod of 001_PSD.txt is the standard 0644. I always thought it had to be 0777, but from the root 0644 is fine. When I tested it from the /downloads folder I tried setting 001_PSD.txt to chmod 0777, but that also fails. Can someone help me make this work? I cannot find the problem. I want to offer multiple downloads and don't want to put every download in the root folder. Thanks, javil
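A sketch of the usual fix, keeping the same file names as placeholders: write to the counter through a filesystem path anchored to the script's own directory rather than through an http:// URL, which file_put_contents generally cannot write to no matter what the chmod is:

Code: [Select]
<?php
// The counter file sits next to this script, wherever the script lives
$counter = __DIR__ . '/001_PSD.txt';

$hits = (int)@file_get_contents($counter);
$hits++;
file_put_contents($counter, $hits, LOCK_EX); // LOCK_EX guards against two simultaneous downloads

header('Location: http://example.com/downloads/001_PSD.zip');
exit;
?>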
Hi guys, I read a lot of documentation on the internet about reading/writing/parsing an XML file, and I ended up using the following code, because I really have large files (some around 200MB) and regular DOM does not work:

Code: [Select]
while ($xml->read()) {
    switch ($xml->nodeType) {
        case (XMLReader::ELEMENT):
            if ($xml->localName == "job") {
                $node = $xml->expand();
                $dom = new DomDocument();
                $n = $dom->importNode($node, true);
                $dom->appendChild($n);
                $job = simplexml_import_dom($n);
            }
            break;
    }
}

The problem I have is a special-character error in the XML file, reported on the line "$node = $xml->expand();". I am literally banging my head against the wall to find a simple solution for this. I already have a cleaning function, but it can only be applied after the code above. As the file is large, to clean it I would have to use the same code above to work on partial content at a time, so I would hit the same special-character problem when reading and splitting the file. I bet I am not the first one in this situation, but after about 5 hours of searching on the internet I cannot do it any more, and I am not enough of a PHP expert to come up with a new idea. One other option would probably be to split the file into multiple files and read them afterwards without using XMLReader, but that would require a different application. If, for example, I read a file that produces the error with simplexml instead of XMLReader, I don't get the error. But I cannot use simplexml on these files, since the file size varies; I have to use a reliable method that works in all situations. Hopefully someone has an idea for this STUPID situation! Thanks.

I've written the following code:

Code: [Select]
<?php
echo '<img src="Sheph.png" />';

function Wad() {
    if (file_exists("Sheph.png")) {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename='.basename("Sheph.png"));
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Cache-Control: must-revalidate');
        header('Pragma: public');
        header('Content-Length: ' . filesize("Sheph.png"));
        ob_clean();
        flush();
        readfile("Sheph.png");
        exit;
    }
}
Wad();
?>

What I want to do is show the image AND show a file download window. The problem is that it shows the download window but the image doesn't appear, and if I comment out the function and keep the echo part only, the image appears normally. What's wrong?

Hello! I'm making an interface on a website to manage files on an FTP server hosted elsewhere. I'm using the function ftp_get() to download a file, but although I've tried different ways, I don't seem to be able to download the file directly from the FTP server to the user; the only way it works is if I download the file to the website's server first and then send it to the client. Is there any way I can download the file directly to the user without having to host it permanently on the website's server? Thank you.

Hi all, I'm trying to make a PHP script that downloads a file off a remote FTP server and serves it to the visitor at a limited download speed. Right now I'm using fopen() like this:

Code: [Select]
$path = "ftp://".$username.":".$password."@".$server."/";
$fullPath = ($path.$fileName);
$speed = 400; // 400 kb/s download rate

header("Cache-control: private");
header("Content-Type: application/octet-stream");
//header("Content-Length: ".filesize($fullPath));
header("Content-Disposition: filename=\"".$fileName."\"");
flush();

$fd = fopen($fullPath, "r");
while (!feof($fd)) {
    echo fread($fd, round($speed*1024));
    flush();
    sleep(1);
}
fclose($fd);

The file does download, but at the wrong speed (8KB/s). Does anyone know why? When I run the same code with a local file instead of a file on FTP, it works fine and downloads at the speed I set. Thanks

The other day I noticed someone had posted something like this; well, this was actually the solution to the problem. But it got me thinking about a project of mine and the need to pump out a downloadable file. We are storing most of the files in a database using base64 encoding; the two key types of files we are storing are images and PDFs, mostly PDFs. Anyway, where I am going with this is: is there any way to take the base64-encoded file and get it to download through this, or am I tackling the idea in the wrong way?

Code: [Select]
<?php
$filename = 'somename.txt'; // this would obviously be changed according to the file type we would output
$data =<<<DATA
I know my data would go here for the given file once decoded.
DATA;

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=' . $filename);
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . strlen($data));
echo $data;
?>
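For the base64 question just above, a minimal sketch, assuming the encoded string has been fetched from the database into $encoded: decode before sending, and send the decoded length, since the base64 form is roughly a third longer than the real file:

Code: [Select]
<?php
$encoded  = '...base64 string from the database...'; // placeholder
$filename = 'somename.pdf';                          // placeholder output name

$data = base64_decode($encoded);

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Transfer-Encoding: binary');
header('Content-Length: ' . strlen($data)); // length of the decoded bytes, not the base64 text
echo $data;
exit;
?>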
I was able to upload the files to the DB, but now I am trying to retrieve a file from the table and it's not working. My code is below:
Code: [Select]
<?php
while ($row = mysql_fetch_array($query1)) {
    list($id, $name) = mysql_fetch_array($query1);
    echo "<tr>";
    echo "<td> $row[category]</td>";
    echo "<td> $row[description]</td>";
    echo "<td> $row[amount]</td>";
?>
    <td>
        <a href="approved_req.php?id=<?php echo urlencode($id); ?>"><?php echo urlencode($name); ?></a>
    </td>
<?php
    echo "<td> $row[status]</td>";
    echo "<td><a class='btn btn-success' href='invoice.php?id=$row[id]'> Pay </a></td>";
    echo "</tr>";
}
?>
</table>
<?php
if (isset($_GET['id'])) {
    // if id is set, fetch the file with that id from the database
    $id = $_GET['id'];
    $query = "SELECT name, type, size, content FROM requisition WHERE id = '$id'";
    $result = mysql_query($query) or die('Error, query failed');
    list($name, $type, $size, $content) = mysql_fetch_array($result);

    header("Content-length: $size");
    header("Content-type: $type");
    header("Content-Disposition: attachment; filename=$name");
    ob_clean();
    flush();
    echo $content;
    mysql_close();
    exit;
}
?>

Thanks in advance.

Hi there, I have a cron job in PHP which generates a table with dynamic data, and this table is exported to Excel when the job runs. When I have one table it downloads one Excel file; I want to download multiple Excel files using a for loop. Please help, any suggestions would be appreciated. Thank you. Regards, Rohit
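On the multiple-Excel question: a browser saves only one file per HTTP response, so a for loop that emits several spreadsheets in a single request will not trigger several downloads. A common workaround, sketched here with placeholder file names, is to bundle the generated files into one zip and send that:

Code: [Select]
<?php
// Placeholder list of export files the cron job has already generated
$exports = array('report1.xls', 'report2.xls', 'report3.xls');

$zipPath = tempnam(sys_get_temp_dir(), 'xls');
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::OVERWRITE);
foreach ($exports as $file) {
    $zip->addFile($file, basename($file)); // store each sheet under its bare name
}
$zip->close();

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="reports.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
unlink($zipPath); // remove the temporary archive afterwards
exit;
?>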