PHP - Downloading Remote Files
I need to download a remote file, say http://abc.com/files/abc.zip. Assuming that the file size is greater than 100 MB, is there a way to download the file in parts/chunks, say 10 MB each?
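One way this is commonly done is with HTTP Range requests, provided the remote server supports them (it advertises this with an Accept-Ranges header). Below is a minimal sketch, not production code: the function name, chunk size and destination path are just for illustration, and there is no retry logic.

<?php
// Sketch: download a remote file in ranged chunks with cURL.
// Assumes the server honours Range requests and reports a Content-Length.
function download_in_chunks($url, $destination, $chunk_size = 10 * 1024 * 1024)
{
    // Ask the server for the total size first (HEAD-style request).
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    $total = (float) curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);

    if ($total <= 0) {
        return false; // size unknown, ranged download not possible this way
    }

    $fp = fopen($destination, 'wb');
    for ($start = 0; $start < $total; $start += $chunk_size) {
        $end = min($start + $chunk_size - 1, $total - 1);
        $ch  = curl_init($url);
        curl_setopt($ch, CURLOPT_RANGE, $start . '-' . $end); // e.g. "0-10485759"
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $data = curl_exec($ch);
        curl_close($ch);
        if ($data === false) {
            fclose($fp);
            return false;
        }
        fwrite($fp, $data); // append this chunk to the local file
    }
    fclose($fp);
    return true;
}

download_in_chunks('http://abc.com/files/abc.zip', './abc.zip');
?>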
Similar Tutorials

Hello everyone. I have a problem with some of my PHP script. I have some PHP code that downloads a remote file and directly outputs it to the user. It works and all, it's just that when I do it on my Ubuntu LAMP server, downloading a file takes up a lot of RAM. On my Windows WAMP server it works correctly and doesn't use up so much RAM. Is this a problem with my code, or is it a server configuration? Below is the code I use to download the file.

Code:
public function output_file($file, $name, $mime_type='', $size)
{
    session_write_close();
    /*
    This function takes a path to a file to output ($file),
    the filename that the browser will see ($name) and
    the MIME type of the file ($mime_type, optional).

    If you want to do something on download abort/finish,
    register_shutdown_function('function_name');
    */

    if(!is_readable($file)) die('File not found or inaccessible!');

    /*
    $size = $size;
    $name = rawurldecode($name);
    */

    /* Figure out the MIME type (if not specified) */
    $known_mime_types=array(
        "pdf" => "application/pdf",
        "txt" => "text/plain",
        "html" => "text/html",
        "htm" => "text/html",
        "exe" => "application/octet-stream",
        "zip" => "application/zip",
        "doc" => "application/msword",
        "xls" => "application/vnd.ms-excel",
        "ppt" => "application/vnd.ms-powerpoint",
        "gif" => "image/gif",
        "png" => "image/png",
        "jpeg"=> "image/jpg",
        "jpg" => "image/jpg",
        "php" => "text/plain"
    );

    if($mime_type==''){
        $file_extension = strtolower(substr(strrchr($file,"."),1));
        if(array_key_exists($file_extension, $known_mime_types)){
            $mime_type=$known_mime_types[$file_extension];
        } else {
            $mime_type="application/x-rar-compressed";
        };
    };

    @ob_end_clean(); //turn off output buffering to decrease cpu usage

    // required for IE, otherwise Content-Disposition may be ignored
    if(ini_get('zlib.output_compression'))
        ini_set('zlib.output_compression', 'Off');

    header('Content-Type: ' . $mime_type);
    header('Content-Disposition: attachment; filename="'.$name.'"');
    header("Content-Transfer-Encoding: binary");
    header('Accept-Ranges: bytes');

    /* The three lines below basically make the download non-cacheable */
    header("Cache-control: no-cache");
    //header('Pragma: private');
    //header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");

    // multipart-download and download resuming support
    if(isset($_SERVER['HTTP_RANGE']))
    {
        list($a, $range) = explode("=",$_SERVER['HTTP_RANGE'],2);
        list($range) = explode(",",$range,2);
        list($range, $range_end) = explode("-", $range);
        $range=intval($range);
        if(!$range_end) {
            $range_end=$size-1;
        } else {
            $range_end=intval($range_end);
        }

        $new_length = $range_end-$range+1;
        header("HTTP/1.1 206 Partial Content");
        header("Content-Length: $new_length");
        header("Content-Range: bytes $range-$range_end/$size");
    } else {
        $new_length=$size;
        header("Content-Length: ".$size);
    }

    /* output the file itself */
    $chunksize = 1*(1024)*(1024); //you may want to change this
    $bytes_send = 0;
    if ($file = fopen($file, 'r'))
    {
        if(isset($_SERVER['HTTP_RANGE']))
            fseek($file, $range);

        while(!feof($file) && (!connection_aborted()) && ($bytes_send<$new_length))
        {
            $buffer = fread($file, $chunksize);
            print($buffer); //echo($buffer); // is also possible
            flush();
            $bytes_send += strlen($buffer);
        }
        fclose($file);
    } else die('Error - can not open file.');
    die();
} // END OUTPUT_FILE function
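One possible culprit for the difference between the two servers: if more than one output buffer is active on the Ubuntu box (for example output_buffering switched on in php.ini plus a buffer opened by other code), the single @ob_end_clean() above only closes the innermost one, and the remaining buffers can end up holding the whole file in RAM. A minimal sketch of closing every buffer level before streaming — the file path here is just a placeholder:

<?php
// Sketch, not the poster's code: close every open output buffer so each chunk
// goes straight to the client instead of accumulating in memory.
while (ob_get_level() > 0) {
    ob_end_clean();
}

$path = '/path/to/some/file.zip';   // placeholder path
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));

$fp = fopen($path, 'rb');
while (!feof($fp) && !connection_aborted()) {
    echo fread($fp, 1024 * 1024);   // 1 MB chunks
    flush();                        // push the chunk out to the client
}
fclose($fp);
?>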
Hi, I have decent knowledge of htaccess and know enough about PHP/MySQL, so a little direction/advice is mostly what I need.

I host a website (site1.com) that has links to files hosted on a different website (site2.com) on a different server. When a file link is clicked from site1.com, it first goes to a local PHP file (within site1.com) that checks that the user is logged in and makes sure the file exists. Then it redirects (using a simple header function) to the file located at site2.com to begin the download. But since my files at site2.com are stored in a plain folder (if you knew the folder name, you could download all the files without having to be logged in), I want a way for site2.com to check that the file request is coming from site1.com (maybe using the PHP referrer?) before it allows the download to proceed. Any ideas?

Hi, I am creating an e-commerce website where users who pay with their credit card are able to download PDF books. How can I make a PDF file only accessible to those users (who pay and validate their credit card) and not to everyone? I want to know the main idea behind securing these files.

Hi, I'm wondering if anyone knows of any good information regarding downloading files. I'm looking into creating a relatively simple download system, where people will be sent a link, say download.php?id=dnjwaAYFDAg734 (where id would be an encrypted value, and obviously a lot longer!). That script would then decode the id and allow the user to download the appropriate file. Any help would be hugely appreciated as I'm a little unsure of where to go, and a quick look through Google doesn't show too much up. Thanks, Edd

I have this code and it downloads files nicely. But I want to show a progress bar while downloading files from an external server. My existing code shows a progress bar, but it's not effective (e.g. if I try to download two media files, one video and one audio, and the video is larger than the audio, the audio finishes first and the progress bar shows 100%, then suddenly drops to 50% while the video is still downloading). In other words it actually shows two progress values and I need one. If I could get the average percentage of the two, that would be better. Any suggestion regarding this will be greatly appreciated.

Code:
<?php
set_time_limit(0);

function define_progress_callback($i)
{
    global $conn;
    curl_setopt($conn[$i], CURLOPT_PROGRESSFUNCTION,
        function ($resource, $download_size, $downloaded, $upload_size, $uploaded) {
            if ($download_size > 0) {
                $progress = round($downloaded / $download_size * 100);
                echo '<script language="javascript">$(".loader").loader("setProgress", '.$progress.');</script>';
                echo str_pad("", 1024, " ");
                flush();
                usleep(20000);
            }
        });
}

$urls = array("http://example.com/file.mp3", "http://example.com/file.mp4");

$save_to = './tmp/';
$conn = array();
$fp = array();

$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $g = $save_to . basename($url);
    $conn[$i] = curl_init($url);
    $fp[$i] = fopen($g, "wb");
    curl_setopt($conn[$i], CURLOPT_SSL_VERIFYPEER, FALSE); // No certificate
    curl_setopt($conn[$i], CURLOPT_FOLLOWLOCATION, TRUE);
    curl_setopt($conn[$i], CURLOPT_FILE, $fp[$i]);
    curl_setopt($conn[$i], CURLOPT_HEADER, 0);
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 60);
    curl_setopt($conn[$i], CURLOPT_MAXCONNECTS, 10);
    curl_setopt($conn[$i], CURLOPT_NOPROGRESS, false);
    define_progress_callback($i);
    curl_multi_add_handle($mh, $conn[$i]);
}

do {
    $n = curl_multi_exec($mh, $active);
} while ($active);

foreach ($urls as $i => $url) {
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
    fclose($fp[$i]);
}
curl_multi_close($mh);
?>

Edited by requinix, 18 January 2015 - 08:16 AM: no code tags meant auto-embedding of mp3
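To get a single combined percentage instead of one per file, the progress callback could record each handle's byte counts and report the ratio of total bytes downloaded to total bytes expected. The sketch below is a drop-in variant of the callback setup above under that assumption; it reuses the same $conn array and the same jQuery loader call as the poster's code, and the overall figure only becomes accurate once every transfer has reported its size.

<?php
// Sketch: one combined progress value across all cURL handles.
$progress_data = array(); // per-handle array('total' => ..., 'done' => ...)

function define_progress_callback($i)
{
    global $conn;
    curl_setopt($conn[$i], CURLOPT_PROGRESSFUNCTION,
        function ($resource, $download_size, $downloaded, $upload_size, $uploaded) use ($i) {
            global $progress_data;
            if ($download_size > 0) {
                $progress_data[$i] = array('total' => $download_size, 'done' => $downloaded);

                // Sum bytes over every file that has reported a size so far.
                $total = 0;
                $done  = 0;
                foreach ($progress_data as $p) {
                    $total += $p['total'];
                    $done  += $p['done'];
                }
                $overall = round($done / $total * 100);

                echo '<script language="javascript">$(".loader").loader("setProgress", '.$overall.');</script>';
                echo str_pad("", 1024, " ");
                flush();
            }
        });
}
?>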
I'm a newbie to PHP, but I want to amend a website I run for a musician friend of mine so that when people purchase his albums (sold as ZIP files) they don't see the link to the actual file; instead PHP generates a randomly named copy, downloads it for them and then deletes the copy when complete. All works well except the download. With the large files, the download gets to about halfway through and then just stalls until it aborts. Here's the code...

Code:
<?php
$path = "../_downloads/"; // change the path to fit your website's document structure
$fullPath = $path.$_GET['file'];

if ($fd = fopen($fullPath, "r")) {
    $fsize = filesize($fullPath);
    $path_parts = pathinfo($fullPath);
    $ext = strtolower($path_parts["extension"]);
    switch ($ext) {
        case "zip":
            header("Content-type: application/zip"); // add here more headers for diff. extensions
            header("Content-Disposition: attachment; filename=\"".$path_parts["basename"]."\""); // use 'attachment' to force a download
            break;
        default:
            header("Content-type: application/octet-stream");
            header("Content-Disposition: filename=\"".$path_parts["basename"]."\"");
    }
    header("Content-length: $fsize");
    header("Cache-control: private"); // use this to open files directly
    stream_set_timeout($fd, 600);
    while(!feof($fd)) {
        $buffer = fgets($fd, 8192);
        echo $buffer;
    }
}
$info = stream_get_meta_data($fd);
fclose($fd);
if ($info['timed_out']) {
    echo 'Connection timed out!';
} else {
    echo 'Done';
}
exit;
// example: place this kind of link into the document where the file download is offered:
// <a href="download.php?file=some_file.pdf">Download here</a>
?>

The link to run it is below...

www.godfreyb.com/_scripts/download.php?file=cd12_vg.zip

I've tried the @readfile() function and it does the same thing: it gets to about 60.5 MB and stalls. I even tried passthru('cat ' . $fullPath); to send the file, and that doesn't send anything at all. The site is hosted by Fasthosts on a Windows 2008 server with PHP5 installed. Directly linking to the ZIP file results in a perfect download every time. Can anyone suggest a way to get this working? It's driving me nuts!

Hi guys, I have an array of data with customers and their pics, something like:

array (customerid => customerpic, ...);

for example:

123412 => 'https://mylink.com/mypic343545.tiff'
433453 => 'https://mylink.com/mypic3434345.tiff'

My goal is to:
1. go through the array and download all the pics, with names like customerid.tiff
2. when done renaming, make a zip file ON-THE-FLY
3. download the zip file to my PC.

I know I can use file_put_contents and things like that, but I am not able to really connect the dots, especially the zipping of them all on the fly... does anyone have an idea on this?
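For the "zip on the fly" part, the usual approach is to fetch each image, add it to a ZipArchive, and then send the finished archive to the browser. A small sketch under assumptions: the $pics array mirrors the example above, the ZipArchive extension is available, and the URLs plus the output filename are made up.

<?php
// Sketch of the "zip on the fly" idea from the question above.
$pics = array(
    123412 => 'https://mylink.com/mypic343545.tiff',
    433453 => 'https://mylink.com/mypic3434345.tiff',
);

$zip_path = tempnam(sys_get_temp_dir(), 'pics');  // temporary zip file on the server
$zip = new ZipArchive();
$zip->open($zip_path, ZipArchive::OVERWRITE);

foreach ($pics as $customer_id => $url) {
    $image = file_get_contents($url);              // download the remote picture
    if ($image !== false) {
        $zip->addFromString($customer_id . '.tiff', $image); // store as customerid.tiff
    }
}
$zip->close();

// Push the finished archive to the browser.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="customer_pics.zip"');
header('Content-Length: ' . filesize($zip_path));
readfile($zip_path);
unlink($zip_path);
exit;
?>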
I need to count the number of image files on a remote server that is not on my network. I've come across a suggestion to use @getimagesize($img_url). The above works but is really slow. Anyone able to suggest a better method?

Error:
No error. I'm trying to make it download, but it won't download...

Code:
<table border="2" cellpadding="2" cellspacing="2" width="100%">
<tr>
<td>ID</td><td>NAME</td><td>TYPE</td><td>SIZE</td><td>DATE ADDED</td><td>DOWNLOAD</td>
</tr>
<?php
// properties
$result = mysql_query("SELECT * FROM uploaded");
while($r = mysql_fetch_array($result)) {
    $link = $r["link"];
    $name = $r["name"];
    $type = $r["type"];
    $size = $r["size"];
    $date = $r["date"];
    $id   = $r["id"];
    echo "
    <tr>
    <td>$id</td><td>$name</td><td>$type</td><td>$size</td><td>$date</td><td><a href='$link'>Download $name</a></td>
    </tr>
    ";
}
?>
</table>

Hello, I have a PHP page from which anyone can download MP3 songs. My database table contains the song title and the directory link of every song. The page retrieves the song titles with links that people can click to download. But if you click, the MP3 starts playing in a new window; if you right-click and "Save As", then it downloads. I want people to click and have the download start, or at least get a prompt to save the file to their PC. I have the following code, please help me... thanks

Code:
<?PHP
$artist = $_GET['artist'];
$album  = $_GET['album'];
$query  = ("select * from Band where artist='$artist' and album='$album'");
$result = mysql_query($query);
while ($row = mysql_fetch_array($result)) {
?>
<a href="<?php echo $row["link"]; ?>"><strong><?php echo $row["title"]; ?></strong></a>
<?PHP
}
?>

How can I make it so that clicking the title starts the download? Please, I need help.

Hello, I'm trying to make a PHP script that, when run, updates an image folder based on a Photobucket feed. Here is my code so far.

Code:
<?php
function download_feed($path) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $path);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $retValue = curl_exec($ch);
    curl_close($ch);
    return $retValue;
}

$sXML = download_feed('http://feed242.photobucket.com/albums/ff185/xivanari/account.rss');
$oXML = new SimpleXMLElement($sXML);

foreach($oXML->channel->item as $oDocuments) {
    $fileName   = $oDocuments->title;
    $fileTmpLoc = $oDocuments->link;
    if(!file_exists("album1/".$fileName."/")) {
        if(!mkdir("album1/".$fileName))
            $upload_location = "album1/".$fileName;
        $moveResult = move_uploaded_file($fileTmpLoc, $upload_location);
    }
}
?>
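One thing worth noting about the feed script above: move_uploaded_file() only works on files that arrived via an HTTP POST upload, so it cannot fetch a remote feed item. Something like copy() (or cURL) is needed to pull each linked file down. A small sketch under assumptions: $oXML is the SimpleXMLElement built above, allow_url_fopen is enabled, and whether the feed's <link> points at the raw image file rather than an HTML page would need checking against the real feed.

<?php
// Sketch of saving each feed item into a local folder.
foreach ($oXML->channel->item as $oDocuments) {
    $fileName = (string) $oDocuments->title;
    $fileUrl  = (string) $oDocuments->link;

    if (!is_dir("album1")) {
        mkdir("album1", 0755, true);            // make sure the target folder exists
    }

    $localPath = "album1/" . basename($fileName);
    if (!file_exists($localPath)) {
        copy($fileUrl, $localPath);             // streams the remote file to disk (needs allow_url_fopen)
    }
}
?>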
Alrighty, I have posted about this before, but for a very different question. So different, I figured it merited a different post. Anyways... I have a PHP script that downloads a webpage's HTML code, then shows that code as its own on another server. I got it to work great, but it does not download/show the images and CSS. Using <base href> works great for the links, but the images are an issue, along with the CSS.

Here's what I have got so far (any scripts or other stuff is greatly appreciated):

Code:
<?php
// download the webpage
$contents = file_get_contents($url);

// we have downloaded the html from the page,
// but not the external files such as css, and images
// so now we add the images and css

// start by splitting it into an easy to manage array
$contentchunks = explode("\n", $contents);

// mess with the html we downloaded
foreach($contentchunks as $varname => $varval){
    // varval is the single line of html
    // go through every line of the html code and replace it to my heart's content
    if(stristr($varval, 'src=') !== FALSE) { // check to see if it has what im looking for
        // get the url to download
        // download and insert the image
    }
}

// put the modified array back together
$contents = implode("\n", $contentchunks);

// view the webpage
echo $contents;
?>

I have a demonstration of what it does over at http://dageek247.tk/viewit/

Hi, I want users to be able to read my article but not download it — neither with "Save Page As" nor with "View Source", not even by selecting text and copying and pasting. Is there any solution in PHP to do that? Please help. Thanks

Hello, I've got a script that downloads a file to my server. I want to be able to download a file from a site such as FileServe using my premium details. I've tried http://username:password@fileserve.com/file.rar but that doesn't work; it just downloads a tiny 30 KB file. Is there a way I can set a cookie or something so my script has access to download the file using my premium account? Thanks in advance

Hello, I'm trying to use PHP to have a user upload a file and then have my site create a link — I'm ALMOST there. My last hitch is in creating the link. From the code below,

Code:
echo "this is the current filename: " . $supplemental_file;
echo '<a href=../../supplemental_files/' . $supplemental_file . '">Supplementary Download</a>';

I get:

this is the current filename: Complaint.doc

But when I hover over the link, I get:

http://localhost:8888/algebra_book/supplemental_files/Complaint.doc%22

Any thoughts on the extra %22? Thanks so much....

Hello there, I want to download a file from the server to the computer. How can I do this? I found this code:

Code:
header("Content-type: image/jpg");
header("Content-Disposition: attachment; filename=test.jpg");

The problem is, what I get is the file the code was executed from, saved in .jpg format. How can I point to a specified path, select a certain file and then download it? For example, let's say I have a folder that contains two files: download.php and image.jpg. How can I download image.jpg from the server with the Save As window? Thank you

I wonder if some kind-hearted person can help me? I am trying to download a video file which is a WMV. The protocol is either mms:// or http://. I can see the video in my browser but I cannot download it using curl or wget in a program, and I need to download a few of these. They are publicly viewable, so they are not private stuff. The URL is http://210.150.12.140/vod11/tepco/other/1111_01.wmv?MSWMExt=.asf or mms://wmt.stream.co.jp/vod11/tepco/other/1111_01.wmv. All I get at the moment is a text file of less than 1 KB and not the movie itself. How would I download this file please? Thanks, Paul
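A guess about the WMV case, offered only as a hedged sketch: the tiny text file may be an ASX-style redirector pointing at the mms:// stream, which curl and wget do not speak, and some streaming hosts also refuse clients that don't look like a browser or media player. A first diagnostic step could be to fetch the http:// URL with a browser-like User-Agent and inspect what actually comes back; the User-Agent string below is an arbitrary example and nothing here is confirmed to work for that particular host.

<?php
// Sketch: fetch the http:// URL and check whether the response is real ASF/WMV
// data or a small text redirector. For a full-size download, CURLOPT_FILE would
// be preferable to holding the whole response in memory.
$url = 'http://210.150.12.140/vod11/tepco/other/1111_01.wmv?MSWMExt=.asf';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36');
$data = curl_exec($ch);
curl_close($ch);

if (substr($data, 0, 4) === "\x30\x26\xB2\x75") {
    // ASF files start with this header GUID; looks like real video data.
    file_put_contents('./1111_01.wmv', $data);
    echo "Saved " . strlen($data) . " bytes of video\n";
} else {
    // Probably a text redirector or error page; print it to see where it points.
    echo substr($data, 0, 500) . "\n";
}
?>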
Hi, I'm trying to figure out the best way to determine if a user is currently downloading a file from my website. The way my site works is that the user waits 30 seconds and views an ad; after the timer is up, the download becomes available. I heard that I could probably use a timestamp or something of the sort. I've read up on it but I'm not sure how I could go about this.
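One hedged way to track "currently downloading": record a row when the download script starts streaming and refresh a last-activity timestamp inside the output loop; any user whose timestamp is newer than, say, 30 seconds can be treated as mid-download. In the sketch below the table name active_downloads, its columns, the credentials and the file path are all invented for illustration — a session entry or Redis key would work the same way.

<?php
// Sketch of marking an in-progress download with a timestamp.
session_start();
ignore_user_abort(true);          // keep running long enough to record the abort

$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');
$userId = $_SESSION['user_id'];
$file   = '/path/to/file.zip';    // placeholder path

$pdo->prepare("REPLACE INTO active_downloads (user_id, started_at, last_seen) VALUES (?, NOW(), NOW())")
    ->execute([$userId]);

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

$fp = fopen($file, 'rb');
while (!feof($fp) && !connection_aborted()) {
    echo fread($fp, 1024 * 1024);
    flush();
    // Touch the timestamp so "last_seen within 30 seconds" means "still downloading".
    $pdo->prepare("UPDATE active_downloads SET last_seen = NOW() WHERE user_id = ?")
        ->execute([$userId]);
}
fclose($fp);

// Download finished or aborted: remove the marker row.
$pdo->prepare("DELETE FROM active_downloads WHERE user_id = ?")->execute([$userId]);
?>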
Hello, I want the user to get authenticated before the file download starts.
Here is my code:
<?php
if (!isset($_SERVER['PHP_AUTH_USER'])) {
    header('WWW-Authenticate: Basic realm="My Realm"');
    header('HTTP/1.0 401 Unauthorized');
    echo 'Your request is cancelled';
    exit;
} else {
    // check $_SERVER['PHP_AUTH_USER'] and $_SERVER['PHP_AUTH_PW']
    if (isset($valid)) {
        // start download
        $path = './data/negative_seq_60.txt';
        $type = "text/plain";
        header("Expires: 0");
        header("Pragma: no-cache");
        header('Cache-Control: no-store, no-cache, must-revalidate');
        header('Cache-Control: pre-check=0, post-check=0, max-age=0');
        header("Content-Description: File Transfer");
        header("Content-Type: " . $type);
        header("Content-Length: " . (string)(filesize($path)));
        header('Content-Disposition: attachment; filename="'.basename($path).'"');
        header("Content-Transfer-Encoding: binary\n");
        readfile($path); // outputs the content of the file
        exit();
    } else {
        // show error
    }
}
?>

But on clicking the download link, I am getting the following error: "Undefined variable: valid at line no 9". Please help.

I have dozens of tables that a client needs to download through the web. I can use PHP to convert the tables within the database to CSV, however I have to split the files into 10k-result chunks, otherwise the server runs out of memory. I already raised the memory limit within the download script (ini_set('memory_limit', '512M')), which makes this possible. Downloading these files piece by piece is really time consuming. Is there a memory-safe and efficient way to combine all my tables as CSVs and zip them into one master download? phpMyAdmin seems to do this smoothly; however, I need an online interface for the client to export the data directly...
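For the "all tables into one ZIP" export, a memory-safe pattern is to stream each table row by row with an unbuffered query, write the rows with fputcsv into a temporary file, add that file to a ZipArchive, and only then send the archive. The sketch below assumes mysqli and ZipArchive are available; the credentials, database name and output filename are placeholders.

<?php
// Sketch of a memory-safe export of every table to CSV inside one ZIP.
// MYSQLI_USE_RESULT streams rows from MySQL instead of buffering whole tables in PHP.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$zipPath = tempnam(sys_get_temp_dir(), 'export');
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::OVERWRITE);

$csvFiles = array();
$tables = $db->query('SHOW TABLES');
while ($t = $tables->fetch_row()) {
    $table = $t[0];

    // Write this table's rows to a temporary CSV, one row at a time.
    $csvPath = tempnam(sys_get_temp_dir(), 'csv');
    $csv = fopen($csvPath, 'w');

    $result = $db->query("SELECT * FROM `$table`", MYSQLI_USE_RESULT); // unbuffered
    $headerWritten = false;
    while ($row = $result->fetch_assoc()) {
        if (!$headerWritten) {
            fputcsv($csv, array_keys($row)); // column names as the first line
            $headerWritten = true;
        }
        fputcsv($csv, $row);
    }
    $result->free(); // must be freed before the next query on this connection
    fclose($csv);

    $zip->addFile($csvPath, $table . '.csv');
    $csvFiles[] = $csvPath;
}
$zip->close(); // the temp CSVs are read into the archive here

foreach ($csvFiles as $csvPath) {
    unlink($csvPath);
}

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="full_export.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
unlink($zipPath);
?>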