PHP - Download With Speed Limit And Resume
I am currently using PHP 5.3.2 on IIS 7, and I am wondering how I can write a download script with resume support and a speed limit. After looking around for a while, I found this script, which handles the speed limit:
Code:
<?php
$local_file = 'file.zip';
$download_file = 'name.zip';

// set the download rate limit (=> 20.5 KB/s)
$download_rate = 20.5;

if (file_exists($local_file) && is_file($local_file)) {
    header('Cache-control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($local_file));
    header('Content-Disposition: filename=' . $download_file);
    flush();

    $file = fopen($local_file, "r");
    while (!feof($file)) {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    fclose($file);
} else {
    die('Error: The file ' . $local_file . ' does not exist!');
}
?>

There is a big problem, though: the downloaded file arrives without an extension. For example, if the original file is file.ext, the client receives it simply as file, with no extension at all. I just want to use octet-stream, since I have no interest in streaming or anything else, but I really want the client to get the extension too. So could you please tell me what is wrong with the above script, and also show me a way to implement resumable downloads? Thank you very much for your help.
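The likely culprit is the Content-Disposition header: without the attachment disposition and a (preferably quoted) filename, many browsers ignore the suggested name and the extension is lost. For resuming, the script would also need to honor the client's Range request header. Below is a minimal sketch combining both fixes; the Range parsing is deliberately simplified and only handles a single bytes=N- range, which is what download managers typically send when resuming:

<?php
$local_file    = 'file.zip';
$download_file = 'name.zip';
$download_rate = 20.5; // KB per second

if (!file_exists($local_file) || !is_file($local_file)) {
    die('Error: The file ' . $local_file . ' does not exist!');
}

$size  = filesize($local_file);
$start = 0;

// Resume support: honor a simple "Range: bytes=N-" request header.
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m) &&
    (int)$m[1] < $size) {
    $start = (int)$m[1];
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes ' . $start . '-' . ($size - 1) . '/' . $size);
}

header('Cache-Control: private');
header('Content-Type: application/octet-stream');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($size - $start));
// "attachment" plus a quoted filename is what makes the browser save
// the file as name.zip, extension included.
header('Content-Disposition: attachment; filename="' . $download_file . '"');

$file = fopen($local_file, 'rb');
fseek($file, $start);
while (!feof($file)) {
    print fread($file, (int)round($download_rate * 1024));
    flush();
    sleep(1); // crude rate limiting: one chunk per second
}
fclose($file);
?>

The Content-Disposition line alone fixes the missing extension; the rest is only needed for resuming.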
Similar Tutorials

Hi Chaps, I'm using readfile to force the download of a file:

Code:
set_time_limit(0);
$file = 'monkey.gif';
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
flush();

And this works fine. However, I do have some software installation files that could be downloaded, and these are in excess of 280 MB. I have checked php.ini:

memory_limit = 128M
post_max_size = 300M

But Internet Explorer hangs and then crashes. Is there a way to allow big files to download using this method, or is there another way of forcing the download without PHP "reading" the whole file first? I'm guessing that the problem lies with memory_limit being smaller than the file size. Is it a good idea to increase memory_limit to e.g. 280 MB? Cheers
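Raising memory_limit to match the largest file shouldn't be necessary: readfile() itself streams in chunks, but an active output buffer can still accumulate the whole file in memory before anything reaches the client. One common workaround is to disable output buffering and send the file in small pieces, so memory use stays constant no matter how large the file is. A rough sketch (the 8 KB chunk size is arbitrary):

function send_file_chunked($path, $chunk_size = 8192) {
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    // Turn off any active output buffers so chunks go straight out.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    while (!feof($handle)) {
        echo fread($handle, $chunk_size); // send one small chunk
        flush();                          // push it to the client now
    }
    return fclose($handle);
}

// Used in place of the ob_clean()/flush()/readfile() lines above:
// send_file_chunked($file);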
I have searched this forum and the web, but I haven't been able to find a good answer. I'm interested in the best way (especially a working code snippet, hopefully!) to make a script stop when a limit (number of rows, time limit, or whatever) is reached, call itself and pass some variables to itself via sessions/cookies or $_REQUEST, and then resume where it left off the last time. Does anybody know a way to do this that works reliably in most (all?) environments? Any code that uses header(), output buffering, AJAX, or anything else, that is proven to work?

What I want to do is make a generic script that accepts a (very large) local flat file as a data source, processes the contents, and outputs the results either to another local flat file or to another application using the latter's routines. The result isn't necessarily going to be imported into MySQL (so no, LOAD DATA INFILE isn't applicable here). An example I can think of is processing a CSV file (say, about 100,000 records), converting it on the fly and importing the results into WordPress by calling WP's functions to do the actual import. Again, this is just an example. I can write all the input, processing and output functions; it's just the best way to restart and resume the script that I'm asking about. BigDump is another script that does this, but it seems to use AJAX to restart itself. I wouldn't mind using AJAX/jQuery/whatever; in that case, I'd just need some working drop-in code, since my JavaScript/AJAX knowledge isn't very good, to say the least. Nevertheless, I'd prefer pure PHP, just in case the user has JavaScript disabled. Sorry for the long post and thanks in advance!

Unless buffer overflows, or breaking out of code to perform a new command, are problems that have been solved... I am trying to figure out the proper PHP method for setting a boundary on a variable within a script. I have this variable $name which is fed a value from $_POST['name'] from a form field. Now, this form field is limited in the HTML to accept only 20 characters, but someone could easily edit the form or the outgoing POST data. So I want to know how to limit the variable size in the script. In other languages it could be something like var name(20). So how do I do that in PHP?
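PHP has no fixed-size variable declarations, so the server-side boundary is enforced when the input is read: either reject over-long values or truncate them. A small sketch mirroring the 20-character form limit (the mb_* functions assume the mbstring extension; plain strlen/substr are fine for single-byte input):

$name = isset($_POST['name']) ? (string)$_POST['name'] : '';

if (mb_strlen($name, 'UTF-8') > 20) {
    // Either reject the tampered request outright...
    die('Error: name must be 20 characters or fewer.');
}

// ...or silently truncate instead of rejecting:
// $name = mb_substr($name, 0, 20, 'UTF-8');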
Basically, I would like to place a link on my website and have the user download a file; but rather than just right-clicking and choosing Save Target As, the link must be clicked, and on the next page the file is fetched and the client can download it. How would I go about setting this up, please?

I have this PHP page that, based on what the user chooses, shows them the appropriate photo galleria, and I was wondering if there was any way I could speed this up. This is the PHP code for selecting the galleria; I'm not showing the JavaScript that does the slideshow, because it's the Galleriffic jQuery plugin.

Code:
<div id="Info">
<div id="page">
<div id="container">
<?php
$g = mysql_real_escape_string($_GET['g']);
$query = "SELECT * FROM pinkpanther_games WHERE Gallery_no = '$g'";
$results = mysql_query($query) or die("Query failed ($query) - " . mysql_error());
$row = mysql_fetch_assoc($results);
echo "<h1>" . $row['Day_Played'] . " vs " . $row['Opponent'] . "</h1>";
?>
<!-- Start Advanced Gallery Html Containers -->
<div id="gallery" class="content">
    <div id="controls" class="controls"></div>
    <div class="slideshow-container">
        <div id="loading" class="loader"></div>
        <div id="slideshow" class="slideshow"></div>
    </div>
    <div id="caption" class="caption-container"></div>
</div>
<div id="thumbs" class="navigation">
    <ul class="thumbs noscript">
<?php
$x = $row['no_Pics'];
$y = 1;
$year = $row['Year_Played'];
$day = $row['Day_Played'];
$scheck = $row['Sessions'];
if ($scheck == 1) {
    $sessions = "Session1";
} else {
    $sessions = "Session2";
}
if ($x == 0) {
    echo "<li> <a class='thumb' href='../images/nopics.jpg' title=''><img src='../images/nopicsthumb.jpg ' /></a></li>";
} else if ($x == 10000) {
    echo "<li> <a class='thumb' href='../images/coming.jpg' title=''><img src='../images/comingthumb.jpg ' /></a></li>";
} else {
    while ($y <= $x) {
        echo "<li> <a class='thumb' href='../images/Sections/pinkpanthers/" . $year . "/" . $sessions . "/" . $day . "/" . $y . ".jpg' title=''><img src='../images/Sections/pinkpanthers/" . $year . "/" . $sessions . "/" . $day . "/thumbs/" . $y . ".jpg ' /></a><div class='caption'><div class='download'><a href='../images/Sections/pinkpanthers/" . $year . "/" . $sessions . "/" . $day . "/big/" . $y . ".jpg ' target='_blank' />Download</a></div></div></li>";
        $y++;
    }
}
?>
    </ul>
</div>
<div style="clear: both;"></div>
</div>
</div>
<div id="footer"></div>

I'm going to use a large array of arrays, each of them having a lot of values and some sub-arrays. My question is: is it faster to use arrays, or is it better to have an object and access everything through methods? I suppose objects are slower... Also, I was planning to use arrays with string keys in nearly all places. Normally these are slower, but in PHP hashes and arrays are the same type, so I don't know...

Hi everyone, I was discussing this topic with one of my friends and both of us can't give a real answer to it. Example:

Code:
class test {
    function a() {
        $obj = new DB_TableObject();
        // bla bla bla
        $this->b($obj);
    }
    function b($object) {
        $object->getResults();
    }
}

Code:
class test {
    function a() {
        $obj = new DB_TableObject();
        // bla bla bla
        $this->b($obj);
    }
    function b() {
        $obj = new DB_TableObject();
        $obj->getResults();
    }
}

Which one is better? The first or the second solution?

This is pretty decent, right?

Code:
mySQL_array SPEED TEST - [ 30,000 Numbers ] - SERIALIZED with BASE64 ENCODING AND GZ COMPRESSION
52.8KiB WRITE ARRAY in: 0.15 seconds.
52.8KiB READ ARRAY in: 0.03 seconds.
52.8KiB ROUND TRIP in: 0.18 seconds.
(approx: 3.41 milliseconds per KiB)

--- running on FIREFOX + EASY PHP 5.3.8.0 - ON ASUS G51VX - 2GHz dual core (x64) - 4GB RAM ---

When pulling EXIF data from an image, I have pretty much everything nailed down except for the shutter speed value. When pulled from the image, it is returned as:

'Shutter' => string '0.0015625'

Now, I know that the actual shutter speed at the time of shooting was 1/640 sec; does anyone have any ideas about how one would convert '0.0015625' to 1/640 sec? I have been scratching my head over this problem for a few days now and am still no wiser.
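For the shutter speed question just above: 0.0015625 is the exposure time in seconds, and 1/0.0015625 = 640, so the familiar photographic fraction is just the rounded reciprocal. A small sketch:

function shutterFraction($seconds) {
    $seconds = (float)$seconds;
    if ($seconds <= 0) {
        return 'unknown';
    }
    // Exposures of a second or longer read better as plain seconds.
    if ($seconds >= 1) {
        return round($seconds, 1) . ' sec';
    }
    return '1/' . round(1 / $seconds) . ' sec';
}

echo shutterFraction('0.0015625'); // prints: 1/640 sec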
Complete beginner here, so just really looking for pointers on where to start. I've been doing a bit of bug fixing on one of our sites, because the past 3 programmers we've hired have disappeared on us. The big issue I'm looking to solve is the load time of a search. Our website (Love-Rugs) is taking about 6 or 7 seconds to perform a blank search (basically a quick browse), whereas our other site (Little-Persia) takes about a second. It wouldn't be so bad if it was just the initial search, but going from one page (only 10 products listed per page) to the next takes the same amount of time. There seem to be an awful lot of queries (around 130-150) on the searches; however, when using some of the search options (e.g. type and fabric) to refine the search, the queries stay high but the time to process the results is reduced significantly. I don't really understand why the search time is much lower if the queries are still high, unless it's to do with the number of results returned. And that does not explain why Little-Persia (which has far more products on it) takes less time. I realize that without code this isn't easy to answer, so I'm just looking to see if someone can point me in the right direction for now.

Are there any codes or PHP scripts that will calculate the user's internet speed?

I've been looking into methods of scraping data from pages and have found several examples of using multi-curl to achieve this. But I am not used to curl, am not completely sure how it works, and I need to find the fastest reliable method of getting the content of a number of pages (about 160; I do need all, or close to all, pages on every run). Here is an example I got from searching the web, which I managed to implement:

Code:
<?php
/**
 * @param $picsArr Array of items, each with a ['url'] entry; on return,
 *        each item is filled with the downloaded image data, which you can
 *        use as you want or just save in the next step.
 */
function getAllPics(&$picsArr) {
    if (count($picsArr) <= 0) return false;

    $hArr = array(); // handle array
    foreach ($picsArr as $k => $pic) {
        $h = curl_init();
        curl_setopt($h, CURLOPT_URL, $pic['url']);
        curl_setopt($h, CURLOPT_HEADER, 0);
        curl_setopt($h, CURLOPT_RETURNTRANSFER, 1); // return the image data
        array_push($hArr, $h);
    }

    $mh = curl_multi_init();
    foreach ($hArr as $k => $h) {
        curl_multi_add_handle($mh, $h);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get the results and save them in the result array
    foreach ($hArr as $k => $h) {
        $picsArr[$k]['data'] = curl_multi_getcontent($h);
    }

    // close all the connections
    foreach ($hArr as $k => $h) {
        $info = curl_getinfo($h);
        preg_match("/^image\/(.*)$/", $info['content_type'], $matches);
        echo $tail = $matches[1];
        curl_multi_remove_handle($mh, $h);
    }
    curl_multi_close($mh);
    return true;
}
?>

Since time is critical in my script, I would like to ask if you think this is a good implementation, or if you can point me in the direction of one that will save me noticeable run-time.
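One detail worth flagging in the example above: the do/while loop spins through curl_multi_exec as fast as it can, burning CPU the whole time the 160 transfers are in flight. A common refinement is to block on curl_multi_select until a socket actually has activity. A sketch of a drop-in replacement for just that loop:

$running = null;
do {
    curl_multi_exec($mh, $running);
    if ($running > 0) {
        // Sleep (up to 1 second) until at least one transfer is ready,
        // instead of polling in a tight loop.
        curl_multi_select($mh, 1.0);
    }
} while ($running > 0);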
Since I got my websocket PHP server running nicely with my MySQL database, I can now have some fun. Attack speed is very simple, but I need your help with the unix timestamp.
For example, there is a field called "last_attack", and each time a user attacks a mob and a skill is performed, it is updated with time(). Then I disable the attack button for 2 seconds client-side, but I also check that value against time() server-side as well. Now, let's say the user's attack speed is 1.30%: I want to make that attack-speed check dynamic, so it should now check whether the attack was less than 1.7 seconds ago instead of 2 seconds. How do I split up the unix timestamp to work with percentages?
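time() only has one-second resolution, so a 1.7-second cooldown can't be checked reliably with it; microtime(true) returns a float timestamp that makes fractional cooldowns trivial. A sketch, assuming last_attack stores a microtime value and that an attack speed of 1.30 means 0.3 seconds shaved off the 2-second base (adjust the formula to however the stat is actually defined in your game):

$base_cooldown = 2.0;  // seconds, at attack speed 1.00
$attack_speed  = 1.30; // the player's stat, loaded from MySQL

// 1.30 -> 2.0 - 0.30 = 1.7 seconds between attacks.
$cooldown = $base_cooldown - ($attack_speed - 1.0);

$now = microtime(true);
if ($now - $last_attack < $cooldown) {
    // Too soon: ignore the attack (the client may have been tampered with).
} else {
    $last_attack = $now; // persist back to the "last_attack" column
    // ...perform the attack...
}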
Is there any PHP function I can use to test how fast my server is performing, i.e. how fast it is carrying out a certain action, so I can see its resource usage?

I am writing an OO PHP app to parse thousands of frames from a film, which will then each be manipulated and spat out using GD library functions. Speed is of the essence. To improve speed, I have considered:
- Another language!?
- Use of the PHP command line interface? (Working with a web browser is surely a slow affair?)
If PHP was a standard R6 Yamaha available to Joe Public, what would I have to do to win a GP race - you know - change gearing, alter timing, change the exhaust system, rip out the air filter. What does the panel suggest? Thanks, B

Hello dear community, I am currently working on an approach to parse some sites that contain data on foundations in Switzerland, with some details like goals, contact e-mail and the like. See http://www.foundationfinder.ch/ which has a dataset of 790 foundations. All the data are free to use, with no copyright restrictions. I have tried it with PHP Simple HTML DOM Parser, but I have seen that it is difficult to get all the data needed to get it up and running. Who wants to jump in and help create this scraper/parser? I'd love to hear from you. Please help me get up to speed with this approach. Regards, Dilbertone

Okay, I have a function that stores data into an array. The function takes about 7 seconds to run, and through a number of different loops it creates one final array with about 15,000 keys. I want to recall this data a number of times in different functions, but how can I make it easily accessible without running the function each time?

Code:
function theFunction() {
    $string = array();
    for ($x = 0; $x <= 15000; $x++) {
        // Run the loop and store data:
        // $string[$x] = output from other loops and calculations
    }
    return $string;
}

// Then later on, if I want to recall the data, the only way I know how is:
$newstring = theFunction();

The only problem I have with this is that it has to re-run the function every time in order to get to the data it spits out. How can I store this data in another array outside of the function without having to re-run it? I hope this makes sense. Thanks.
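One way to avoid recomputing is to cache the result inside the function with a static variable: the expensive loops run on the first call, and every later call returns the stored array immediately. A sketch (the placeholder calculation stands in for the real loops):

function theFunction() {
    static $cache = null;    // survives between calls
    if ($cache !== null) {
        return $cache;       // already built: ~7 seconds saved
    }
    $cache = array();
    for ($x = 0; $x <= 15000; $x++) {
        $cache[$x] = $x * 2; // placeholder for the real calculations
    }
    return $cache;
}

$first  = theFunction(); // slow: builds the array
$second = theFunction(); // fast: returns the cached copy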
I have made a PHP program that downloads an Inno Setup installation file for installing a program. However, if for one reason or another I want to download the same Inno Setup installation file again, the previous file will still be in the Download folder. Each of the downloads gets a number in parentheses: setup(1), setup(2), setup(3), etc. I wondered if it is possible to erase the previous file in the same process as downloading the new one, so that however many downloads I do, there will only ever be one occurrence of this file in the Download folder. The download code is as follows:

Code:
$exe = "Inno script/Test_setup.exe";
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"Test_setup.exe\"");
header("Content-Length: " . filesize($exe));
readfile($exe);

Thanks in advance.
Sincerely
I have a table called "playlists" and a table called "musics". In the musics table there is a column playlist_id, which references the playlist that each music belongs to.
I'm using API calls to display information on the site with JavaScript, so I need to return JSON.
I need the JSON to have the following structure:

[
    Playlists: [
        {
            Name: "etc",
            musics: [
                { name: "teste.mp3" },
                { name: "test2.mp3" }
            ]
        },
        ...
    ]
]

And this is my code:

Code:
$query = $con->prepare("SELECT * FROM playlists WHERE user_id = :id");
Hello, I am trying to measure the website speed of some websites, but I can only read "domain.com" itself; I can't read the site's files like CSS and JS. Why? I use proxies for this job.
Here is the PHP code:
Code:
$options = array(
    'useragent'      => "Firefox (+http://www.firefox.org)", // who am i
    'connecttimeout' => 120, // timeout on connect
    'timeout'        => 120, // timeout on response
    'redirect'       => 10,  // stop after 10 redirects
    'referer'        => "http://www.google.com",
    'proxyhost'      => '85.25.8.14:80'
);
$response = http_get(
    "http://solve-ict.com/wp-content/themes/ict%20theme/js/jquery-1.7.1.min.js",
    $options,
    $info
);

It works fine with http://domain.com/, but with CSS or JS files it gives 404 (using some of the free proxy servers available). Thanks.
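One thing to check, assuming this is pecl_http 1.x: its request options take the proxy host and port separately (proxyhost and proxyport), so packing '85.25.8.14:80' into proxyhost alone may be misread. It is also worth confirming that the free proxy actually forwards deep paths at all; many only serve front pages and return 404 for everything else. A sketch of the adjusted options:

$options = array(
    'useragent'      => "Firefox (+http://www.firefox.org)",
    'connecttimeout' => 120,
    'timeout'        => 120,
    'redirect'       => 10,
    'referer'        => "http://www.google.com",
    'proxyhost'      => '85.25.8.14', // host only...
    'proxyport'      => 80,           // ...port as its own option
);
$response = http_get(
    "http://solve-ict.com/wp-content/themes/ict%20theme/js/jquery-1.7.1.min.js",
    $options,
    $info
);
// $info['response_code'] shows what the proxy actually returned,
// which helps tell a proxy-side 404 from a real one on the target site.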