PHP - How To Speed This Up
I have this PHP page that, based on what the user chooses, shows them the appropriate photo galleria, and I was wondering if there is any way I could speed this up. This is the PHP code for selecting the galleria; I'm not showing the JavaScript that does the slideshow because it's the Galleriffic jQuery plugin.
Code: [Select]
<div id="Info">
<div id="page">
<div id="container">
<?php
$g = mysql_real_escape_string($_GET['g']);
$query = "SELECT * FROM pinkpanther_games WHERE Gallery_no = '$g'";
$results = mysql_query($query) or die("Query failed ($query) - " . mysql_error());
$row = mysql_fetch_assoc($results);
echo "<h1>" . $row['Day_Played'] . " vs " . $row['Opponent'] . "</h1>";
?>
<!-- Start Advanced Gallery Html Containers -->
<div id="gallery" class="content">
    <div id="controls" class="controls"></div>
    <div class="slideshow-container">
        <div id="loading" class="loader"></div>
        <div id="slideshow" class="slideshow"></div>
    </div>
    <div id="caption" class="caption-container"></div>
</div>
<div id="thumbs" class="navigation">
    <ul class="thumbs noscript">
<?php
$x = $row['no_Pics'];
$y = 1;
$year = $row['Year_Played'];
$day = $row['Day_Played'];
$scheck = $row['Sessions'];
if ($scheck == 1) {
    $sessions = "Session1";
} else {
    $sessions = "Session2";
}
if ($x == 0) {
    echo "<li><a class='thumb' href='../images/nopics.jpg' title=''><img src='../images/nopicsthumb.jpg' /></a></li>";
} else if ($x == 10000) {
    echo "<li><a class='thumb' href='../images/coming.jpg' title=''><img src='../images/comingthumb.jpg' /></a></li>";
} else {
    while ($y <= $x) {
        echo "<li><a class='thumb' href='../images/Sections/pinkpanthers/" . $year . "/" . $sessions . "/" . $day . "/" . $y . ".jpg' title=''><img src='../images/Sections/pinkpanthers/" . $year . "/" . $sessions . "/" . $day . "/thumbs/" . $y . ".jpg' /></a><div class='caption'><div class='download'><a href='../images/Sections/pinkpanthers/" . $year . "/" . $sessions . "/" . $day . "/big/" . $y . ".jpg' target='_blank'>Download</a></div></div></li>";
        $y++;
    }
}
?>
    </ul>
</div>
<div style="clear: both;"></div>
</div>
</div>
<div id="footer"></div>

Similar Tutorials

I'm going to use a large array of arrays, each one having a lot of values and some sub-arrays. My question is: is it faster to use plain arrays, or is it better to have an object and access everything through methods? I suppose objects are slower. I was also planning to use arrays with string keys in nearly all places; normally these are slower, but in PHP hashes and arrays are the same type, so I don't know.

Hi everyone, I was discussing this topic with one of my friends and neither of us can give a real answer to it. Example:

Code: [Select]
class test {
    function a() {
        $obj = new DB_TableObject();
        // bla bla bla
        $this->b($obj);
    }
    function b($object) {
        $object->getResults();
    }
}

class test {
    function a() {
        $obj = new DB_TableObject();
        // bla bla bla
        $this->b($obj);
    }
    function b() {
        $obj = new DB_TableObject();
        $obj->getResults();
    }
}

Which one is better, the first or the second solution?

This is pretty decent, right?

Code: [Select]
mySQL_array SPEED TEST - [ 30,000 Numbers ] - SERIALIZED with BASE64 ENCODING AND GZ COMPRESSION
52.8KiB WRITE ARRAY in: 0.15 seconds.
52.8KiB READ ARRAY in: 0.03 seconds.
52.8KiB ROUND TRIP in: 0.18 seconds. (approx: 3.41 milliseconds per KiB)
--- running on FIREFOX + EASY PHP 5.3.8.0 - ON ASUS G51VX - 2GHz dual core (x64) - 4GB RAM ---

When pulling EXIF data from an image I have pretty much everything nailed down except for the shutter speed value. When pulled from the image it is returned as:

'Shutter' => string '0.0015625'

Now I know that the actual shutter speed at the time of shooting was 1/640 sec; does anyone have any ideas about how one would convert 'Shutter' => string '0.0015625' to 1/640 sec?
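One way to do that conversion - a minimal sketch, assuming the EXIF value always arrives as a decimal number of seconds (the helper name formatShutterSpeed is just illustrative):

Code: [Select]
<?php
// Convert a decimal shutter speed (in seconds) to a conventional
// "1/N sec" string for exposures shorter than one second.
function formatShutterSpeed($seconds)
{
    $seconds = (float) $seconds;
    if ($seconds <= 0) {
        return 'unknown';
    }
    if ($seconds >= 1) {
        // Long exposures are usually quoted as whole/decimal seconds.
        return rtrim(rtrim(number_format($seconds, 1), '0'), '.') . ' sec';
    }
    // 0.0015625 => 1 / 0.0015625 = 640 => "1/640 sec"
    return '1/' . round(1 / $seconds) . ' sec';
}

echo formatShutterSpeed('0.0015625'); // prints "1/640 sec"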
I have been scratching my head over this problem for a few days now and am still no wiser.

Complete beginner here, so just really looking for pointers on where to start. I've been doing a bit of bug fixing on one of our sites because the past three programmers we've hired have disappeared on us. The big issue I'm looking to solve is the load time of a search. Our website (Love-Rugs) is taking about 6 or 7 seconds to perform a blank search (basically a quick browse), whereas our other site (Little-Persia) takes about a second. It wouldn't be so bad if it was just the initial search, but going from one page (only 10 products listed per page) to the next takes the same amount of time. There seem to be an awful lot of queries (around 130-150) on the searches - however, when using some of the search options (e.g. type and fabric) to refine the search, the query count stays high but the time to process the results drops significantly. I don't really understand why the search time is much lower if the query count is still high, unless it's to do with the number of results returned. That still doesn't explain why Little-Persia (which has far more products on it) takes less time, as there would obviously be more results. I realize that without code this isn't easy to answer, so I'm just looking to see if someone can point me in the right direction to look at just now.

I am currently using PHP 5.3.2 on IIS 7 and I am wondering how I can write a download script with resume and speed limit. After looking for a while, I found this script that handles the speed limit:

Code: [Select]
<?php
$local_file = 'file.zip';
$download_file = 'name.zip';

// set the download rate limit (=> 20,5 kb/s)
$download_rate = 20.5;

if (file_exists($local_file) && is_file($local_file)) {
    header('Cache-control: private');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($local_file));
    header('Content-Disposition: filename=' . $download_file);
    flush();

    $file = fopen($local_file, "r");
    while (!feof($file)) {
        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));
        // flush the content to the browser
        flush();
        // sleep one second
        sleep(1);
    }
    fclose($file);
} else {
    die('Error: The file ' . $local_file . ' does not exist!');
}
?>

There is one big problem, though: the downloaded file arrives without an extension. For example, if the original file is file.ext, the client receives it simply as file, with no extension. I just want to use octet-stream since I have no interest in streaming or anything else, but I really want the client to get the extension as well. So could you please tell me what is wrong with the above script, and please show me a way to do resumable downloads. Thank you very much for your help.
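For what it's worth, a hedged sketch of how the headers and a very basic resume (HTTP Range) could look; "attachment" plus a quoted filename is the standard way to suggest a download name to the browser, and the variable names simply mirror the script above:

Code: [Select]
<?php
// Sketch: serve $local_file as $download_file with a suggested filename
// and minimal support for "Range: bytes=N-" (resume) requests.
$local_file    = 'file.zip';   // path on disk, as in the script above
$download_file = 'name.zip';   // name offered to the client
$download_rate = 20.5;         // KiB per second, as in the original script

if (!is_file($local_file)) {
    die('Error: The file ' . $local_file . ' does not exist!');
}

$size  = filesize($local_file);
$start = 0;
$end   = $size - 1;

// If the client sent a Range header, resume from the requested offset.
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = (int) $m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Cache-Control: private');
header('Content-Type: application/octet-stream');
// "attachment" plus a quoted filename suggests the download name,
// including its extension, to the browser.
header('Content-Disposition: attachment; filename="' . $download_file . '"');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($local_file, 'rb');
fseek($fp, $start);
while (!feof($fp) && ftell($fp) <= $end) {
    // never read past the requested end byte
    $chunk = min((int) round($download_rate * 1024), $end - ftell($fp) + 1);
    print fread($fp, $chunk);
    flush();
    sleep(1); // one chunk per second keeps the rate limit
}
fclose($fp);
?>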
Since I got my websocket PHP server running nicely with my MySQL, I can now have some fun. Attack speed is very simple, but I need your help with the unix timestamp.
For example, there is a field called "last_attack"; each time a user attacks a mob and a skill is performed, it gets updated with time(). Then I disable the attack button for 2 seconds client side, but I also check that value against time() server side as well. Now let's say the user's attack speed is 1.30%. I want to make that attack-speed check dynamic: it should now only reject the attack if the last one was less than 1.7 seconds ago, instead of 2 seconds. How do I split up the unix timestamp to work with percentages?
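A small sketch of how that check could look; how to interpret "attack speed 1.30" is an assumption here - the amount above 1.00 is subtracted from the base cooldown in seconds, which is what gives the 1.7 seconds mentioned above:

Code: [Select]
<?php
// Hedged sketch: derive the dynamic cooldown from the user's attack speed.
$base_cooldown = 2.0;    // seconds
$attack_speed  = 1.30;   // from the user's row (assumed column/value)

// Assumed interpretation: 1.30 => 2.0 - 0.30 = 1.7 seconds.
$cooldown = $base_cooldown - ($attack_speed - 1.0);

// Alternative reading, if it is meant as a multiplier instead:
// $cooldown = $base_cooldown / $attack_speed;   // ~1.54 seconds

$last_attack = (int) $row['last_attack'];  // unix timestamp stored earlier (assumed)
if (time() - $last_attack < $cooldown) {
    // Attacked too recently - reject the attack server side.
    exit(json_encode(array('error' => 'too soon')));
}
// Otherwise perform the attack and update last_attack to time().

Note that time() only has whole-second resolution, so enforcing a fractional cooldown like 1.7 seconds exactly would mean storing microtime(true) instead.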
Are there any codes or PHP scripts that will calculate the user's internet speed?

I've been looking into methods of scraping data from pages and have found several examples of using multi-curl to achieve this. But I am not used to curl and am not completely sure how it works, and I need to find the fastest reliable method (I do need all, or close to all, pages every run) of getting the content of a number of pages (about 160). Here is an example I got from searching the web which I managed to implement:

Code: [Select]
<?php
/**
 * @param $picsArr Array [0]=> [url],
 * $picsArr will be filled with the image data; you can use the data as you
 * want or just save it in the next step.
 **/
function getAllPics(&$picsArr){
    if(count($picsArr) <= 0) return false;

    $hArr = array(); // handle array
    foreach($picsArr as $k => $pic){
        $h = curl_init();
        curl_setopt($h, CURLOPT_URL, $pic['url']);
        curl_setopt($h, CURLOPT_HEADER, 0);
        curl_setopt($h, CURLOPT_RETURNTRANSFER, 1); // return the image value
        array_push($hArr, $h);
    }

    $mh = curl_multi_init();
    foreach($hArr as $k => $h) curl_multi_add_handle($mh, $h);

    $running = null;
    do{
        curl_multi_exec($mh, $running);
    }while($running > 0);

    // get the result and save it in the result array
    foreach($hArr as $k => $h){
        $picsArr[$k]['data'] = curl_multi_getcontent($h);
    }

    // close all the connections
    foreach($hArr as $k => $h){
        $info = curl_getinfo($h);
        preg_match("/^image\/(.*)$/", $info['content_type'], $matches);
        echo $tail = $matches[1];
        curl_multi_remove_handle($mh, $h);
    }
    curl_multi_close($mh);

    return true;
}
?>

Since time is critical in my script, I would ask whether you think this is a good implementation, or whether you can point me in the direction of one that will save me noticeable run-time.

Is there any PHP function I can use to test how fast my server is performing, like how fast it carries out a certain action, so I can see its resource usage?

Okay, I have a function that stores data into an array. The function takes about 7 seconds to run, and through a number of different loops it creates one final array with about 15,000 keys. I want to recall this data a number of times in different functions; how can I have this data easily accessible without running the function each time? Example:

Code: [Select]
function theFunction() {
    for ($x = 0; $x <= 15000; $x++) {
        // Run the loop and store data.
        $string[$x] = ''; // output from other loops and calculations goes here
    }
    return $string;
}

// Then later on, if I want to recall the data, the only way I know how is:
$newstring = theFunction();

The only problem I have with this is that it has to re-run the function every time in order to get to the data it spits out. How can I store this data into another array outside of the function without having to re-run it? I hope this makes sense. Thanks.
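One common way to avoid re-running the work is to cache the result in a static variable inside the function - a minimal sketch that simply mirrors the names from the example above:

Code: [Select]
function theFunction() {
    // Cache the array after the first call so the expensive loops
    // only ever run once per request.
    static $cache = null;
    if ($cache !== null) {
        return $cache;
    }

    $string = array();
    for ($x = 0; $x <= 15000; $x++) {
        $string[$x] = ''; // output from other loops and calculations goes here
    }

    $cache = $string;
    return $cache;
}

// Both calls return the same data, but the loops only run the first time.
$a = theFunction();
$b = theFunction();

For reuse across separate page requests (rather than within one), serializing the array to a file or a cache such as APCu or Memcached is the usual next step.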
I am writing an OO PHP app to parse thousands of frames from a film, each of which will then be manipulated and spat out using GD library functions. Speed is of the essence. To improve speed, I have considered:
- Another language!?
- Use of the PHP Command Line Interface (working through a web browser is surely a slow affair?)

If PHP was a standard R6 Yamaha available to Joe Public, what would I have to do to win a GP race - you know, change gearing, alter timing, change the exhaust system, rip out the air filter? What does the panel suggest? Thanks, B

Hello dear community, I am currently working on an approach to parse some sites that contain data on foundations in Switzerland, with details like goals, contact e-mail and the like. See http://www.foundationfinder.ch/ which has a dataset of 790 foundations. All the data are free to use, with no copyright limitations. I have tried it with PHP Simple HTML DOM Parser, but I have found it difficult to get all the data needed to get it up and running. Who wants to jump in and help create this scraper/parser? I would love to hear from you. Please help me get up to speed with this approach. Regards, Dilbertone
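For what it's worth, a very generic sketch of such a scraper using PHP's built-in DOMDocument/DOMXPath; the start URL is taken from the post, but the XPath query is a placeholder that would have to be adapted to the actual markup of foundationfinder.ch:

Code: [Select]
<?php
// Generic scraping loop with DOMDocument/DOMXPath. The XPath below is a
// placeholder, not the real foundationfinder.ch structure.
$listUrl = 'http://www.foundationfinder.ch/';

$html = file_get_contents($listUrl);
if ($html === false) {
    die('Could not fetch ' . $listUrl);
}

$dom = new DOMDocument();
libxml_use_internal_errors(true);   // real-world HTML is rarely valid XML
$dom->loadHTML($html);
libxml_clear_errors();

$xpath = new DOMXPath($dom);

// Placeholder query: collect every link on the page; a real scraper would
// narrow this down to the links of the 790 foundation detail pages.
$foundations = array();
foreach ($xpath->query('//a[@href]') as $link) {
    $foundations[] = array(
        'name' => trim($link->textContent),
        'url'  => $link->getAttribute('href'),
    );
}

print_r($foundations);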
I have a table called "playlists" and a table called "musics". In the musics table there is a column playlist_id which references the playlist each music belongs to. I'm using API calls to display information on the site with JavaScript, so I need to return JSON.
I need the JSON to have the following structure:
[
    Playlists: [
        {
            Name: "etc",
            musics: [
                { name: "teste.mp3" },
                { name: "test2.mp3" }
            ]
        },
        ...
    ]
]

And this is my code:

$query = $con->prepare("SELECT * FROM playlists WHERE user_id = :id");
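A rough sketch of one way to build that nested structure and return it as JSON, assuming a PDO connection in $con (as in the snippet above); the id and name column names are assumptions, only playlist_id and user_id come from the post itself:

Code: [Select]
<?php
// Hedged sketch: build the nested playlists -> musics structure.
$userId = isset($_GET['id']) ? (int) $_GET['id'] : 0; // however the user id is obtained

$out = array('Playlists' => array());

$query = $con->prepare("SELECT * FROM playlists WHERE user_id = :id");
$query->execute(array(':id' => $userId));
$playlists = $query->fetchAll(PDO::FETCH_ASSOC);

$musicStmt = $con->prepare("SELECT * FROM musics WHERE playlist_id = :pid");

foreach ($playlists as $playlist) {
    $musicStmt->execute(array(':pid' => $playlist['id']));   // 'id' column is assumed
    $musics = array();
    foreach ($musicStmt->fetchAll(PDO::FETCH_ASSOC) as $music) {
        $musics[] = array('name' => $music['name']);          // 'name' column is assumed
    }
    $out['Playlists'][] = array(
        'Name'   => $playlist['name'],                        // 'name' column is assumed
        'musics' => $musics,
    );
}

header('Content-Type: application/json');
echo json_encode($out);

A single JOIN, or one query with IN (...) over the playlist ids, would avoid running one musics query per playlist once there are many playlists.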
Hello, I am trying to measure the page speed of some websites, but I can only read ''domain.com''; I can't read site files like CSS and JS. Why? I use proxies for this job.
Here is the PHP code:

$options = array(
    'useragent'      => "Firefox (+http://www.firefox.org)", // who am i
    'connecttimeout' => 120,   // timeout on connect
    'timeout'        => 120,   // timeout on response
    'redirect'       => 10,    // stop after 10 redirects
    'referer'        => "http://www.google.com",
    'proxyhost'      => '85.25.8.14:80'
);
$response = http_get("http://solve-ict.com/wp-content/themes/ict%20theme/js/jquery-1.7.1.min.js", $options, $info);

It works fine with http://domain.com/, but with CSS or JS files it gives a 404. I am using some of the free proxy servers that are available. Thanks.
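As a hedged cross-check, the same request can be made through the proxy with the bundled curl extension instead of pecl_http's http_get(); if this also returns a 404, the proxy rather than the PHP code is the likely culprit. The proxy address is simply the one from the snippet above and may no longer work:

Code: [Select]
<?php
// Fetch a static asset through an HTTP proxy using curl, mirroring the
// options passed to http_get() above.
$url   = "http://solve-ict.com/wp-content/themes/ict%20theme/js/jquery-1.7.1.min.js";
$proxy = '85.25.8.14:80'; // taken from the post; may be dead by now

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 120);
curl_setopt($ch, CURLOPT_TIMEOUT, 120);
curl_setopt($ch, CURLOPT_USERAGENT, "Firefox (+http://www.firefox.org)");
curl_setopt($ch, CURLOPT_REFERER, "http://www.google.com");
curl_setopt($ch, CURLOPT_PROXY, $proxy);

$body = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

// Report the status code and how many bytes came back.
echo "HTTP $code, " . strlen((string) $body) . " bytes\n";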