PHP - Moved: Smarty3 Caching
This topic has been moved to Other Libraries and Frameworks.
http://www.phpfreaks.com/forums/index.php?topic=322037.0

Similar Tutorials

This topic has been moved to Other. http://www.phpfreaks.com/forums/index.php?topic=345836.0

Hello, I just had a quick question about caching when PHP compiles a script into opcodes (or however this works with PHP). Anyway, is there any efficiency gain in doing this:

Code: [Select] $holder = count($anArray); for ($i = 0; $i < 100; $i++) echo $holder;

rather than this:

Code: [Select] for ($i = 0; $i < 100; $i++) echo count($anArray);

If PHP is smart, then count($anArray) would be called once, and any subsequent call would just read a value stored in a cache. Is this how PHP works?

Hi! I was a little confused and am not able to figure out how to use caching server side and client side. How do I use caching with PHP?

Let's say I have 100,000 users on my forum and I want to cache their profile info ("About me") in .php files, which is easy enough. But would that take more space than 100k rows in a database table? Probably a stupid question, but at the moment I cache some lottery info and some other stuff, and that's only one .php file. It would be dumb to cache info in 100k .php files, one per user ID, right? Or maybe store it all in one .php file? That would be a huge file size. Better to just keep the data in MySQL, right?

Hi, I just started experimenting with file caching after reading this article: http://www.rooftopsolutions.nl/blog/107 Aside from data which changes frequently, I was wondering if it's ever inappropriate to cache your queries. For example, I have a page with a ridiculous number of queries, some of which return result sets not nearly as large as others. Page performance increased, just not to the level I had been expecting, and I'm wondering whether having Apache read each of these files for data is in fact faster than running a number of queries with only some data being cached.
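On the loop-hoisting question: PHP performs no automatic memoization of function calls, so the hoisted variable *is* the cache. A minimal sketch (count() stands in here, since strlen() does not accept an array):

```php
// PHP re-invokes the function on every iteration of the second form;
// hoisting the result into a variable is the only "caching" you get.
$anArray = ['a', 'b', 'c'];
$holder = count($anArray);      // computed once, before the loop
$total = 0;
for ($i = 0; $i < 100; $i++) {
    $total += $holder;          // plain variable read -- no function call
}
```

For a cheap O(1) call like count() the difference is tiny, but for anything expensive (a query, a parse) hoisting is worth it.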
Code: [Select] <?php $myData = file_get_contents(""); $myObject = json_decode($myData); $myObjectMap = $myObject->result; ?>

Can I somehow build in that it only makes the request every 5 minutes? Because if there are many users on the site, it requests too often.

How can you prevent your browser from caching a page? I think that my news feed is being cached, causing new posts not to be displayed even after a refresh. Eventually the posts will show up, after about 20 refreshes. I know my query is right, so I didn't know if there was a way to stop page caching. I don't really know if it is a caching issue, though. I'm starting to think I just have some weird bug in my code, because sometimes on refresh news feed items get removed, then I'll refresh again and some will get added back in, then I'll refresh again and they'll all be there. Ever heard of this happening?

What I have is a PHP page that runs over 60 queries a visit and gets over 2,000 visits a day. That's thousands of queries, and I'm sure this can be simplified to lessen the load. I only need to update the data on the page every 12 hours. So what I'm thinking is that it would be best to run the queries based on time() (every 12 hours) and store that data in a .txt file. Then the PHP file, instead of running the queries over and over, just extracts the data from the text file. Does this help me at all, or is it useless? Is there a better method? Thanks!

Hi, I have a PHP script which allows me to upload images to my product page. When I select an image to copy, it simply does a Code: [Select] copy(); function, followed by a Code: [Select] header("Location: products.php"); redirect. This all works OK, but when the page reloads, the image has not changed; if I refresh the page, it loads fine. So I think this is an image caching problem. Any ideas on how to solve this?
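The "only request every 5 minutes" idea from the JSON post above can be a thin file cache wrapped around file_get_contents(). A minimal sketch; cached_get() and the paths are my own names, not an existing API:

```php
// Fetch a remote document at most once per $ttl seconds; every other
// request is served from the local copy on disk.
function cached_get(string $url, string $cacheFile, int $ttl = 300): string {
    if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
        $data = @file_get_contents($url);
        if ($data !== false) {
            // LOCK_EX so concurrent requests don't interleave writes
            file_put_contents($cacheFile, $data, LOCK_EX);
        }
    }
    return (string) @file_get_contents($cacheFile);
}

// Usage (placeholder URL):
// $myObject = json_decode(cached_get("https://example.com/feed.json", "/tmp/feed-cache.json"));
```

The same pattern covers the 12-hour query post as well: set $ttl to 43200 and write the query results instead of a remote document.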
Thanks

I am using a cache script; it writes an array to a file like this:

Code: [Select] fwrite($fh, '<?php'."\n\n".'define(\'PUN_LOTTERY_LOADED\', 1);'."\n\n".'$lottery = '.var_export($output2, true).';'."\n\n".'?>');

where $output2 is

Code: [Select] $result2 = $db->query('MY QUERY '); $output2 = array(); while ($cur_donors = $db->fetch_assoc($result2)) $output2[] = $cur_donors;

Now I want to ditch MySQL and use this script with 7 variables that I already have loaded, so I don't need MySQL at all. How do I pass my 7 variables to the var_export() call instead of looping over MySQL results?

I am trying to create a script for caching an XML file. The problem is that this XML file was poorly constructed and I can't do anything about it. XML file: http://www.boj.org.jm/uploads/tbills.xml I would like the cached XML file to be restructured like this:

Code: [Select]
<tbills>
  <tbill>
    <title>Announcement</title>
    <doc>www.boj.org.jm/pdf/tbill_press_release_2010-07-13.doc</doc>
    <date>Jul 14 2010 12:00am</date>
  </tbill>
  <tbill>
    <title>Results</title>
    <doc>www.boj.org.jm/pdf/tbill_results_2010-june-23.doc</doc>
    <date>Jun 23 2010 12:00am</date>
  </tbill>
</tbills>

I can find tutorials that work if the source XML data is properly structured, but I can never seem to find one that addresses situations like this. I found this code in a tutorial and tried to edit it to what I would need, but I'm stuck on 2 things: How do I get this to change the structure of the XML when caching? How do I use the calling script, and where do I put it?

Caching script (caching.php)

Code: [Select]
<?php
/*
 * Caching
 * A small PHP class to cache a remote XML feed in a cleaner structure.
 */
class Caching {
    var $filePath = "";
    var $apiURI = "";

    function __construct($filePath, $apiURI) {
        // check if the file path and api URI are specified; if not, break out of construct
        if (strlen($filePath) > 0 && strlen($apiURI) > 0) {
            // set the local file path and api path
            $this->filePath = $filePath;
            $this->apiURI = $apiURI;
            // does the file need to be updated?
            if ($this->checkForRenewal()) {
                // get the data you need
                $xml = $this->getExternalInfo();
                // save the restructured data to your file
                if ($xml !== false) {
                    $this->stripAndSaveFile($xml);
                }
            }
            // otherwise the cached file is still fresh; nothing to do
        } else {
            echo "No file path and / or api URI specified.";
        }
    }

    function checkForRenewal() {
        // a missing cache file must always be built
        if (!file_exists($this->filePath)) {
            return true;
        }
        // set the caching time (in seconds)
        $cachetime = 60 * 60 * 24; // one day's worth of seconds
        // if the renewal date is earlier than now, the file needs updating
        return (filemtime($this->filePath) + $cachetime) < time();
    }

    function getExternalInfo() {
        if ($xml = @simplexml_load_file($this->apiURI)) {
            return $xml;
        }
        return false;
    }

    function stripAndSaveFile($xml) {
        // building the clean document for SimpleXML
        $output = new SimpleXMLElement("<tbills></tbills>");
        // element names below follow the source feed as referenced in the
        // original tutorial code ($xml->TBILLS->ANNOUCE); adjust them to
        // whatever the real feed actually uses
        $count = 0;
        foreach ($xml->TBILLS->ANNOUCE as $entry) {
            if (++$count > 10) break; // keep only the first 10 entries
            $tbill = $output->addChild("tbill");
            $tbill->addChild("title", (string) $entry->TITLE);
            $tbill->addChild("doc", (string) $entry->DOC);
            $tbill->addChild("date", (string) $entry->DATE);
        }
        // save the xml in the cache
        file_put_contents($this->filePath, $output->asXML());
    }
}
?>

Calling script (calling.php)

Code: [Select]
<?php
ini_set('display_errors', 1);
error_reporting(E_ALL);
include('caching.php');
$caching = new Caching($_SERVER['DOCUMENT_ROOT']."/tbills.xml", "http://www.boj.org.jm/uploads/tbills.xml");
?>

Thanks in advance. Best Regards, _________ Winchester

I have a client that wants an affiliate-driven service, which is fine.
However, they want to offer affiliates the ability to forward their own domains to the service and have that domain work as the initial affiliate ID token. Now, my question is: I know I can find the domain my scripts run on using $_SERVER['HTTP_HOST'], but I'm not sure how that works for a domain that is forwarded with a 301 or 302 redirect and masked for use on the service I am building. I want to say I could use $_SERVER['HTTP_REFERER'], but again I'm not sure how that will work for a forwarded, masked domain, since it isn't part of the actual host configuration it lands on in the end. Hopefully I am making sense. So what would be my best option for handling a domain that is masked and lands on another domain via forwarding? I am only catching the initial landing, setting the tokens I need for it to run as that affiliate, and storing them in sessions, a cookie, and various other variables. I guess I am just wondering which value is better for catching that initial landing from the forwarded domain.

Hi, I just have a question about the caching code below.
Should the date function be in the filename if I want to cache the file for several days? Or, if the date changes, will a new file be created and used instead of the previous cache file? Is the date even relevant, since my if statement checks the file modification time? Thanks for the help in understanding this.
Code: [Select]
$cachefile = 'cache/cached/' . $id . date('M-d-Y') . '.php';
$cachetime = 172800;
// Check if the cached file is still fresh. If it is, serve it up and exit.
if (file_exists($cachefile) && time() - $cachetime < filemtime($cachefile)) {
    include($cachefile);
    exit;
}
ob_start();
{HTML TO CACHE}
$fp = fopen($cachefile, 'w');
fwrite($fp, ob_get_contents());
fclose($fp);
// finally send browser output
ob_end_flush();

I basically want to cache a file for 2-3 days.

Hello, I need to cache a JSON response from an API so I am not making a request on every load. I came across some code that is supposed to accomplish this, but it seems not to be working and I can't figure out why. I created the cache file with 777 permissions, but the script doesn't write to or read from that file when I echo out the results.
The code below is just what is supposed to get the contents and cache them. At the end I tried to print the result to test whether it returns the response from the cache file, but I get nothing, and the cache file does exist.
This is the first time I have tried something like this so please be gentle.
Code: [Select]
<?php
// cachePath is the path to the file used to store cached current conditions
// gathered from the API request
$cachePath = dirname(__FILE__) . "/cache/nowcast-cache.txt";

// URL for the API request
$url = "http://api.wunderground.com/api/XXXXXXXXXXXX/geolookup/conditions/q/TX/mesquite.json";

date_default_timezone_set('America/Chicago');

/**
 * Request current conditions from the API.
 *
 * Returns the raw JSON string (or false on failure) rather than writing the
 * file itself, so the caller decides what to store. Note: $url must be
 * pulled in with "global" -- in the original code the function silently
 * used undefined local $cachePath and $url variables.
 */
function api_request() {
    global $url;
    return file_get_contents($url);
}

/**
 * API Request Caching
 *
 * Use server-side caching to store the API response as JSON at a set
 * interval, rather than on each page load.
 */
function json_cached_api_results($cache_file = NULL, $expires = NULL) {
    global $cachePath, $request_type, $purge_cache, $limit_reached, $request_limit;

    if (!$cache_file) $cache_file = $cachePath;
    if (!$expires) $expires = time() - 2 * 60 * 60;

    // Refresh when the file is missing (the original die() here meant the
    // cache could never be created), older than the expiry time, empty, or
    // when a purge was requested within the request limit. filemtime()
    // (content age), not filectime(), is the right check.
    if (!file_exists($cache_file)
            || filemtime($cache_file) < $expires
            || file_get_contents($cache_file) == ''
            || ($purge_cache && intval($_SESSION['views']) <= $request_limit)) {
        // File is missing or too old: refresh the cache. The API already
        // returns JSON, so re-encoding it (the original json_encode call)
        // would double-encode the payload.
        $json_results = api_request();
        if ($json_results) {
            file_put_contents($cache_file, $json_results);
        }
        $request_type = 'API';
    } else {
        // Check the number of purge cache requests to avoid abuse
        if (intval($_SESSION['views']) >= $request_limit)
            $limit_reached = " <span class='error'>Request limit reached ($request_limit). Please try purging the cache later.</span>";
        // Fetch cache
        $json_results = file_get_contents($cache_file);
        $request_type = 'JSON';
    }
    return json_decode($json_results);
}

// print the decoded result of the call, not an undefined variable
print_r(json_cached_api_results());
?>

Edited by Texan78, 20 December 2014 - 03:51 PM.

Purpose: Building a search function for a site that's supposed to be fast and give results as a user types. Queries would be something like "brand1 brand2 brand3". My idea is, instead of querying the database on each AJAX request, a keyed array is created once, like ['brand1' => $id, 'brand2' => $id2], and stored in memory. The next time a query comes in, the array is instantly available in memory and can simply be indexed ($storedArray['brand1']) to fetch any existing IDs. The array size would be about 750 KB, 60,000 rows. I don't have much experience with caching, so I'm looking for advice on whether what I am trying to do even makes sense or is necessary. I know there are solutions like memcache, but I'm not sure my tiny project requires them. Also, does OPcache help here? Would serializing the array be too slow? Please ask if any questions. Thanks

I have some fairly small text files (2K) which are parsed, where certain delimited fields are replaced with values provided in an associated array. There will typically be fewer than 5 replaced fields, and I plan on using preg_replace_callback to find and replace them.
I am contemplating caching the files after being parsed (note that they will only be accessed by PHP, and not directly by Apache).
Do you think doing so will provide any significant performance improvement?
If I do go this route, I would like thoughts on how to proceed. I am thinking something like the following:
Concatenate the filename with the serialized value of the array, and hash the result using md5().
Store the parsed file using name "file_".$hash
Get the modification time of the newly created file using filemtime(), and store the value in a new file called "time_".$hash.
bla bla bla
When the next request comes in to parse a file, create the hash again.
If the file exists for the given hash name, and the time file matches filemtime(), use that file, else parse the original file.
Is this a good approach?
EDIT. When starting this post, I thought it was causing the browser to make an extra request to the server. I've since found this wasn't the case; however, I didn't change the title of this post, and can't seem to change it to something like "Critique of file caching script".
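The steps above can be sketched as two small functions. Everything unstated here is an assumption: the {{field}} delimiter and both function names are my own, and folding the source file's filemtime() into the key replaces the separate "time_".$hash file, since an edited source simply produces a new key:

```php
// Replace {{FIELD}} placeholders with values from an associative array.
// Unknown fields are left untouched so they stay visible in the output.
function parse_template(string $text, array $vars): string {
    return preg_replace_callback('/\{\{(\w+)\}\}/', function ($m) use ($vars) {
        return $vars[$m[1]] ?? $m[0];
    }, $text);
}

// Cache key for one (file, replacement-array) pair. Any edit to the
// source file (new mtime) or a different replacement set yields a new key.
function cache_key(string $file, array $vars): string {
    return 'file_' . md5($file . '|' . filemtime($file) . '|' . serialize($vars));
}
```

Stale file_* entries left behind by superseded keys then need an occasional sweep, which is the usual price of mtime-in-key schemes. As to whether caching 2K files parsed with fewer than 5 replacements is worth it: the parse is likely cheaper than the extra stat() and read, so measuring first is reasonable.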
I am trying to cache a file, and put together the following script. I put the following in the browser:
https://test.sites.example.com/administrator/index.php?cid=2&controller=sell&id=643341356... and Apache will rewrite it as: https://test.sites.example.com/index.php?admin=administrator&cid=2&controller=sell&id=643341356

index.php includes the following line: <script src="/lib/js/clientConstants.php?b=1" type="text/javascript">

I am running an nginx web server with the following config:

Code: [Select] return 302 https://www.mysite.com/game_errorpage.php?sub=$name;

The intent is that if someone types in "blah.mysite.com" and that subdomain (game server) is not running, I want them redirected to an error page. This error page then starts the game server in question. This all works up to this point. Then I want to redirect the user back to the original request URL, to log into their game at "blah.mysite.com". I use the following PHP:

Code: [Select] header("Location: https://blah.mysite.com"); exit();

The issue I have is that, because the browser has already cached the redirect to https://www.mysite.com/game_errorpage.php?sub=$name, it keeps returning there even though the game server is now running. The only thing I can do is close all the browsers and open a new window in Chrome using "Incognito", then type the URL again. I have the following in my global file, for all pages:

Code: [Select] header("Cache-Control: no-store, no-cache, must-revalidate, max-age=0"); header("Cache-Control: post-check=0, pre-check=0", false); header("Pragma: no-cache");

So I am looking for some creative solutions. Thanks. Brad
Edited March 11, 2020 by Daxcor
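For the redirect-loop post above: the global no-cache headers only cover normal page loads; the redirect response itself must also be marked uncacheable, and it should stay a 302 or 307 (browsers cache 301s aggressively). A sketch with the header list built as data so it can be inspected; the function name and the sanitising rule are my own:

```php
// Build the headers for the bounce back to the game subdomain.
// 302 responses are not cached by default; sending no-store as well keeps
// the browser from replaying the old error-page redirect.
function redirect_headers(string $sub): array {
    // never trust $_GET['sub']: keep only hostname-safe characters
    $host = preg_replace('/[^a-z0-9-]/i', '', $sub);
    return [
        'Cache-Control: no-store, no-cache, must-revalidate, max-age=0',
        'Location: https://' . $host . '.mysite.com',
    ];
}

// Usage on the error page, once the game server is confirmed up:
// foreach (redirect_headers($_GET['sub'] ?? '') as $h) { header($h, true, 302); }
// exit;
```

If the loop persists, check whether nginx's own `return 302` response is being cached by an intermediary and add `add_header Cache-Control "no-store";` to that location block as well.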