PHP - Caching a PHP Array in Memory Across Different Requests
Purpose: I'm building a search function for a site that should be fast and return results as the user types. Queries look like "brand1 brand2 brand3". My idea is that instead of querying the database on every AJAX request, a keyed array is created once, e.g. ['brand1' => $id1, 'brand2' => $id2], and stored in memory. The next time a query comes in, the array is instantly available in memory and can simply be checked with $storedArray['brand1'] to fetch any existing ids straight away. The array would be about 750 KB, roughly 60,000 rows.

I don't have much experience with caching, so I'm looking for advice on whether what I'm trying to do makes sense or is even necessary. I know there are solutions like memcache, but I'm not sure my tiny project requires them. Also, does OPcache help here? Would serializing the array be too slow? Please ask if you have any questions. Thanks
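If a single web server is enough, one option that avoids a full memcache setup is APCu, which keeps a PHP value in shared memory between requests. The sketch below only illustrates that idea for the brand index described above; the cache key, TTL, table name, credentials and the buildBrandIndex() helper are all assumptions, not existing code.

Code:
<?php
// Hypothetical helper that builds the ['brand' => id] index from the database.
function buildBrandIndex(PDO $pdo)
{
    $index = array();
    foreach ($pdo->query('SELECT id, name FROM brands') as $row) {
        $index[strtolower($row['name'])] = (int) $row['id'];
    }
    return $index;
}

// Pull the index from APCu shared memory; rebuild it at most once an hour.
$index = apcu_fetch('brand_index', $found);
if (!$found) {
    $pdo   = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // placeholder credentials
    $index = buildBrandIndex($pdo);
    apcu_store('brand_index', $index, 3600);
}

// Look up each typed term, e.g. "brand1 brand2 brand3".
$ids = array();
foreach (explode(' ', trim(isset($_GET['q']) ? $_GET['q'] : '')) as $term) {
    if ($term !== '' && isset($index[strtolower($term)])) {
        $ids[] = $index[strtolower($term)];
    }
}
echo json_encode($ids);

No manual serialize()/unserialize() step is needed with APCu. An OPcache-flavoured alternative is to var_export() the array into a plain .php file once and include it; OPcache then keeps the compiled file in memory, so later requests don't touch the database either.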
Similar Tutorials

I was just wondering: when I write a new function that uses AJAX, should I use the same HTTP request object for my whole website, or should I make separate ones for each function? Also, if you reuse the same one, do you need to do anything to it once it has been used, to set the ready state back to 0? Thanks

Hello there everyone. I'll try to explain as thoroughly as I can, so please bear with me a bit. When you want to surf the web through a proxy in Firefox, you go to Tools > Options > Network > Settings and enter proxy details, which are, for example, like this: 173.123.123.4 and port 8080. I want to do pretty much the same thing with PHP for my visitors. I have www.site1.com, which will have all my scripts and stuff. I want it to somehow redirect or load www.site2.com USING A PROXY, so that when visitors reach site2 it's as if they had edited their Firefox settings to view site2 through a proxy. That way visitors will always be anonymous on www.site2.com. While searching, I found this: http://stackoverflow.com/questions/3889715/php-requests-through-proxy which seems simple enough but unfortunately does not work. Not that I'm even sure it's what I want to do, but it seems like it... lol. Thanks a lot for any help provided.

How can I track session requests, so that after a certain number of requests (let's say ten, because it's psychologically pleasing) I can have the session id regenerated?

Hello! I'm trying to understand exactly how HTTP requests relate to a PHP script. 1) What would be considered a large amount? 2) How can I see how many I have for a given page? Thank you, Eric

I think it's fsockopen that enables you to do web requests, right? I was wondering if you could also make it use a proxy instead of your website's IP. Would that be possible?

Hi, I am trying to make a web interface for a robot. I have written PHP to send/receive values via a serial port to my robot, and that works. I am now trying to develop my web interface. I'm using Java to generate HTTP requests client side, in the form of:

Code:
/request?command=Forward&param1=254

I was wondering how I can parse the command and param1 in PHP server side. Or is there a better alternative?
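For the robot question just above: the query string of a request like /request?command=Forward&param1=254 lands in PHP's $_GET superglobal, so a handler script can read both values directly. A minimal sketch, assuming the request is routed to a script such as request.php; the command whitelist and the sendToRobot() helper are assumptions standing in for the poster's existing serial-port code.

Code:
<?php
// request.php - hypothetical endpoint for /request.php?command=Forward&param1=254
// (serving it at the bare /request path would additionally need a rewrite rule, not shown)

$allowed = array('Forward', 'Backward', 'Left', 'Right', 'Stop'); // assumed command list

$command = isset($_GET['command']) ? $_GET['command'] : '';
$param1  = isset($_GET['param1'])  ? (int) $_GET['param1'] : 0;

if (!in_array($command, $allowed, true)) {
    header('HTTP/1.1 400 Bad Request');
    exit('Unknown command');
}

// Clamp to one byte, since the example value 254 looks like a serial byte.
$param1 = max(0, min(255, $param1));

// sendToRobot($command, $param1);  // stands in for the existing serial-port code
echo "OK: $command $param1";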
I created an app for my website, set an action (read) and an object (article), and placed the object's code (META tags in the head) on the article pages of my website. Now I want to know how to send a cURL request whenever a user reads an article on my website, so that it'll feature on his wall. When I press the "get code" link near the action, this is what I get:

Code:
curl -F 'access_token=***' \
     -F 'article=http://example.com' \
     'https://graph.facebook.com/me/yellowheart:read'

(There's an actual access token, of course.) Now, how do I make it happen? Daniel

I have an object which is very expensive to create, and is fairly large but by no means enormous. The object has two tasks: display to the user what can be changed in a database, and make some or all of those changes based on user input. Instead of creating it again for the second task, I would like to serialize it after the first task, store it somewhere, and then restore it to perform the second task, to reduce user wait time. Communication for both tasks is as follows, where the web client first makes an XMLHttpRequest and cURL is used for the remaining hops: Web Client -> Web Server -> REST API Server -> Time Historian Application (and then back in the same order). In addition, both of these tasks take significantly longer than 30 seconds, resulting in cURL error 28. I certainly can investigate which of the requests is causing this error; however, I feel that solving the object-persistence problem might solve this issue as well. I am thinking of making the REST API server responsible for temporarily storing the object, and am considering the following (a sketch of the polling idea follows after this post):

1. Web client makes an XMLHttpRequest to the web server and passes its session cookie.
2. Web server makes a cURL request to the REST API server and passes that same cookie (maybe a bad idea?).
3. REST API server initiates the time historian application, spawns a new process, and replies to the web server, maybe with some expected wait-time duration; the web server in turn responds to the web client.
4. When the object is complete, the spawned process serializes the object's content and saves it as JSON, using the session cookie as the filename.
5. Web client periodically makes requests to the web server, which in turn makes requests to the REST API server; when the JSON file is available, the object is recreated, the applicable method is executed, and the reply carries the applicable content.
6. Web client sends user data to the web server and in turn to the REST server to initiate the second task.
7. Web client similarly polls the web server, which in turn asks the REST API server whether the second task is complete; if so, the JSON object file is deleted.
8. If the request has not been fulfilled within 24 hours or so, the JSON object file is deleted.

I would appreciate any general feedback or recommendations on how best to accomplish this, and whether using some third-party framework such as Gearman, ReactPHP, redis, etc. might simplify matters. Thank you
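A minimal sketch of the store-then-poll idea on the REST API side, assuming the spawned worker writes the serialized object as JSON to a file named after a job id. The directory, the job-id handling and the endpoint itself are illustrative assumptions, not part of any existing API.

Code:
<?php
// status.php - hypothetical endpoint the web server polls via cURL.
// Assumes the worker wrote json_encode($objectState) to /tmp/jobs/<jobId>.json when finished.

$jobId = preg_replace('/[^a-zA-Z0-9]/', '', isset($_GET['job']) ? $_GET['job'] : '');
$file  = "/tmp/jobs/$jobId.json";

header('Content-Type: application/json');

if ($jobId === '') {
    header('HTTP/1.1 400 Bad Request');
    echo json_encode(array('error' => 'missing job id'));
} elseif (!is_file($file)) {
    // Worker not finished yet - tell the caller to keep polling.
    echo json_encode(array('status' => 'pending'));
} elseif (time() - filemtime($file) > 86400) {
    // Stale result (older than ~24 hours): discard it, as described in the list above.
    unlink($file);
    echo json_encode(array('status' => 'expired'));
} else {
    echo json_encode(array('status' => 'done', 'result' => json_decode(file_get_contents($file), true)));
}

Because each poll returns immediately, the web server's cURL calls can keep a short CURLOPT_TIMEOUT and avoid the 30-second timeout (cURL error 28) that the long-running requests were hitting.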
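Going back to the Facebook Open Graph question above: the curl command Facebook shows can be reproduced with PHP's cURL extension. This is only a rough translation of that command; the yellowheart:read action and the article parameter come from the post, while the access token and article URL are placeholders and error handling is minimal.

Code:
<?php
// Publish the Open Graph "read" action when a user reads an article.
$accessToken = 'USER_ACCESS_TOKEN';               // placeholder - the user's token from the Facebook login flow
$articleUrl  = 'http://example.com/some-article'; // placeholder - the URL carrying the article's META tags

$ch = curl_init('https://graph.facebook.com/me/yellowheart:read');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => array(       // equivalent to the -F fields in the curl command
        'access_token' => $accessToken,
        'article'      => $articleUrl,
    ),
    CURLOPT_RETURNTRANSFER => true,
));

$response = curl_exec($ch);
if ($response === false) {
    error_log('Graph API call failed: ' . curl_error($ch));
}
curl_close($ch);
// On success the Graph API responds with a JSON body containing the id of the new action.

The call would typically be fired from the server-side code that renders the article, or from a small AJAX endpoint hit once the user has actually read it.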
I have script on my web host's server built into pages that will be offered to the public. For example, index.php has some script amongst the HTML, and this script calls other webpages on the net, e.g. wiki.org. Now my question is: when that script runs because somebody accesses the page, will the remote website (e.g. wiki.org) record the browser info and IP of the user who called my index page, or will it record my web host's server details as the one making the requests?

Hi all, something I forget to ask all the time, but now I don't. While reading again about image sprites, it is said that they are nice because they decrease the number of HTTP requests. Now I thought: what about all those requires and includes in PHP, aren't they doing the same thing? Not to mention OOP, which would not exist without the two (four). Does anyone know if this indeed increases the number of HTTP requests? And if so, is there a good practice to lower the number, such as just combining functions in one file? I would love to hear from some gurus.

It's been a while since I've needed to whip anything substantial up from scratch, so my scripting is a little (lot) fast and loose (weird/inefficient) here. I'm trying to mock up a script that's essentially a quiz/survey. There are a handful of topics, each with a few screens of yes/no questions. At the end, it returns a list of recommendations based on the answers gathered. The script posts back to itself. Using print_r($_SESSION), it seems like all of the post values for the first screen of questions are assigned to the session array as expected. When the second screen of questions is answered, their values are assigned as well, but the values for the first set go away completely. This continues through subsequent screens, with the values from the previous screen present and all earlier ones missing. I'd really appreciate a look at my code to see if you can tell me the cause or error(s). Thanks!

Code:
<?php
session_start;
include('_config.php');

// database connect
$dbc = mysqli_connect($CFG->dbhost, $CFG->dbuser, $CFG->dbpass, $CFG->dbname);

// set to section 1, page 1 if no values are in _POST array
if (($_SERVER['REQUEST_METHOD'] == 'GET') || (!isset($_POST['section']))) {
    $section = 1;
    $page = 1;
} else {
    // something was posted, so...set those values in session variable
    foreach ($_POST as $key => $data) {
        $_SESSION[$key] = $data;
    }

    // debug: display contents of the session array
    print_r($_SESSION);

    // which section and page?
    $section = (int) $_POST['section'];
    $page = (int) $_POST['next'];
}

// check if last topic
$query = "SELECT * FROM hw_topics";
$data = mysqli_query($dbc, $query);
if ($section == mysqli_num_rows($data)) {
    $last_section = true;
} else {
    $last_section = false;
}

// get current topic name and info
$query = "SELECT topic, display_name, pages_in_topic FROM hw_topics WHERE topic_id = '$section'";
$data = mysqli_query($dbc, $query);
if (mysqli_num_rows($data) == 1) {
    $row = mysqli_fetch_array($data);
    $topic_display_name = $row['display_name'];
    $pages_in_topic = $row['pages_in_topic'];
}

// test if last page in topic
$topic_pages = $row['pages_in_topic'];
if ($page == $topic_pages) {
    $last_page_in_section = true;
} else {
    $last_page_in_section = false;
}

// set form action (this script, or the recommendations page when the last section is complete)
if (($last_section == true) && ($last_page_in_section == true)) {
    $form_action = $CFG->reccomend;
} else {
    $form_action = $_SERVER['PHP_SELF'];
}

// get current page headline
$query = "SELECT page_headline FROM hw_pages WHERE topic_id = '$section' AND page_number = '$page'";
$data = mysqli_query($dbc, $query);
if (mysqli_num_rows($data) == 1) {
    // The headline row was found so display the headline
    $row = mysqli_fetch_array($data);
    $page_headline = '<h2>' . $row['page_headline'] . '</h2>';
}

// Grab the question data from the database to generate the list and form fields
$query = "SELECT question_id, question_number, question_text FROM hw_questions WHERE topic_id = '$section' AND page_id = '$page' ORDER BY question_number";
$data = mysqli_query($dbc, $query);
$questions = array();
while ($row = mysqli_fetch_array($data)) {
    array_push($questions, $row);
}

include($CFG->includesdir.'/header.php');
?>
<div id="head">
    <h1>Assessment<?php if (isset($topic_display_name)) { echo ': <em>' . $topic_display_name . '</em>'; } ?></h1>
    <p class="paging">Page <?php echo $page; ?> of <?php echo $pages_in_topic; ?></p>
</div><!-- #head -->

<div id="content">
    <p class="instr">Please complete this survey. We'll generate a list of recommendations and resources for your organization.</p>

    <div id="questions">
        <?php echo $page_headline; ?>
        <form method="post" action="<?php echo $form_action; ?>">
            <table border="0" cellpadding="0" cellspacing="0">
                <thead>
                    <tr>
                        <td></td>
                        <td class="qtext"></td>
                        <td class="qanswer">yes</td>
                        <td class="qanswer">no</td>
                        <td class="pad"></td>
                    </tr>
                </thead>
                <?php
                if ($questions) {
                    // display question rows
                    foreach ($questions as $question) {
                        echo '<tr>';
                        echo '<td class="qnumber">' . $question['question_number'] . '.</td>';
                        echo '<td class="qtext"><p>...' . $question['question_text'] . '</p></td>';
                        echo '<td class="qanswer"><div class="radio" id="box-yes"><input type="radio" value="yes" name="qid_' . $question['question_id'] . '" id="qid_' . $question['question_id'] . '" class="radio" /></div></td>';
                        echo '<td class="qanswer"><div class="radio" id="box-no"><input type="radio" value="no" name="qid_' . $question['question_id'] . '" id="qid_' . $question['question_id'] . '" class="radio"';
                        $field_name = 'qid_' . $question['question_id'];
                        if (isset($_SESSION[$field_name])) {
                            echo ' checked="checked"';
                        }
                        echo ' /></div></td>';
                        echo '<td class="pad"></td>';
                        echo '</tr>';
                    }
                } else {
                    echo '<tr>';
                    echo '<td colspan="3" class="qtext"><p>No questions found in the database for this page.</p></td>';
                    echo '<td class="pad"></td>';
                    echo '</tr>';
                }
                ?>
            </table>
            <ul id="controls">
                <?php
                if ($last_page_in_section == true) {
                    $section++;
                    $page = 1;
                } else {
                    $page++;
                }
                echo '<input type="hidden" value="' . $section . '" name="section" />';
                echo '<input type="hidden" value="' . ($page) . '" name="next" />';
                if (($last_section == true) && ($last_page_in_section == true)) {
                    echo '<li><input type="submit" value="Submit Answers and Get Recommendations" name="submit" id="submit" /></li>';
                } else {
                    echo '<li><input type="submit" value="Next Page" name="submit" id="next" /></li>';
                }
                ?>
            </ul><!-- #controls -->
        </form>
    </div><!-- #questions -->

<?php
mysqli_close($dbc);
include($CFG->includesdir.'/footer.php');
?>
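For reference, a stripped-down sketch of the multi-screen pattern the post above is aiming for: start the session, then merge each screen's posted answers into $_SESSION so earlier screens are kept. The 'answers' key is hypothetical and the qid_ field prefix is borrowed from the post for illustration only; this is a generic pattern, not the poster's code.

Code:
<?php
// Minimal multi-screen form pattern: each submission adds to, rather than replaces, the stored answers.
session_start();

if (!isset($_SESSION['answers'])) {
    $_SESSION['answers'] = array();
}

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    foreach ($_POST as $key => $value) {
        // Keep only the answer fields (named qid_123), not the paging fields.
        if (strpos($key, 'qid_') === 0) {
            $_SESSION['answers'][$key] = $value;
        }
    }
}

// $_SESSION['answers'] now accumulates across every screen of the quiz.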
Thanks

Hello, I just had a quick question about caching when PHP compiles a script into opcodes (or however this works with PHP). Anyway, is there any efficiency gain in doing this:

Code:
$holder = strlen($anArray);
for ($i = 0; $i < 100; $i++) {
    echo $holder;
}

rather than this:

Code:
for ($i = 0; $i < 100; $i++) {
    echo strlen($anArray);
}

? If PHP were smart, strlen($anArray) would be called once and any subsequent calls to strlen($anArray) would just return a value stored in a cache. Is this how PHP works?

Hi! I was a little confused and am not able to figure out how I can use caching server side and client side, and how to do caching using PHP.

Hi, I have a PHP script which allows me to upload images to my product page. When I select an image to copy, it simply does a copy() call, followed by a header("Location: products.php") redirect. This all works OK, but when the page reloads, the image has not changed; if I then refresh the page, it seems to load fine. So I think this is an image caching problem. Any ideas on how to solve this? Thanks

I am using a cache script; it writes an array to a file like this:

Code:
fwrite($fh, '<?php'."\n\n".'define(\'PUN_LOTTERY_LOADED\', 1);'."\n\n".'$lottery = '.var_export($output2, true).';'."\n\n".'?>');

$output2 is built like this:

Code:
$result2 = $db->query('MY QUERY ');
$output2 = array();
while ($cur_donors = $db->fetch_assoc($result2))
    $output2[] = $cur_donors;

Now I want to ditch the MySQL part and use this script with 7 variables I already have loaded, so I don't need MySQL at all. How do I pass my 7 variables to the var_export() call instead of looping over MySQL results?

How can you prevent your browser from caching a page? I think my news feed is being cached, which causes new posts not to be displayed even after a refresh. Eventually the posts will show up, after about 20 refreshes. I know my query is right, so I didn't know if there was a way to stop page caching. I don't really know if it is a caching issue, though. I'm starting to think I just have some weird bug in my code, because sometimes on refresh news feed items get removed, then I'll refresh again and some get added back in, then I'll refresh again and they're all there. Ever heard of this happening?

Hi, I just started experimenting with file caching after reading this article: http://www.rooftopsolutions.nl/blog/107 Aside from data which changes frequently, I was wondering if it's ever inappropriate to cache your queries. For example, I have a page with a ridiculous number of queries; some of them have result sets that are not nearly as large as others. Page performance increased, just not to the level I had been expecting, and I'm wondering whether having Apache read each of these cache files is in fact faster than running a number of queries with only some of the data being cached.

Let's say I have 100,000 users on my forum. I want to cache their PROFILE INFO (About me) in .php files, which is easy enough, but would that take more space than 100k rows in a db table, or would the 100k .php files? Just wondering, probably a stupid question, but at the moment I cache some lottery info and some other stuff, and that's only one .php file. It would be dumb to cache this info and have 100k .php files, one for each user ID, right? Or maybe store it all in one .php file? That would be a HUGE file size. Better to just keep the data in MySQL, right?

Code:
<?php
$myData = file_get_contents("");
$myObject = json_decode($myData);
$myObjectMap = $myObject->result;
?>

Can I somehow build in that it only makes the request every 5 minutes? Because if there are many users on the site, it makes too many requests.

What I have is a PHP page that runs over 60 queries per visit and has over 2000 visits a day. That's thousands of queries, and I'm sure this can easily be simplified to lessen the load. I only need to update the data on the page every 12 hours. So what I'm thinking is that it would be best to run the queries based on time() (every 12 hours) and store that data in a .txt file. Then, instead of running the queries over and over, the PHP file just extracts the data from the text file. Does this help me at all, or is it useless? Is there a better method? Thanks!
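For the last question (and the every-5-minutes JSON case a few posts up), a simple time-based file cache can be built around filemtime(): if the cache file is newer than the chosen lifetime, read it; otherwise rebuild it. The cache path, the 12-hour lifetime and the buildPageData() helper are assumptions for illustration.

Code:
<?php
// Time-based file cache: regenerate at most once every 12 hours.
define('CACHE_FILE', __DIR__ . '/cache/page-data.txt'); // hypothetical location
define('CACHE_LIFETIME', 12 * 3600);                    // 12 hours in seconds

// Hypothetical stand-in for the 60+ queries the page currently runs on every visit.
function buildPageData()
{
    $data = array();
    // ... run the queries here and fill $data ...
    return $data;
}

if (is_file(CACHE_FILE) && (time() - filemtime(CACHE_FILE)) < CACHE_LIFETIME) {
    // Fresh enough: read the cached copy instead of hitting the database.
    $data = json_decode(file_get_contents(CACHE_FILE), true);
} else {
    // Stale or missing: rebuild it and rewrite the cache file.
    $data = buildPageData();
    file_put_contents(CACHE_FILE, json_encode($data), LOCK_EX);
}

// $data is now available to the rest of the page.

The same pattern with a 300-second lifetime covers the "only request every 5 minutes" case for the json_decode() snippet above.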
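And going back to the var_export() cache-file question a few posts up: if the seven values are already sitting in local variables, compact() can gather them into one array (the variable names and the file path below are placeholders) and that array can be exported exactly as the MySQL loop result was.

Code:
<?php
// Assume these seven variables are already loaded elsewhere; the names are placeholders.
$lottery = compact('jackpot', 'winner', 'ticket_count', 'draw_date', 'prize1', 'prize2', 'prize3');

// Write the cache file in the same format as before, just without the database loop.
$fh = fopen('cache/cache_lottery.php', 'wb');
fwrite($fh, '<?php' . "\n\n"
    . 'define(\'PUN_LOTTERY_LOADED\', 1);' . "\n\n"
    . '$lottery = ' . var_export($lottery, true) . ';' . "\n\n"
    . '?>');
fclose($fh);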