PHP - Caching Files Cause Extra Server Hit.
EDIT: When starting this post, I thought it was causing the browser to make an extra request to the server. I've since found this wasn't the case; however, I didn't change the title of this post, and I can't seem to change it to something like "Critique of file caching script".
I am trying to cache a file and have put together the following script. I enter the following URL in the browser:
https://test.sites.example.com/administrator/index.php?cid=2&controller=sell&id=643341356... and Apache will rewrite it as: https://test.sites.example.com/index.php?admin=administrator&cid=2&controller=sell&id=643341356

index.php includes the following line:

Code: [Select]
<script src="/lib/js/clientConstants.php?b=1" type="text/javascript">

Similar Tutorials

I have some fairly small text files (2K) which are parsed so that certain delimited fields are replaced with values provided in an associative array. There will typically be fewer than 5 replaced fields, and I plan on using preg_replace_callback to find and replace them.
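A minimal sketch of the kind of replacement I mean, assuming the fields are delimited like {{field_name}} (the actual delimiter isn't specified above, and the function name is just for illustration):

Code: [Select]
<?php
// Sketch only: assumes placeholders look like {{field_name}} and
// $values is the associative array of replacement values.
function parse_template($file, array $values)
{
    $text = file_get_contents($file);

    return preg_replace_callback(
        '/\{\{(\w+)\}\}/',
        function ($matches) use ($values) {
            // Fall back to the original placeholder if no value was supplied.
            return isset($values[$matches[1]]) ? $values[$matches[1]] : $matches[0];
        },
        $text
    );
}

// Example usage:
// echo parse_template('letter.txt', array('name' => 'Sam', 'date' => '2011-05-01'));
?>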
I am contemplating caching the files after being parsed (note that they will only be accessed by PHP, and not directly by Apache).
Do you think doing so will provide any significant performance improvement?
If I do go this route, I would like thoughts on how to proceed. I am thinking something like the following (a rough sketch in code follows the list):
Concatenate the filename with the serialized value of the array, and hash the result using md5().
Store the parsed file using name "file_".$hash
Get the modification time of the newly created file using filemtime(), and store the value in a new file called "time_".$hash.
bla bla bla
When the next request comes in to parse a file, create the hash again.
If a cached file exists for the given hash name and the stored time matches filemtime(), use that file; otherwise parse the original file.
Is this a good approach?
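Here is a rough sketch of those steps, assuming the stored timestamp is the original file's filemtime() (so the cache is rebuilt whenever the source file changes) and reusing the parse_template() sketch from above; the cache directory is made up:

Code: [Select]
<?php
// Sketch of the proposed cache, not production code.
function get_parsed($file, array $values, $cacheDir = '/tmp/parse_cache')
{
    $hash      = md5($file . serialize($values));
    $cacheFile = $cacheDir . '/file_' . $hash;
    $timeFile  = $cacheDir . '/time_' . $hash;

    // Reuse the cached copy only if the original file hasn't changed since it was built.
    if (is_file($cacheFile) && is_file($timeFile)
        && (int) file_get_contents($timeFile) === filemtime($file)) {
        return file_get_contents($cacheFile);
    }

    $parsed = parse_template($file, $values);   // the preg_replace_callback step

    file_put_contents($cacheFile, $parsed);
    file_put_contents($timeFile, filemtime($file));

    return $parsed;
}
?>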
Hi folks, I'm curious if I can, for example, save a file from my server and have it save to all other servers - obviously only if they accepted the connection first. It's for software I developed that is almost complete, and I know there will be frequent updates to it. Instead of users downloading updates, I want the update files from my server to somehow synchronize to their servers automatically. Is there a name for this? Thanks for the info.

What I'm trying to do is copy all files from one server to another folder on another server. Here is what I have so far:

Code: [Select]
<?php
// connection settings
$ftp_server = "server";
$ftp_user_name = "user";
$ftp_user_pass = "pass";
$dir = "/var/test/";
$destination_file = "/test/";

// set up basic connection
$conn_id = ftp_connect($ftp_server);

// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

// check connection
if ((!$conn_id) || (!$login_result)) {
    echo "FTP connection has failed!";
    echo "Attempted to connect to $ftp_server for user $ftp_user_name";
    exit;
} else {
    echo "Connected to $ftp_server, for user $ftp_user_name";
}

if (ftp_chdir($conn_id, $dir)) {
    echo " <br/>Current directory is now: " . ftp_pwd($conn_id) . "\n<p/>";
} else {
    echo "Couldn't change directory\n<p/>";
}

$buff = ftp_rawlist($conn_id, $dir);
foreach ($buff as $files) {
    echo $files . "<br/>";
    // note: this downloads the same remote file ($dir . "test.txt") on every pass
    if (ftp_get($conn_id, $destination_file . "test.file", $dir . "test.txt", FTP_BINARY)) {
        echo "<br/>Successfully written to $destination_file\n";
    } else {
        echo "There was a problem\n";
    }
}
?>

That doesn't work. Any ideas? Thanks, Sean

Hey All, first post so please be gentle! I have a website, www.antiquesattic.co.uk, and I have been left in the lurch without it being fully finished. I want to add Google Analytics code to every page and I am a bit stuck. I have access to the server (Apache 2.2.19) and have been able to add site maps etc., but I am not sure where to paste the analytics code. Which folder would I look in, and whereabouts do I paste it? I would be extremely grateful if someone could give me a quick step-by-step guide. Also, if you would like to have a look at the website and give me some feedback, that would be great! One final thing: I can't add content to the homepage, just auctions. Would it be an easy job to get someone to edit the site so I can add, say, 1000 words of text at the bottom of the homepage from my admin area? Apologies for the noob questions, but my expertise is in sales and marketing, not coding. Thanks, Chris

Just thinking last night, trying to speed some things up: how hard is it to have a text file on the server that can only be edited by PHP? Or is there a way of making a folder unreadable to anyone? That way I could keep all the text files in the non-readable (non-viewable) folder so no user could edit them. James

I have a PHP script that I wrote that enters information into a database and uploads images to the server. What I need to do now is delete the files from the server when the corresponding database records are deleted. How would I go about doing that?
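A minimal sketch of the usual pattern: look up the stored filename before deleting the row, then unlink() the file afterwards. The table name, column name and upload directory below are hypothetical:

Code: [Select]
<?php
// Sketch only: $pdo is an existing PDO connection; table/column names are made up.
function delete_record_and_file(PDO $pdo, $id, $uploadDir = '/var/www/uploads')
{
    // Fetch the stored filename before the row disappears.
    $stmt = $pdo->prepare('SELECT image_file FROM items WHERE id = ?');
    $stmt->execute(array($id));
    $file = $stmt->fetchColumn();

    // Remove the database record.
    $pdo->prepare('DELETE FROM items WHERE id = ?')->execute(array($id));

    // Remove the file if it still exists on disk.
    $path = $uploadDir . '/' . basename($file);
    if ($file && is_file($path)) {
        unlink($path);
    }
}
?>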
I set up functionality for a client that got moved to their server after I wrote and tested the functionality. It does not work on their server, and I am trying to troubleshoot the problem, as I do not have FTP access to the server. I believe I have narrowed it down to one of two things: either I do not know how to access the temp files on their server (Windows IIS), or the security on the server is not allowing me to upload to the temp files.
Here is the form used to upload to the server.
Code: [Select]
<form enctype="multipart/form-data" action="#" method="post">
  <input type="hidden" name="upload_csv" value="TRUE" />
  <p><label for="userfile">Upload CSV:</label>
  <input type="hidden" name="MAX_FILE_SIZE" value="45000000" />
  <input type="file" id="userfile" name="userfile" title="userfile" value="" /></p>
  <p><input type="submit" name="upload_csv" value="Upload CSV" /></p>
</form>

Here is the test code:

Code: [Select]
<?php
$csvfile = $_FILES['userfile']['tmp_name'];
$size    = $_FILES['userfile']['size'];

if ($_POST['upload_csv']) {
    //--------------------------------- validation code here
    if ($problem) {
        echo '<div class="error">Did not pass validation</div>';
        include("upload_csv.inc.php");
    } else {
        if (($handle = fopen($csvfile, "r")) !== FALSE) {
            // this is where the code that imports the data from the CSV into the database goes
            echo '<div class="message"><p>CSV has been opened!</p></div>';
        } elseif (!$csvfile) {
            echo '<div class="error">Problem #1</div>';
        } else {
            echo '<div class="error">Problem #2</div>';
        }
    }
} else {
    include("upload_csv.inc.php");
}
?>

This code generated the error message: Problem #2. One more bit of information: the permissions on the server are set so that nothing can be uploaded by an external script. I do not know much about servers, but it seems to me that because the security is so tight on the server, the security is the reason this does not work...? I appreciate help with this. I am just not knowledgeable about how the $_FILES['userfile']['tmp_name'] code actually works.

Code: [Select]
<?php
exec('java -jar/MicroChatServer.jar', $output);
print_r($output);
?>

Hello friends. I have a chat program .jar file and I have to integrate this chat into a PHP website that uses WampServer. I tried the above code, but it displays "Array()" instead of executing the jar file. What should I do? Any ideas?

I'm trying to find a way to back up an entire server's files through PHP. I have a script that can put selected files into a zip file, but I'm not sure how to make sure that when I loop through directories and files I get every single file. I was thinking of just running foreach(glob('*') as $file) up to 10 times, but I'm hoping there's something more definite than that method... Thanks
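One way to walk a whole tree rather than repeating glob() is PHP's SPL recursive iterators combined with ZipArchive; a sketch, with example paths only:

Code: [Select]
<?php
// Sketch only: back up every file under $sourceDir into one zip archive.
function backup_to_zip($sourceDir, $zipPath)
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }

    $iterator = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($sourceDir, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::LEAVES_ONLY
    );

    foreach ($iterator as $fileInfo) {
        if ($fileInfo->isFile()) {
            $absolute = $fileInfo->getPathname();
            // Store paths relative to the source directory inside the archive.
            $relative = substr($absolute, strlen(rtrim($sourceDir, '/')) + 1);
            $zip->addFile($absolute, $relative);
        }
    }

    return $zip->close();
}

// Example: backup_to_zip('/var/www/html', '/tmp/site_backup.zip');
?>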
I need to count the number of image files on a remote server that is not on my network. I've come across a suggestion to use @getimagesize($img_url). That works, but it is really slow. Is anyone able to suggest a better method?
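getimagesize() has to download each image before it can inspect it. One cheaper alternative is a HEAD request, which fetches only the response headers; a sketch using cURL (the URL list and function name are just for illustration, and some servers handle HEAD poorly, so treat this as an assumption to test):

Code: [Select]
<?php
// Sketch only: returns true if the URL answers 200 with an image/* content type,
// without downloading the image body.
function is_remote_image($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_exec($ch);

    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $type   = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    curl_close($ch);

    return $status === 200 && strpos((string) $type, 'image/') === 0;
}

// Example: $count = count(array_filter($image_urls, 'is_remote_image'));
?>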
I have PHP on the server side and C++ on the client side. What I am trying to do is constantly look for files in a folder on the server; if there is a file in the folder, download it on the client side and then delete it from the server folder.

Hi, I've recently created a gallery website and I'm happy with the way everything currently works. However, the main drawback is that the site uploads using an HTML web form, which is great for remote users or the odd image. As I want to mass upload my existing collection, I will need the ability to read a selected folder and then carry out all the same processes as the existing HTML form upload. I'm also using the GD library and checking file types to ensure they are within my allowed list, but I'm wondering if there are any other common security concerns I should be aware of to keep things a little safer if/when I publish outside of my LAN. So, in a nutshell, I need some assistance with changing my upload process to work for more than one file at a time, ideally by reading a folder, or at least reading X files at a time, processing them, then moving on to the next batch of files in the list. The next part I need help with is checking/improving the basic security of the system.

We are using a FileZilla server for our upload FTP server and we have encountered a problem uploading files. When we upload files of 1KB - 1.5MB it succeeds, but when we upload a file larger than that, an error says: "Filename cannot be empty in C:\wamp\www\gankgame\myaccount.php on line 53". Furthermore, when we upload a file much larger than that, like 30MB+, it will not upload but shows no errors. Is there any way to configure the maximum upload size, or to fix these errors? Help us please. Our upload code is attached; please see the attachment.
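The size cut-off and the "Filename cannot be empty" warning usually point at PHP's own upload limits rather than FileZilla: when a POST exceeds the limit, $_FILES arrives empty. The relevant php.ini directives look like this (the values below are examples only, not a recommendation for this server):

Code: [Select]
; php.ini - example values only; pick limits that suit the server
upload_max_filesize = 64M
post_max_size = 64M        ; must be at least as large as upload_max_filesize
max_execution_time = 300   ; give large uploads time to finish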
My server is Linux/Apache/PHP.
When a file is uploaded, I use PHP's finfo_open to confirm that the detected MIME type matches the file extension, and I delete the file if it doesn't match. I also restrict which file MIME types and sizes may be uploaded.
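A sketch of the kind of check described above, comparing the finfo-detected MIME type against a whitelist keyed by extension; the whitelist, size limit and function name are only examples:

Code: [Select]
<?php
// Sketch only: $tmpPath is the uploaded temp file, $originalName the client-supplied name.
function upload_is_acceptable($tmpPath, $originalName, $maxBytes = 2097152)
{
    // Example whitelist: extension => expected MIME type.
    $allowed = array(
        'jpg' => 'image/jpeg',
        'png' => 'image/png',
        'pdf' => 'application/pdf',
        'zip' => 'application/zip',
    );

    if (filesize($tmpPath) > $maxBytes) {
        return false;
    }

    $ext = strtolower(pathinfo($originalName, PATHINFO_EXTENSION));
    if (!isset($allowed[$ext])) {
        return false;
    }

    // Ask finfo for the real MIME type and require it to match the extension.
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    $mime  = finfo_file($finfo, $tmpPath);
    finfo_close($finfo);

    return $mime === $allowed[$ext];
}
?>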
Things I do with the files include:
Upload users' files and store them in a public directory (/var/www/html/users_public_directory/), and allow other users to download them directly.
Upload users' files and store them in a private directory (/var/www/users_private_directory/), and allow other users to download them using X-Sendfile (see the sketch after this list).
Upload users' ZIP files and convert them to PDF files (unzip the ZIP file, then use LibreOffice and ImageMagick's convert to produce the PDFs).
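A sketch of the X-Sendfile hand-off mentioned above, assuming Apache's mod_xsendfile is enabled; the filename handling is only illustrative:

Code: [Select]
<?php
// Sketch only: let Apache (mod_xsendfile) stream a file that lives outside the web root.
$file = basename($_GET['file']);                       // never trust a raw client-supplied path
$path = '/var/www/users_private_directory/' . $file;

if (!is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('X-Sendfile: ' . $path);                        // Apache reads and sends the file itself
exit;
?>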
From the server's perspective, what are the risks of allowing users to upload files? Are some file types more dangerous to the server than others? Could they be executed on the server, and if so, how can this be prevented?
I am developing a CMS for my clients where the login to administer their website is through my own website. This unified back-end allows me to develop the functionality and user experience without the issue of updating each site manually with the new files, or running a complicated macro etc. and then uploading them all individually. I would also like to combine as much of the "client side" CMS code as possible, i.e. as many of the PHP files that are stored on the client's website as possible. All of the websites (and the back-end website) are located on the same VPS, running PHP 5.3. allow_url_include is not available; I could have it turned on, but this causes security issues if I understand correctly? I figure my main options are:

- Turn on allow_url_include and just include via IP/domain.
- Use an FTP system similar to the way WordPress allows remote install of plugins and system updates.
- Include the files using a different /home/username/cmsincludes/ path?

Is it possible to include from another VPS account? I have tested it and it gives me a "failed to open stream" error. Example:

Client xyz website
Website located: /home/xyz/public_html/
CMS included: /home/mainsite/cms/

Client 123 website
Website located: /home/123/public_html/
CMS included: /home/mainsite/cms/

Client xxx website
Website located: /home/xxx/public_html/
CMS included: /home/mainsite/cms/

Hey, friends. I have some trouble on the server front. My sites have been hacked, and I need to make sure I've eradicated every trace of this exploit. I'm looking for a way to search for any and all PHP files contained in multiple directories with specific names. For instance, I have found a commonality in where these malicious files are placed, such as:

Code: [Select]
/some/dir/img/somename.php

or:

Code: [Select]
/some/dir/js/somename.php

Is there a way I can easily (e.g. using SSH and the "find" command) locate all files ending in .php but only found in directories named "img"? I can't seem to find anything that would allow me to do this with find, or with a combination of find and grep. I can't go directory by directory, as some of these img directories are created many levels deep, some even in .svn directories. Any and all help is appreciated. Hackers suck.

Hello, I just had a quick question about caching when the compiler compiles PHP into assembly (or however this works with PHP). Anyway, is there any efficiency gain in doing this:

Code: [Select]
$holder = strlen($anArray);
for (i=0 to 100) echo $holder;

rather than this:

Code: [Select]
for (i=0 to 100) echo strlen($anArray);

? If PHP is smart, then strlen($anArray) would be called once, and any subsequent calls would just return a value stored in a cache. Is this how PHP works?

Hi! I was a little confused and am not able to figure out how to use caching on the server side and the client side. How do I use caching with PHP?

What I have is a PHP page that runs over 60 queries a visit and has over 2000 visits a day. That's thousands of queries, and I'm sure this can be simplified easily to lessen the load. I only need to update the data on the page every 12 hours. So, what I'm thinking is that it would be best to run the queries based on time() (every 12 hours) and store that data in a .txt file. Then the PHP file, instead of running the queries over and over, just extracts the data from the text file. Does this help me at all, or is it useless? Is there a better method? Thanks!
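A sketch of that 12-hour idea: run the queries only when the cache file is older than 12 hours, otherwise read the stored copy. The cache path and the fetch function are placeholders standing in for whatever currently runs the 60 queries:

Code: [Select]
<?php
// Sketch only: $fetchFromDatabase is the name of whatever function runs the queries now.
function get_page_data($cacheFile, $maxAge, $fetchFromDatabase)
{
    // Serve the cached copy while it is still fresh.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Otherwise rebuild it from the database and store the result.
    $data = $fetchFromDatabase();
    file_put_contents($cacheFile, serialize($data), LOCK_EX);

    return $data;
}

// Example: $data = get_page_data('/tmp/page_cache.txt', 12 * 3600, 'run_all_queries');
?>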
Code: [Select]
<?php
$myData      = file_get_contents("");
$myObject    = json_decode($myData);
$myObjectMap = $myObject->result;
?>

Can I somehow build in that it only makes the request every 5 minutes? If there are many users on the site, it makes too many requests.
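One way is to keep the raw JSON in a local file and only hit the remote API when that file is more than five minutes old; a sketch built around the snippet above (the API URL is left blank as in the original, and the cache location is hypothetical):

Code: [Select]
<?php
// Sketch only: cache the remote JSON for 300 seconds so every visitor
// doesn't trigger a fresh request. The API URL is omitted, as in the post above.
$apiUrl    = "";
$cacheFile = '/tmp/api_cache.json';   // hypothetical cache location
$maxAge    = 300;                     // five minutes

if (!is_file($cacheFile) || (time() - filemtime($cacheFile)) >= $maxAge) {
    $fresh = file_get_contents($apiUrl);
    if ($fresh !== false) {
        file_put_contents($cacheFile, $fresh, LOCK_EX);
    }
}

$myData      = file_get_contents($cacheFile);
$myObject    = json_decode($myData);
$myObjectMap = $myObject->result;
?>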