PHP - Avoiding Accidental Resubmission?
When a user submits data through a textarea input and then refreshes the page, the data will be re-submitted and inserted into the database a second time.

I used to avoid this with a header redirect, but that solution is no longer an option for me, since I need to echo output in header.php, which I have to include before the submission area. Another weak point of the header-redirect solution is that the user could go back a page and then hit refresh. I'm wondering how other sites avoid resubmission. When you post a comment on YouTube, it simply freezes the submit button after you've posted; I'm guessing they achieved that with JavaScript. Is that solution recommended? And what other techniques do people generally use to avoid resubmission on page refresh?
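For reference, the standard fix is Post/Redirect/Get combined with a one-time form token: the form carries a random token stored in the session, the handler inserts only if the token matches (and then discards it), and redirects afterwards. Even when a redirect is awkward, the consumed token turns a refresh or back-button resubmit into a no-op. Freezing the button with JavaScript, YouTube-style, is a nice extra but not a safeguard on its own, since it can be bypassed. A minimal sketch; form.php and save.php are placeholder names:

Code:
<?php
// form.php -- issue a one-time token along with the form
session_start();
$token = md5(uniqid(mt_rand(), true));
$_SESSION['form_token'] = $token;
?>
<form action="save.php" method="post">
    <input type="hidden" name="token" value="<?php echo $token; ?>" />
    <textarea name="comment"></textarea>
    <input type="submit" value="Post" />
</form>

Code:
<?php
// save.php -- insert only while the token is fresh, then consume it
session_start();
if (isset($_POST['token'], $_SESSION['form_token'])
        && $_POST['token'] === $_SESSION['form_token']) {
    unset($_SESSION['form_token']); // consumed: a refresh can't replay the POST
    // ... insert $_POST['comment'] into the database here ...
}
// Redirect anyway, so F5 re-requests a GET rather than the POST
header('Location: form.php');
exit;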
Similar Tutorials:

I am trying to build an app which will scan a site multiple times; the only problem is the 403 error. How do I get around this? Searching seems to imply cURL or user_agent, but I can't get it working. Any suggestions? Thanks.
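On the 403 question: many sites refuse requests that lack a browser-like User-Agent header, so setting one through cURL is usually the first thing to try (assuming the site permits scanning at all). A minimal sketch; the URL and User-Agent string are placeholders:

Code:
<?php
$ch = curl_init('http://example.com/page-to-scan');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
// Servers often answer 403 to requests with no browser-like User-Agent
curl_setopt($ch, CURLOPT_USERAGENT,
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36');

$html = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

echo ($code == 403) ? 'Still blocked' : $html;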
Excuse the beginner question, but how do you avoid the processing of HTML when text is inputted, without stripping away the tags or trimming the text in any way, simply leaving it as is?
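On the HTML question: the usual approach is to store the text exactly as entered and escape it on output with htmlspecialchars(), so the browser shows the tags literally instead of rendering them and nothing is stripped. A minimal sketch:

Code:
<?php
$input = $_POST['comment']; // e.g. "<b>hello</b>", stored as-is

// Escape only on output: tags display as text instead of being parsed
echo htmlspecialchars($input, ENT_QUOTES, 'UTF-8');
// The browser shows the literal string <b>hello</b>, not bold text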
If I have a web page located at www.company.com/how-to-repair-your-computer.html and I decide to restructure my website like this: www.company.com/articles/how-to-repair-your-computer.html, how do I make sure that people don't search and end up at the old, now broken, link?
It seems inevitable that as a website grows, you will want to reorganize things. What is the best way to make sure that anyone who searches or clicks on an old link, say from an email from a friend, doesn't get a 404 error?
Also, how do you avoid ruining a web page's rank on Google after you move things? (I think if the URL changes, Google makes you start all over as far as getting listed on page 1 and all of that.)
Is this something I have to handle on my end, or is it a Google issue, or something else?
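The usual mechanism for both problems is a permanent (301) redirect from the old URL to the new one: visitors following the old link land on the new page, and search engines treat a 301 as "this page has moved", carrying most of the old URL's ranking over rather than starting from scratch. On Apache this is often a Redirect 301 line in .htaccess; a PHP version served at the old location looks like this (minimal sketch using the URLs from the question):

Code:
<?php
// Served at the old URL: tell browsers and search engines it moved for good
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://www.company.com/articles/how-to-repair-your-computer.html');
exit;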
As the title says, I have a .txt file with about 30,000+ lines of data in a pipe-delimited list, which I'm parsing and inserting into my database. The problem is, my server always seems to time out when I try to parse the whole file at once. I'm sure it would ordinarily run without any timeout errors, but I'm ensuring the data is XSS-clean before it's inserted, and I'm doing that on about 15 items on each line, which means I'm calling the XSS-clean function over 450,000 times in one execution. So my friend suggested I break the file down into files of 5,000 lines of data each, which would mean I'd generate about 6 files (if I had 30,000 lines of data). I've managed to code a script that breaks the main file into several files. Now what I want to do is pass each of those files to my parser method, but I'd like to do them one by one, rather than in one execution, as I want to avoid the timeout error. Any ideas?
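One common pattern for this: process one chunk file per HTTP request, then redirect to the same script for the next chunk, so no single execution gets near the time limit (fine for a handful of chunk files; browsers cap long redirect chains). A rough sketch; the chunks/chunk_N.txt naming and the parseFile() function are placeholders for your generated files and your parser method:

Code:
<?php
// process_chunks.php?chunk=0 -- handles one chunk file per request
$chunk = isset($_GET['chunk']) ? (int) $_GET['chunk'] : 0;
$file  = "chunks/chunk_{$chunk}.txt"; // placeholder naming scheme

if (!file_exists($file)) {
    echo 'All chunks processed.';
    exit;
}

parseFile($file); // placeholder: your existing parser + XSS-clean routine

// Hand off to a fresh execution for the next chunk
header('Location: ' . $_SERVER['PHP_SELF'] . '?chunk=' . ($chunk + 1));
exit;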
I've got a script where the client can upload pictures. The pictures are then resized, thumbnailed, and added to the database. In the process I'm trying to search the database for a duplicate file name and create a new name if necessary:

Code:
$userfile = 'userfile' . $i;
$tmpLoc = $_FILES[$userfile]['tmp_name'];
$name = $_FILES[$userfile]['name'];
$error = $_FILES[$userfile]['error'];
$type = $_FILES[$userfile]['type'];
$temp = 'album' . $i;
$album = $_POST[$temp];

if ($error > 0) {
    echo "Error on $name: ";
    switch ($error) {
        case 1: echo 'File exceeded upload_max_filesize'; break;
        case 2: echo 'File exceeded max_file_size'; break;
        case 3: echo 'File only partially uploaded'; break;
        case 4: echo 'No file uploaded'; break;
    }
    echo '</div>';
    exit;
}

// Map the MIME type to an extension once, before the loop
if ($type == 'image/gif') $ext = '.gif';
else if ($type == 'image/jpeg') $ext = '.jpg';
else if ($type == 'image/png') $ext = '.png';
else die('Error: Unsupported file type');

// Check for name duplicates. The name must be quoted/escaped in the SQL,
// and the test must count rows: mysql_query() returns a result resource
// even when zero rows match. (The unquoted $name made the query fail, so
// $dup was never set and the loop never ran.)
$query = "SELECT * FROM pictures WHERE src = '" . mysql_real_escape_string($name) . "'";
$result = mysql_query($query);
$dup = ($result && mysql_num_rows($result) > 0);

$x = 0; // the counter must live outside the loop, or it resets on every pass
while ($dup) {
    echo "Duplicate file name $name <br />";
    $name = $x . $ext;
    echo "Checking $name <br />";
    $query = "SELECT * FROM pictures WHERE src = '" . mysql_real_escape_string($name) . "'";
    $result = mysql_query($query);
    if ($result && mysql_num_rows($result) == 0) {
        $dup = false;
        echo "File successfully renamed to $name to avoid duplicate <br />";
    }
    $x++;
}

I don't get any errors of any sort; it just never enters the loop.

I have the following code:

Code:
$fp = fopen('path_to_file', 'a');
flock($fp, LOCK_EX);
fwrite($fp, $string);
flock($fp, LOCK_UN);
fclose($fp);

If I try to lock the file in two different places at the same time, this will cause a race condition. How can I prevent this? I know that Java, for example, has a concurrency library containing a reentrant lock, which basically tries to get the lock and, if it can't, waits. What can I do in PHP?
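Worth noting for the code above: flock() with LOCK_EX is already a blocking lock on most platforms, so two scripts that both take the lock before writing will queue up rather than race; trouble only starts if some writer skips flock() entirely. If you want Java-style try-then-wait semantics with a timeout, you can poll with LOCK_NB. A minimal sketch, with the retry count and sleep interval as arbitrary choices:

Code:
<?php
// Try to take an exclusive lock, waiting up to ~5 seconds before giving up
function write_with_lock($path, $string, $maxTries = 50) {
    $fp = fopen($path, 'a');
    if ($fp === false) {
        return false;
    }
    // LOCK_NB makes flock() return immediately instead of blocking
    $tries = 0;
    while (!flock($fp, LOCK_EX | LOCK_NB)) {
        if (++$tries >= $maxTries) {
            fclose($fp);
            return false; // could not get the lock in time
        }
        usleep(100000); // wait 100 ms, then try again
    }
    fwrite($fp, $string);
    fflush($fp); // push the data out before releasing the lock
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}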
I have questions, but I also have some good info to share about putting your software in the cloud.

The situation: some of you may have read the nightmare stories. A developer had an infinite loop in his code that ran all night. The code did things inefficiently that devoured CPU in each iteration. This developer, greatly skilled, opened his email the next morning and saw a bill from his cloud provider totaling $75K. True story. Most cloud providers let you define CPU usage thresholds that, when breached, send you a warning, but these thresholds, if I understand them, are per account. The claim that cloud resources are available in whatever amounts you need (CPU, disk space, enough RAM to never have to wait on a page fault, etc.), that the cloud provides you with infinitely elastic resources in an "all you can eat for one price" contract, smells just a little like 💩.

I did cloud development for many years with ServiceNow, starting when it was a help desk, and I watched it evolve into one of the best cloud development platforms out there. At one customer site I installed and managed it out of the cloud and saw its insides, and I can tell you its core code is not so terribly efficient. IMHO the cloud DOES take away 90% of a developer's worries about app performance. If you call ServiceNow tech support and your problem is diagnosed as a performance issue with your code, the first thing they will ask you is "did you follow the developer best practices?" They will politely say "sorry, here's a link to them", implying "fix your code".

Questions:
1. PHP functions that devour CPU, and where there is a better way: what PHP functions or code techniques waste CPU? I am using similar_text and it does the job, but it is slow. Better way?
2. What is the best way to measure the CPU used by a PHP script, by a particular code module (defined as a set of related functions that fulfill a common purpose), or by a single line of code? The purpose is to identify inefficient modules of code and improve them; even if the code is damn near perfect, at least I can know which code modules are the most expensive.
3. CPU-killing users (and developers too): how can they be identified? I need to store data on cumulative CPU usage for any of the above, compare it with the free amount they give you, and warn CPU hogs before they breach a threshold and generate $75K bills that were not in the plan.

Any info you have on avoiding surprise $75K CPU bills from a cloud provider is welcome.
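On the measurement question: before reaching for a full profiler such as Xdebug or XHProf, a starting point is PHP's getrusage(), which exposes the process's CPU counters (on Linux/Unix), so you can bracket a suspect code module and diff the counters around it. A minimal sketch; the similar_text() loop is just a stand-in workload:

Code:
<?php
// Snapshot user + system CPU time for this process, in seconds
function cpu_seconds() {
    $u = getrusage();
    return $u['ru_utime.tv_sec'] + $u['ru_utime.tv_usec'] / 1e6
         + $u['ru_stime.tv_sec'] + $u['ru_stime.tv_usec'] / 1e6;
}

$cpuBefore  = cpu_seconds();
$wallBefore = microtime(true);

// ... the code module under test; similar_text() as a stand-in workload
for ($i = 0; $i < 1000; $i++) {
    similar_text(str_repeat('a', 200), str_repeat('b', 200), $pct);
}

printf("CPU: %.4fs, wall clock: %.4fs\n",
       cpu_seconds() - $cpuBefore,
       microtime(true) - $wallBefore);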