PHP - Initializing Same Large Array For Many Web Pages
Hello,
I have a large array (1,000 to 2,000 elements) where each element has a unique value, and I need to reference this array on every web page. My plan was to assign all the array elements in one file and then include it in each page. It seems like repeating the array assignment for every page would take up a lot of processing. Is there a better way to do this in PHP? Thanks in advance.
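In practice a plain include is usually fine, since an opcode cache compiles the included file once; but if profiling shows the rebuild cost matters, a shared-memory cache such as APCu lets every request reuse the already-built array. A minimal sketch, assuming the APCu extension is available (the file and key names are hypothetical):

Code:
<?php
// Every page calls get_lookup_array() instead of re-declaring the array.
function get_lookup_array()
{
    $key = 'site_lookup_array'; // hypothetical cache key

    if (function_exists('apcu_fetch')) {
        $cached = apcu_fetch($key, $hit);
        if ($hit) {
            return $cached; // already built by an earlier request
        }
    }

    // lookup_data.php is assumed to end with: return array( ... );
    $data = include __DIR__ . '/lookup_data.php';

    if (function_exists('apcu_store')) {
        apcu_store($key, $data, 3600); // keep it for an hour
    }

    return $data;
}
?>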
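For the duplicate-phone problem just above: in_array() scans the whole array on every call, which makes the overall job O(n^2) on 100,000 rows. Keying a lookup table by the phone number turns each check into a hash lookup, and the same isset() trick answers the 15,000-element membership question two posts up. A minimal sketch, assuming rows shaped like the example (phone at index 1):

Code:
<?php
// Keep only the first row seen for each phone number.
// isset() on an array key is O(1), so the whole pass is O(n).
function dedupe_by_phone(array $rows)
{
    $seen = array();
    $unique = array();
    foreach ($rows as $row) {
        $phone = $row[1]; // phone is the second column
        if (!isset($seen[$phone])) {
            $seen[$phone] = true;
            $unique[] = $row;
        }
    }
    return $unique;
}
?>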
I have a list of IP addresses in a very, very large array, in random order. I am trying to find a way to determine if there are at least 8 IP addresses that are consecutive. I was thinking of some sort of for loop after a sort, but that seems very time-consuming and I'm sure PHP has a better way... Some searching online revealed this: http://bytes.com/topic/php/answers/12143-flagging-consecutive-numbers-data-set Is this the way to go? Any tips? Thanks! Carmen

Hello, I have a social network that allows users to make comments and replies to comments. A hacker has attacked us a couple of times in the past couple of weeks by using our comment system manually, so I am trying to implement a limit on the number of comments a user can make per day. This is what I have so far: the query that counts comments for the current date, and the query that inserts the comment. I'm just trying to figure out how to tie those together, what kind of if statement to use, and what to put in the if to make it work (a sketch follows below).

Code:
// QUERY TO FIND THE COUNT
$query_count = "SELECT COUNT(*) FROM `CysticAirwaves` WHERE `FromUserID` = $auth->id AND date = `CURDATE()`";
$request = mysql_query($query_count,$connection);
$result = mysql_fetch_array($request);

// IF SHOULD GO HERE

// INSERT THE COMMENT
$query = "INSERT INTO `CysticAirwaves` (
    `FromUserID`,
    `ToUserID`,
    `comment`,
    `status`,
    `statusCommentAirwave`,
    `date`,
    `time`
) VALUES (
    '" . $auth->id ."',
    '" . $prof->id ."',
    '" . mysql_real_escape_string($_POST['ProfileComment']) ."',
    'active',
    'active',
    '" . date("Y-m-d") . "',
    '" . date("G:i:s") . "')";
mysql_query($query,$connection);
if($auth->id == $prof->id) {
    $just_inserted = mysql_insert_id();
    $query = "UPDATE `CysticAirwaves` SET `status` = 'dead' WHERE `FromUserID` = '" . $auth->id . "' AND `ToUserID` = '" . $prof->id . "' AND `id` != '" . $just_inserted . "'";
    $request = mysql_query($query,$connection);
}
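One way to tie the two queries together: run the COUNT first, and only perform the INSERT when the user is under the cap. A minimal sketch (the 20-per-day limit is hypothetical; note also that CURDATE() must not be wrapped in backticks, or MySQL treats it as a column name):

Code:
<?php
$max_per_day = 20; // hypothetical daily cap

$query_count = "SELECT COUNT(*) FROM `CysticAirwaves`
                WHERE `FromUserID` = '" . $auth->id . "' AND `date` = CURDATE()";
$request = mysql_query($query_count, $connection);
$row = mysql_fetch_row($request);

if ($row[0] < $max_per_day) {
    // under the limit: safe to run the INSERT query shown above
} else {
    // over the limit: skip the insert and tell the user why
}
?>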
I have the registration form blues! Form 1 collects the classes chosen (on a previous page) and passes an array of them in a hidden field:

Code:
<input type="hidden" value="<?= $c_row['workshop_id'] ?>" name="wid[]" />

Form 2 makes it a variable:

Code:
$wid = $_POST['wid'];

I'm trying to pass it to form 3 using the hidden input again, this time as a variable:

Code:
$wid = $_POST['wid'];
<input type="hidden" value="<?= $wid ?>" name="wid[]" />

The next page is a printable page, but I can't do anything with the classes, because it has somehow made an array of an array??? I wrote:

Code:
$wid = $_POST['wid'];
print_r($wid);

This produces:

Code:
Array ( [0] => Array )

I want it to give me the original array. Is there a different way to do this?

Following is the form. Now I want to see if the submit button was pressed or not. Usually, if the HTML had

Code:
<input type="submit" value="Submit" name="submit" />

I would simply use

Code:
<?php if(isset($_POST['submit'])) ?>

But below is a form which has an image as the submit button:

Code:
<form action="" method="post" id="sendemail">
    <ol>
        <li>
            <label for="name">Name (required)</label>
            <input id="name" name="name" class="text" />
        </li>
        <li>
            <label for="email">Email Address (required)</label>
            <input id="email" name="email" class="text" />
        </li>
        <li>
            <label for="email">Phone number</label>
            <input id="phone" name="phone" class="text" />
        </li>
        <li>
            <label for="address">Address (required)</label>
            <input id="address" name="address" class="text" />
        </li>
        <li>
            <label for="city">City (required)</label>
            <input id="city" name="city" class="text" />
        </li>
        <li>
            <label for="state">State (required)</label>
            <input id="state" name="state" class="text"/>
        </li>
        <li>
            <label for="zipcode">Pincode/Zipcode (required)</label>
            <input id="zipcode" name="zipcode" class="text" />
        </li>
        <li>
            <input type="image" name="submit" id="imageField" src="images/submit.gif" class="send" />
            <div class="clr"></div>
        </li>
    </ol>
</form>

How should I check if the form was submitted when there is an image instead of a submit button?

Alternative solution: I tried sending it to another page, but then I had to send an array back to this page, and I didn't know how to do that either. I tried

Code:
<?php
$error = serialize($error);
echo "<meta http-equiv='refresh' content='0;url=about.php?array=".$error."'>";
?>

But even after

Code:
$array = unserialize($array);

$array contained nothing. Can someone please show me how to send an array between pages, for future reference?
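For the image-button form above: browsers submit an <input type="image"> as the click coordinates, so PHP receives submit_x and submit_y rather than a reliable $_POST['submit']. Checking for one of those, or simply checking the request method, works. A minimal sketch:

Code:
<?php
// An <input type="image" name="submit"> arrives as $_POST['submit_x'] and
// $_POST['submit_y'] (the click coordinates), not as $_POST['submit'].
if (isset($_POST['submit_x'])) {
    // the form was submitted via the image button
}

// Alternatively, just test how the page was requested:
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // the form was submitted
}
?>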
Hi all. So... I am creating an import script for putting contacts into a database. The script we had worked OK for 500KB / 20k-row CSV files, but anything much bigger than that started to run into the max execution limit. Rather than raise the limit, I want to create something that runs in the background and works as efficiently as possible. Basically: the CSV file is uploaded, then you choose whether duplicates should be ignored or overwritten, and you match up the fields in the CSV (the first line being a field-title row) to the fields in the database. The email address field is singled out, as it is checked for duplicates that already exist in the system. The script then saves these values, along with the filename, and puts it all into an import queue table, which is processed by a cron job. Each batch of the cron job looks in the queue, finds the first incomplete import, then starts work on that file from where it left off last. When the batch is complete, it updates the row to give a pointer into the file for the next batch, plus how many contacts were imported and how many duplicates there were. So far so good, but checking for duplicates is massively slowing down the script. I can run 1,000 lines of the file in 0.04 seconds without checking, but with checking that increases to 14-15 seconds, and gets longer the more contacts are in the DB. For every line it tries to import, it does a SELECT query on the contact table, and although I am not doing SELECT *, it still adds up to a lot of DB activity. One thought was to load every email address in the contacts table into an array beforehand, but this table could be massive, so that's likely to be just as inefficient. Any ideas on optimising this process?

Hi guys, I am currently receiving a large text file (> 500MB) once per week, which I have been manually splitting and then processing to obtain the required CSV files. However, this takes in the region of 2 to 3 hours. Very soon, these files will be sent daily, and I really don't have the time to split and process them every day. I have been playing for a while, trying to parse everything properly/automatically with fopen, feof and fgets (and other 'f' functions), but the script never seems to read the file all the way to the end; I assume this is due to memory usage. The data in the file follows a strict pattern throughout, which is:

Code:
BSNY990141112271112270100000 POO2C35 122354000 DMUS 075 O
BX NTY
LOLANCSTR 1132 11322 TB
LIMORCMSJ 1135 00000000
LICRNFNJN 1140 00000000 H
LICRNF 1141H1142H 11421142 T
LISDAL 1147H1148H 11481148 T
LIARNSIDE 1152H1153 11531153 T
LIGOVS 1158 1159 11581159 T
LIKTBK 1202 1202H 12021202 T
LICARK 1206 1207 12061207 T
LIULVRSTN 1214H1215H 12151215 T
LIDALTON 1223 1223H 12231223 T
LIDALTONJ 1225 00000000
LIROOSE 1229 1229H 12291229 T 2
LTBAROW 1237 12391 TF

That is just one record of information (1 of around 140,000 records). Each record has no fixed number of lines, but each line in each record is fixed at 80 characters, and all lines in a record need to share the same unique id; at present I'm using an md5 hash of microtime. The first line of every record starts with 'BS' and the last line of each record starts with 'LT', terminating with 'TF'. All the other stuff in between also follows a certain pattern, which I can break down effectively. The record above shows one train service schedule, hence why each line in each record needs the same unique id. Anyone got any ideas on how I could process such a file effectively?? Many thanks, Dave
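fgets() only ever holds one line in memory, so even a very large file stays cheap to walk; the trick is to buffer lines between the 'BS' and 'LT' markers and flush each record as it completes. A minimal sketch (process_record() is a hypothetical callback for whatever each record should become, e.g. CSV rows or DB inserts):

Code:
<?php
set_time_limit(0); // a 500MB file will outlive the default 30s limit

$fh = fopen('schedules.txt', 'r');
if ($fh === false) {
    die('could not open file');
}

$record = null;
while (($line = fgets($fh)) !== false) {
    $line = rtrim($line, "\r\n");
    $code = substr($line, 0, 2);

    if ($code === 'BS') {
        // New record: give all of its lines one shared unique id.
        $record = array('id' => md5(uniqid('', true)), 'lines' => array());
    }
    if ($record !== null) {
        $record['lines'][] = $line;
    }
    if ($code === 'LT') {
        process_record($record); // hypothetical: write CSV rows, insert, etc.
        $record = null;          // the record's memory is released here
    }
}
fclose($fh);
?>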
Hi All, I'm having issues uploading files larger than 1MB. This is what I currently have as defaults when I ran phpinfo() (working locally on my machine):

upload_max_filesize: 432M
post_max_size: 432M
memory_limit: 8M
max_input_time: 60
max_execution_time: 30

I'm looking for the file to be converted into a blob. It works perfectly fine for files less than 1MB, but doesn't even run the MySQL query above that. Any ideas, anyone?

Code:
include("../../connect.php");

# these settings should help
set_time_limit(0);

# going in as a blob from now on
$stamp = mktime();
$safename = $_FILES['Filedata']['tmp_name'];
$filename = $_FILES['Filedata']['name'];
$size = $_FILES['Filedata']['size'];
$type = $_FILES['Filedata']['type'];
$fk = $_REQUEST['fk'];
$sqlname = $stamp . "-" . $_FILES['Filedata']['name'];

# open and code in
$fp = fopen($safename, 'r');
$content = fread($fp, filesize($safename));
$content = addslashes($content);
fclose($fp);

$insertS = "INSERT INTO $tableb (pal, afield, bfield, cfield, dfield, efield, ffield, ablob) VALUES ('6', '$fk', '$filename', '$size', '$type', '$width', '$height', '$content')";
$insertQ = mysql_query($insertS);
print "1";

I have a Flash application that talks to upload.php. Say I upload a 500MB file; it will obviously take a little while to upload. Will the max_execution_time setting cause this to fail? It's set at 60 right now, and the upload is obviously taking longer than 1 minute.

Hi guys
Been a while since I've been on here; I completely lost the login details for my old account, which is a shame because it was a well-established one, but hey ho, that's life.
Anyhow, I'm starting a project as of tomorrow, and I was wondering if I could get some advice on the best methods/routes to take.
The Brief
+ 3 full websites
+ no per-site admin
+ one main admin
The Plan
Once all 3 websites are complete, none of them will have the standard admin panel that the average website has. All 3 websites (and this number is going to grow in time) will be run from one central admin held on a separate and isolated server, which in essence will *remotely* administrate the websites connected to it.
The Question
I have never done a project with multiple websites that run from a central admin, so I'd be grateful if anyone could shine some light on the following:
Should I make the admin panel administer the selected website by sending commands through an API (REST/CRUD style), or simply by direct DB access? (A sketch of the API route follows at the end of this post.)
What is the most secure way of building this central admin (bear in mind the admin panel will have a minimum of 10 different admin levels/permission sets)?
How should I lay out the database(s) for this (3 completely separate sites, a number that will grow, plus a central admin site)?
Last question (that I can think of): how much of a mammoth task really is this?
Any help would be greatly appreciated!
Thanks guys
James
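On the API-versus-direct-DB question above: opening every site's database to a remote server widens the attack surface considerably, so a common pattern is a small command endpoint on each site, authenticated with a per-site shared secret. A minimal sketch (the header name, action names and secret are hypothetical; hash_equals() needs PHP 5.6+):

Code:
<?php
// api.php on each managed site: accepts commands only from the central admin.
// The central server POSTs a JSON payload plus an HMAC of it in a header.
$secret = 'per-site shared secret'; // hypothetical; store it outside the web root

$payload   = file_get_contents('php://input');
$signature = isset($_SERVER['HTTP_X_SIGNATURE']) ? $_SERVER['HTTP_X_SIGNATURE'] : '';

// Reject anything whose signature does not match the payload.
if (!hash_equals(hash_hmac('sha256', $payload, $secret), $signature)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$command = json_decode($payload, true);
switch ($command['action']) { // hypothetical command set
    case 'create_user':
        // ... create the user on this site ...
        break;
    case 'update_content':
        // ... apply the content change ...
        break;
    default:
        header('HTTP/1.1 400 Bad Request');
}
?>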
Hi, I am a PHP programmer, and I need help from a PHP expert to create a PHP application for a large database. I have a database table called "profiles" which contains millions (1.5 to 2 million) of profiles of business companies. This table has 10 fields, and there is one field named "bname", the name of the company, which I made a full-text index for full-text search. Now I have to use this table for profile searching (using full-text search), profiles within particular cities, profiles within particular categories, etc. This table contains millions of records, so it takes a lot of time to search and fetch the record(s). Can anybody help me with how to manage this large table to improve performance and get fast searching with PHP? Is there any other technique (algorithm) to manage a large database (like Facebook, Twitter, Orkut)?

I have a SOAP client written up that works well on small requests, but on large ones I face out-of-memory issues. I want to be able to write the response directly to a file as an XML document, so I can use something like XPath. When I try this, even on small responses, it seems to go into a loop:

Code:
$infile = $client->Retrieve($criteria);
$outfile = fopen('./sites/all/modules/cvent/data.txt', 'w');
while (!feof($infile)) {
    fwrite($outfile, fread($infile, 2048));
}

How can I put the data into a file straight from the response and still be economical on memory?

Hello, I'm trying to find a way to check around 500-600 links to see if they are alive. It works fine for 5-6 links, but once I add more links it just times out. Is there a way I could process this so it does 1 link at a time, or something (see the sketch below)?

Code:
<?php
include("config.php");
$query = "SELECT * FROM `games` WHERE `r_fileserve` <> \"\" LIMIT 500";
$result = mysql_query($query);
while($row=mysql_fetch_assoc($result))
{
    $link_str = file_get_contents("$row[r_fileserve]");
    $pattern = '<input type="hidden" name="download" value="normal"/>';
    preg_match($pattern,$link_str,$match);
    if ($match[0] != null)
    {
        echo "Working <br />";
    }
    else
    {
        echo "File Down <br />";
    }
}
?>
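For the link checker just above: file_get_contents() has no timeout by default, so one dead host can stall the whole run until PHP's execution limit kills it. Using cURL with short, explicit timeouts (and lifting the script's own limit for the batch) keeps things moving; curl_multi_* would go further and check links in parallel. A minimal sketch of the timeout approach:

Code:
<?php
set_time_limit(0); // let the batch run as long as it needs

function link_is_alive($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5s
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // and transferring after 10s
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $body = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // Same liveness test as above: the download form is still on the page.
    return $body !== false && $code == 200
        && strpos($body, 'name="download" value="normal"') !== false;
}
?>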
Hey guys, I need a solution for uploading very large files. As I have found, PHP has some memory limits. Is it even possible to upload files with a size of 4GB?

The script in question works perfectly on my WAMP installation. It is designed to help a computer-challenged historian publish to the web using text CSV files, without her having to use FTP or edit HTML files. The largest data set is about 400KB. The script uses TEXTAREA form input and uploads via POST. Smaller files upload okay. The larger ones fail with a blank screen (empty HTML) and usually only a few hundred bytes missing. Example: a 380KB data POST fails with 367KB stored on the remote host. No error message is saved to the remote host directory. I suspected suhosin.post.max_value_length, as it is set to 64K, but more than four times that amount is being stored on the remote host. Can suhosin.post.max_value_length still be the problem? The remote host is running:

PHP 5.2.5
Apache 2.2.11
Linux O/S

PHPINFO():
max_execution_time - 30
max_input_time - 60
memory_limit - 64M
post_max_size - 16M
upload_max_filesize - 16M
suhosin.post.max_value_length - 65384

Any suggestions much appreciated.

I have an application which takes some time to run, and I would like to take steps to improve execution speed. Sample data is provided as JSON as follows, where the values array has few columns and many rows. My desired outcome is three PHP objects for mean_P51, mean_P55, and max_P56, which all have a reference to the time array as well as their own values array.

Code:
{
    "name": "L2",
    "columns": ["time", "mean_P51", "mean_P55", "max_P56"],
    "values": [
        ["2020-06-13T14:02:02Z", 4.3527550826446255, 5.668302919254657, 0.6175362252066116],
        ["2020-06-13T14:02:12Z", 4.472219604166665, 5.493282520833331, 0.6095558604166668],
        ["2020-06-23T14:02:22Z", 4.332343173277662, 5.477678517745302, 0.6014520167014615],
        ...
        ["2020-06-23T14:02:22Z", 4.272219604166665, 5.468302919254657, 0.6195558604166668]
    ]
}

Originally, I thought it would be more efficient to iterate over the big values array once and process each column on each iteration (I envision myself walking one mile and snapping my fingers three times each foot, versus walking three miles and snapping my fingers once every foot). What I witness, however, is that it is faster to iterate over the big values array multiple times to generate each object, and I've done a few simple tests comparing big loops within little loops to little loops within big loops, with similar results. I guess this makes sense: I am not really stopping when I snap my fingers, and if I did, walking three miles would likely be quicker. Is this expected behavior? If there is too much data, I've needed to use a JSON stream parser and either a generator or iterator. I haven't tested it yet, but I expect a generator would be more efficient than an iterator, and that using PHP's built-in json_decode() and an array would be more efficient than either a generator or iterator. Do I need to test this hypothesis, or is it likely correct? Any other general strategies one should take when working with large datasets? Thanks

Hello All. This was an online test, which I didn't do very well on, and I am looking for guidance on where I went wrong. The idea is you'd have a matrix $A with values -1, 0, or 1 in each position, with M rows and N columns, where M and N can be up to 1000. The code is evaluated $k times, where $k can be up to 1000 as well. Starting at $A[0][0], evaluate the matrix, where -1 means you go down a row and +1 means you go right one column. A 0 value means you continue along your previous direction. After exiting a position, that position's value is multiplied by -1: 1 = (-1); (-1) = 1; 0 = 0. Once you exit the matrix, either on the right or below, you stop and start over. The goal of the code is to return how many times you exited the bottom-right corner, ending up below (not to the right of) the matrix; when your spot is [M], [N+1]. Here is the code I wrote:

Code:
function eval_matrix( $A, $k ) {
    $x = 0;
    $y = 0;
    // dir is x or y for up/down
    $dir = "y";
    $ball_count = 0;
    $maxX = count($A);     // number of rows
    $maxY = count($A[0]);  // number of columns
    // for $k times
    for($i=0; $i<$k; $i++) {
        // while we're within the matrix boundaries
        while($x < $maxX && $y < $maxY) {
            // get the value of our current position
            $mode = $A[$x][$y];
            // evaluate the value 0,1,-1
            switch($mode) {
                case 0:
                    if($dir =="y") {
                        $y++;
                    } else {
                        $x++;
                    }
                    break;
                case -1:
                    // Change position value
                    $A[$x][$y] = 1;
                    $y++;
                    $dir = "y";
                    break;
                case 1:
                    // Change position value
                    $A[$x][$y] = -1;
                    $x++;
                    $dir = "x";
                    break;
            }
        }
        if($x == $maxX-1 && $y == $maxY) {
            $ball_count++;
        }
        $x = 0;
        $y = 0;
        $dir = "y";
    }
    return $ball_count;
}

It may be fairly ugly, but I'm looking for help with how to properly code this type of solution. Any pointers, resources, suggestions are greatly appreciated!
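For what it's worth, two things stand out in the code above: since $A[$x][$y] selects the row with $x, "go down a row" should advance $x rather than $y (the posted cases do the reverse), and the original $A[$x][$y] = -1 line was missing its semicolon. Here is a reworked sketch under my reading of the rules; the initial direction (downward) is an assumption, as the brief doesn't state it:

Code:
<?php
// -1 sends the walk down a row, 1 sends it right a column, 0 keeps the
// current direction; each cell is negated after the walk leaves it.
// Counts how many of the $k runs exit below the bottom-right cell.
function count_bottom_right_exits(array $A, $k)
{
    $rows = count($A);
    $cols = count($A[0]);
    $exits = 0;

    for ($i = 0; $i < $k; $i++) {
        $r = 0;
        $c = 0;
        $down = true; // assumed starting direction

        while ($r < $rows && $c < $cols) {
            $v = $A[$r][$c];
            if ($v === -1) {
                $down = true;
            } elseif ($v === 1) {
                $down = false;
            }
            $A[$r][$c] = -$v; // flip the cell on the way out

            if ($down) {
                $r++; // down one row
            } else {
                $c++; // right one column
            }
        }

        // Exited below the matrix (not off the right edge), in the last column.
        if ($r === $rows && $c === $cols - 1) {
            $exits++;
        }
    }
    return $exits;
}
?>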