PHP - Can't Resize Large Images With Imagecreatefromjpeg()
I can't resize large images with imagecreatefromjpeg().
I can load small 38 KB images fine, but when they get up to around 780 KB or 1.3 MB (with a width of 2500px and proportional height) I get the errors below. I can also upload the same pictures in another script without resizing them (no imagecreatefromjpeg() call) and it works fine. My max file upload size with XAMPP is 128 MB, running PHP 5.

Warning: imagecreatefromjpeg() [function.imagecreatefromjpeg]: gd-jpeg, libjpeg: recoverable error: Corrupt JPEG data: 191 extraneous bytes before marker 0xd9 in C:\xampp\htdocs\ed\phpsol\ch08\work\includes\create_thumb.inc.php on line 35

Warning: imagecreatefromjpeg() [function.imagecreatefromjpeg]: 'C:\xampp\tmp\php9596.tmp' is not a valid JPEG file in C:\xampp\htdocs\ed\phpsol\ch08\work\includes\create_thumb.inc.php on line 35

I'm basically using a switch:

switch ($type) {
    case 1:
        $source = @imagecreatefromgif($original);
        if (!$source) {
            $result = 'Cannot process GIF files. Please use JPEG or PNG.';
        }
        break;
    case 2:
        $source = imagecreatefromjpeg($original); // <---- LINE 35
        break;

where $original is:

$original = $_FILES['image']['tmp_name'];
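The first warning is only a recoverable libjpeg error; GD rejects the file because, by default, it treats such warnings as fatal. One thing worth trying before the decode (a minimal sketch, not a guaranteed fix for a genuinely truncated upload):

<?php
// Tell GD to ignore recoverable libjpeg warnings such as
// "extraneous bytes before marker" (setting available since PHP 5.1.3).
ini_set('gd.jpeg_ignore_warning', 1);

$original = $_FILES['image']['tmp_name'];
$source   = @imagecreatefromjpeg($original);
if (!$source) {
    // The file is genuinely unreadable, not just slightly corrupt.
    $result = 'Cannot process this JPEG file.';
}
?>

If the image still fails, the upload itself may be getting truncated, which no GD setting will repair.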
Similar Tutorials

In the past, whenever I write an image upload script in PHP that needs to generate a thumbnail or a resized version, I have had to make sure the image is a reasonable size before uploading; otherwise you get the old 'allowed memory size exhausted' error. What are my options if I want people to be able to upload a full-size image from their camera, i.e. a 15-20 MB, 4000x3000px image, and then have a thumbnail and something like a 500px-wide version for displaying on the site? The large unaltered original needs to be stored as well, as it will be used for prints. Is this just not possible with PHP, or is it down to needing a dedicated server?
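It is possible in PHP; the catch is that GD decompresses the whole image into RAM, so a 4000x3000 JPEG needs roughly 4000 * 3000 * 4 bytes (about 48 MB) plus overhead, regardless of its file size on disk. A hedged sketch of checking the dimensions before decoding and raising the limit only when needed (the 1.7 overhead factor is an assumption, not an exact figure):

<?php
$file = $_FILES['image']['tmp_name'];

// getimagesize() reads only the header, so it is cheap even for huge files.
list($width, $height) = getimagesize($file);

// Rough estimate: 4 bytes per pixel plus ~70% overhead for GD's bookkeeping.
$needed = (int) ($width * $height * 4 * 1.7);

if ($needed > 128 * 1024 * 1024) {
    exit('Image too large to process on this server.');
}

// Raise the limit just for this request before decoding.
ini_set('memory_limit', '256M');
$src = imagecreatefromjpeg($file);
?>

Store the untouched original with move_uploaded_file() first; only the thumbnail and the 500px version ever need to be decoded.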
I'm hoping to get a little feedback on what you all believe is the best way to handle this efficiently in PHP. I am working on a script that imports a large amount of data from remote feeds; this facilitates the quick deployment of real estate web sites, but it has to download a large number of images to each new site. Assuming for right now that the bottleneck isn't in the method (fsock vs curl vs...), and that for each imported listing we're spending between .89439 and 17.0601 seconds on the image import process alone... what would you suggest for handling this over the space of 100-1000 occurrences? As of right now I have two ideas in mind, both fairly rudimentary in nature. The first idea is to shut the script down every 30-45 seconds, sleep for a second and fire off another asynchronous request to start the script again. The second idea is to fire off a new asynchronous request to run the image imports separately from the main script. This would let the efficient ones clear out rather quickly, while the slower imports would have their own process to run in. The only thing that worries me about this is the fact that 100 of these could be fired off every second. Even assuming half of them complete before the next round are fired off, they would still pile up.

Hi guys, I have a form that uploads a school photo. All I want to do is resize the image to a maximum width of 480px and then obviously constrain the height. Can anyone help me? I've tried to do this myself, but when it comes to image properties I really struggle. Here's my upload code:

<?php
include("includes/connection.php");

// Where the file is going to be placed
$schoolimage = "SchoolImages/";

// This path will be stored in the database as it does not contain the filename
$currentdir = getcwd();
$path = $currentdir . '/' . $schoolimage;

// Get the school id for the image and school linker table
$schoolid = $_POST['schoolid'];

// Get the school name
$query = "SELECT * FROM school WHERE school_id = " . $schoolid;
$result = mysql_query($query) or die("Error getting school details");
$row = mysql_fetch_assoc($result);
$schoolname = $row['name'];

// Use this path to store the path of the file in the database
$filepath = $schoolimage . $schoolname;

// Create the folder if it does not already exist
if (!file_exists('SchoolImages')) {
    if (mkdir('SchoolImages')) {
        echo 'Folder SchoolImages created.';
    } else {
        echo 'Error creating folder SchoolImages';
    }
}

// Create the folder for the school
if (!file_exists($filepath)) {
    if (mkdir($filepath)) {
        echo 'Folder ' . $schoolname . ' created.';
    } else {
        echo 'Error creating folder ' . $schoolname;
    }
}

// Where the file is going to be placed
$target_path = $filepath;

// Add the original filename to our target path. Result is "uploads/filename.extension"
echo $target_path = $target_path . '/' . basename($_FILES['uploadedfile']['name']);

if (move_uploaded_file($_FILES['uploadedfile']['tmp_name'], $target_path)) {
    echo "The file " . basename($_FILES['uploadedfile']['name']) . " has been uploaded";
    $filename = $_FILES['uploadedfile']['name'];

    // Store the filename, path and other criteria in the database
    echo $query = "INSERT INTO image(image_id, name, path) VALUES(0, '$filename', '$filepath')";
    $add = mysql_query($query, $conn) or die("Unable to add the image details to the database");
    $imageid = mysql_insert_id();

    // Link the image to the school
    echo $query = "INSERT INTO image_school(image_id, school_id) VALUES('$imageid', '$schoolid')";
    $add = mysql_query($query, $conn) or die("Unable to add the image details to the database");

    $message = 'Upload Successful';
} else {
    $message = 'There was an error uploading the file, please try again!';
}

// Close the connection to the database
mysql_close($conn);

header("Location: add_school_photo_form.php?message={$message}&schoolid={$schoolid}");
exit();
?>

I'd be eternally grateful. Kind regards, Dean
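For the resize itself, the usual GD approach is to compute the new height from the aspect ratio and use imagecopyresampled(). A minimal sketch, assuming JPEG input and reusing $target_path from the code above; run it after move_uploaded_file() succeeds:

<?php
$maxWidth = 480;

$source = imagecreatefromjpeg($target_path);
$width  = imagesx($source);
$height = imagesy($source);

if ($width > $maxWidth) {
    // Constrain the height to preserve the aspect ratio.
    $newWidth  = $maxWidth;
    $newHeight = (int) round($height * ($maxWidth / $width));

    $resized = imagecreatetruecolor($newWidth, $newHeight);
    imagecopyresampled($resized, $source, 0, 0, 0, 0,
                       $newWidth, $newHeight, $width, $height);

    // Overwrite the stored copy with the resized version.
    imagejpeg($resized, $target_path, 85);
    imagedestroy($resized);
}
imagedestroy($source);
?>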
Hey, I was wondering if there is a way to use PHP to resize images when they are above a size limit? I'm not referring to file size here, but rather the width and height. Currently I upload images and echo them with:

style="width:250px;height:200px;"

But this means my server is sending the full large images when I would rather resize them down upon upload to reduce loading times... Is this possible in PHP? (The imagecopyresampled() sketch above covers exactly this resize-on-upload case.)

After working with the sample imagecreatefromjpeg() code provided in the PHP manual, I successfully got a result (after clearing my cache) from imagecreatefromjpeg($im, $file);. I've gotten a good education navigating this function over the past week, and I loaded it with echo messages to give me insight. Everything was going fine, and then this ONE test image came along. Apparently the image (which is as good a JPEG as I can find) fails the if (!$im) test. When I used echo $im; I discovered that when images pass through the function, they receive a "Resource" name. Images that fail are NOT named. This image gets a Resource name, yet fails. Is there a problem with my logic? A problem with the image? What would cause this? How can I verify?

I have the following code:

$filename = "Rainbow-code-1_blck.jpg";
if (file_exists(sfConfig::get('sf_upload_dir') . '/rainbowcode/images/profilepics/' . $filename)) {
    echo "file found";
    $source = imagecreatefromjpeg($filename);
}

but I get the following warning:

Warning: imagecreatefromjpeg(Rainbow-code-1_blck.jpg): failed to open stream: No such file or directory in /home/helloises/traffic_2/phoenix/plugins/rainbowCodePlugin/lib/model/RcProfileTable.php on line 168

I don't understand: my file does exist... and it does go into the if, because the echo executes. Please help, I am new to this and have no idea how to fix it. Thank you.
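The mismatch here is that file_exists() checks the full path, while imagecreatefromjpeg() is handed only the bare filename, so it looks in the current working directory instead. Building the path once and using it for both calls should resolve it (a sketch, assuming the same Symfony sfConfig helper):

<?php
$filename = "Rainbow-code-1_blck.jpg";

// Build the full path once and reuse it for both the check and the load.
$fullpath = sfConfig::get('sf_upload_dir') . '/rainbowcode/images/profilepics/' . $filename;

if (file_exists($fullpath)) {
    echo "file found";
    $source = imagecreatefromjpeg($fullpath);
}
?>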
Hi, before posting this problem I already tried googling and trying the possible solutions that might help, but they all failed. I am having a problem with this PHP function:

imagecreatefromjpeg(): gd-jpeg: JPEG library reports unrecoverable error

Based on what I found on Google, some say to use ini_set("gd.jpeg_ignore_warning", 1); - that failed. Some say to use ini_set("memory_limit", -1) - that still failed. I am sure that I am passing a .jpg image, so how come I keep on getting that error? Any suggestions on what to do?
Hi guys, I need some info. I am doing an upload page for pictures that crops images to two different sizes, which works fine with small images. However, with some larger images it doesn't work (locally it works, but not on my hosting provider). I can upload the file (3481 x 2346 at 300 dpi) with no problem, but it never actually executes imagecreatefromjpeg($myUploadedFile). Are there any issues with large files? I've been doing some research but can't seem to find any answer. The obvious idea is to have people upload photos without asking them to resize them first. Thanks

Hey guys, the script I use to generate images works, more or less, almost flawlessly. However, I keep hitting a problem at random when it comes time to fetch a JPEG or PNG file from an external server. A lot of the time it will work fine, but many other times it comes back with the error:

imagecreatefromjpeg() [function.imagecreatefromjpeg]: Cannot read image data

which causes the script to fail. Right now the images it says it cannot read are these:

http://tiles.xbox.com/tiles/UT/EF/1mdsb2JgbA9ECgQLGwMfWSkgL2ljb24vMC84MDAwIAABAAAAAPkqMU4=.jpg
http://tiles.xbox.com/tiles/Au/lM/1Wdsb2JgbA9ECgUAGwEfL1hTL2ljb24vMC84MDAwIAABAAAAAPpj6R0=.jpg
http://tiles.xbox.com/tiles/6q/kv/1Gdsb2JgbA9ECgUAGwEfL1hSL2ljb24vMC84MDAwIAABAAAAAPsAqfU=.jpg
http://tiles.xbox.com/tiles/tQ/UG/1Gdsb2JgbA9ECgUAGwEfV1gmL2ljb24vMC84MDAwIAABAAAAAPspBao=.jpg
http://tiles.xbox.com/tiles/qp/Fx/0Wdsb2JgbA9ECgQNGwEfVitXL2ljb24vMC84MDAwIAABAAAAAP5ekbU=.jpg

and as you can see, they work fine in a browser. So what's going on here? Is Microsoft somehow blocking the attempt? I can view the images fine in the browser, but sometimes it just won't work in the script; and like I said, it's not every time. The lines the errors come up on are these:

$lastxboxgames = imagecreatefromjpeg($lastxboxgames);
$lastxboxgames1 = imagecreatefromjpeg($lastxboxgames1);
$lastxboxgames2 = imagecreatefromjpeg($lastxboxgames2);
$lastxboxgames3 = imagecreatefromjpeg($lastxboxgames3);
$lastxboxgames4 = imagecreatefromjpeg($lastxboxgames4);

I also have the script echo the variables back to me while it's running, and those URLs above came exactly from the script, so it's not that it's getting the wrong URL; it just decides the image isn't good enough. The script generates my signature image, and when any of the images from the Xbox server return an error, they all do, including the avatar image in the top left, which is a PNG. Any help is appreciated!
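When imagecreatefromjpeg() is given a URL, a transient network failure and a corrupt image look identical. Separating the download from the decode makes the failure visible and retryable; a sketch with a hypothetical helper (imagecreatefromstring() also handles PNG, so one code path covers both):

<?php
// Hypothetical helper: fetch a remote image with retries,
// then decode it from the raw bytes.
function loadRemoteImage($url, $retries = 2)
{
    for ($i = 0; $i <= $retries; $i++) {
        $data = @file_get_contents($url);
        if ($data !== false) {
            $im = @imagecreatefromstring($data);
            if ($im !== false) {
                return $im;
            }
        }
        usleep(250000); // brief pause before retrying
    }
    return false;
}

$lastxboxgames = loadRemoteImage($lastxboxgamesUrl);
if (!$lastxboxgames) {
    // Fall back to a local placeholder tile instead of failing the whole image.
    $lastxboxgames = imagecreatefromjpeg('placeholder.jpg');
}
?>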
I have code written for image uploading, but it doesn't allow multiple images in a single upload, and it doesn't resize. Anyone willing to share a good upload script that will do the following?
- Allow multiple image uploads (10+ per submission),
- Resize images on upload, and
- Rename images.
Thanks, Brett

Hi all. So... I am creating an import script for putting contacts into a database. The script we had worked OK for 500 KB / 20,000-row CSV files, but with anything much bigger than that it started to run into the max execution limit. Rather than raise the limit, I want to create something that will run in the background and work as efficiently as possible.

So basically the CSV file is uploaded; then you choose whether duplicates should be ignored or overwritten, and you match up the fields in the CSV (the first line being a field-title row) to the fields in the database. The field for the email address is singled out, as it is checked against duplicates that already exist in the system. The script then saves these values, along with the filename, and puts it all into an import queue table, which is processed by a cron job. Each batch of the cron job looks in the queue, finds the first incomplete import, then starts work on that file from where it left off last time. When the batch completes, it updates the row to give a pointer into the file for the next batch, and records how many contacts were imported and how many duplicates there were.

So far so good, but the duplicate checking massively slows down the script. I can run 1000 lines of the file in 0.04 seconds without checking, but with checking that increases to 14-15 seconds, and it gets longer the more contacts are in the db. For every line it tries to import, it runs a SELECT query on the contact table, and although I am not doing SELECT *, it still adds up to a lot of DB activity. One thought was to load every email address from the contacts table into an array beforehand, but this table could be massive, so that's likely to be just as inefficient. Any ideas on optimising this process?
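One way to eliminate the per-line SELECT is to let MySQL detect the duplicates itself: add a unique index on the email column, then INSERT IGNORE skips clashes and INSERT ... ON DUPLICATE KEY UPDATE overwrites them. A sketch in the document's own mysql_* style; the contacts/email/name identifiers are assumed names:

<?php
// One-off schema change (run once, not per import):
//   ALTER TABLE contacts ADD UNIQUE INDEX idx_email (email);

// "Ignore duplicates" mode: the insert is silently skipped on a clash.
$sql = "INSERT IGNORE INTO contacts (email, name)
        VALUES ('" . mysql_real_escape_string($email) . "',
                '" . mysql_real_escape_string($name) . "')";
mysql_query($sql) or die(mysql_error());

// mysql_affected_rows() is 0 when the row was a duplicate,
// so the duplicate counter comes for free.
if (mysql_affected_rows() == 0) {
    $duplicates++;
}
?>

For "overwrite" mode, swap in INSERT ... ON DUPLICATE KEY UPDATE, which replaces a SELECT plus UPDATE pair with a single statement. Batching a few hundred rows per INSERT cuts round trips further.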
Hi guys, I am currently receiving a large text file (> 500 MB) once per week, which I have been manually splitting and then processing to obtain the required CSV files. However, this takes in the region of 2 to 3 hours. Very soon these files will be sent daily, and I really don't have the time to split and process them every day. I have been trying for a while to parse everything properly/automatically with fopen, feof and fgets (and the other 'f' functions), but the script never seems to read the file all the way to the end; I assume this is due to memory usage. The data in the file follows a strict pattern throughout, which is:

BSNY990141112271112270100000 POO2C35 122354000 DMUS 075 O
BX NTY
LOLANCSTR 1132 11322 TB
LIMORCMSJ 1135 00000000
LICRNFNJN 1140 00000000 H
LICRNF 1141H1142H 11421142 T
LISDAL 1147H1148H 11481148 T
LIARNSIDE 1152H1153 11531153 T
LIGOVS 1158 1159 11581159 T
LIKTBK 1202 1202H 12021202 T
LICARK 1206 1207 12061207 T
LIULVRSTN 1214H1215H 12151215 T
LIDALTON 1223 1223H 12231223 T
LIDALTONJ 1225 00000000
LIROOSE 1229 1229H 12291229 T 2
LTBAROW 1237 12391 TF

That is just one record of information (one of around 140,000 records). Each record has no fixed number of lines, but each line within a record is fixed at 80 characters, and all lines in a record need to carry the same unique ID; at present I'm using an MD5 hash of microtime for that. The first line of every record starts with 'BS' and the last line of each record starts with 'LT', terminating with 'TF'. Everything in between also follows a set pattern, which I can break down effectively. The record above shows one train service schedule, hence why each line in a record needs the same unique ID. Anyone got any ideas on how I could process such a file effectively? Many thanks, Dave
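Reading line by line with fgets() keeps memory flat no matter how big the file is, as long as nothing accumulates per record. A sketch of the BS...LT record grouping (the filenames and CSV output format are assumptions):

<?php
$in  = fopen('schedule.txt', 'rb');
$out = fopen('schedule.csv', 'wb');
$recordId = null;

while (($line = fgets($in)) !== false) {
    $type = substr($line, 0, 2);

    if ($type === 'BS') {
        // New record: generate the shared unique id once per record.
        $recordId = md5(uniqid('', true));
    }

    // Tag every line of the record with the same id and write it out
    // immediately, so nothing is held in memory.
    fputcsv($out, array($recordId, $type, rtrim($line)));

    if ($type === 'LT') {
        $recordId = null; // record finished
    }
}
fclose($in);
fclose($out);
?>

If the script still dies partway through, the culprit is more likely max_execution_time than memory; a pure fgets() loop uses only a few hundred KB.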
The script in question works perfectly on my WAMP installation. It is designed to help a computer-challenged historian publish to the web using text CSV files, without her having to use FTP or edit HTML files. The largest data set is about 400 KB. The script uses TEXTAREA form input and uploads via POST. Smaller files upload okay. The larger ones fail with a blank screen (empty HTML) and usually only a few hundred bytes missing. Example: a 380 KB data POST fails with 367 KB stored on the remote host. No error message is saved to the remote host directory. I suspected suhosin.post.max_value_length, as it is set to 64K, but more than four times that amount is being stored on the remote host. Can suhosin.post.max_value_length still be the problem? The remote host is running:

PHP 5.2.5
Apache 2.2.11
Linux O/S

PHPINFO():
max_execution_time - 30
max_input_time - 60
memory_limit - 64M
post_max_size - 16M
upload_max_filesize - 16M
suhosin.post.max_value_length - 65384

Any suggestions much appreciated.

Hey everybody, I have a script that I think should be working, but it's not... go figure. Here's the snippet that is causing an epic fail:

<?php
foreach ($Array as $key => $value) {
    $$value = $key;
}
$checkVal = 'someValue';
$output = isset($$checkVal) ? TRUE : FALSE;
?>

As you can see, it basically sets the value of each array element as a variable variable and then checks against an input word. If the input word matches the name of a set variable, we can then assume the word was in the array and return TRUE. Pretty straightforward, and I've tried about three different approaches to this, including in_array() and flipping the array and checking isset(array['value']). The array being checked against is usually upwards of 15,000 elements. I would appreciate any knowledge that helps me understand the issues in searching large arrays and good ways to get around them, or if it's just an error in my coding/logic, let me know! Thank you all in advance. E

I want to allow a user to upload any photo that they might have taken with their camera. I can't get photos with large file sizes to upload. I have changed the settings in php5.ini and set them extremely high; this has always worked for me before. I have also changed the code on the form:

<input type="hidden" name="MAX_FILE_SIZE" value="99000000" />

Here is the php5.ini:

register_globals = on
allow_url_fopen = on
expose_php = Off
max_input_time = 500
variables_order = "EGPCS"
extension_dir = ./
upload_tmp_dir = /tmp
precision = 12
SMTP = relay-hosting.secureserver.net
url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=,fieldset="
[Zend]
zend_extension=/usr/local/zo/ZendExtensionManager.so
zend_extension=/usr/local/zo/4_3/ZendOptimizer.so
register_long_arrays = on
max_file_uploads = 8M
post_max_size = 8M

Maybe the problem is not in the php5.ini?

Hey guys, I need a solution for uploading very large files. As I found, PHP has some memory limits. Is it even possible to upload files with a size of 4 GB?

I have a flash application that talks to upload.php. Say I upload a 500 MB file; it will obviously take a little while to upload. Will the max_execution_time setting cause this to fail? It's set at 60 right now, and the upload is obviously taking longer than one minute.

I have a SOAP client written up that works well on small requests, but on large ones I face out-of-memory issues. I want to be able to write the response directly to a file as an XML document, so I can use something like XPath. When I try this, even on small responses it seems to go into a loop:

$infile = $client->Retrieve($criteria);
$outfile = fopen('./sites/all/modules/cvent/data.txt', 'w');
while (!feof($infile)) {
    fwrite($outfile, fread($infile, 2048));
}

How can I put the data into a file straight from the response and still be economical on memory?

I need to process a large CSV file (40 MB - 300,000 rows). While I have been working with smaller files, my existing code is not able to work with large ones. All I need to do is read a particular column from the file, then count the total number of rows and add up all the values from that column. My existing code imports the whole CSV file into an array (a class is used) and then, using a foreach loop, reads the required column and values into another array. Once the data is in this array I can simply sum or count it. While this served me well for smaller files, I am not able to use this approach to read a larger file. I have already increased the memory allocated to PHP and max_execution_time, but the script just keeps on running. I am no PHP expert but usually get around with trial and error... your thoughts and help will be greatly appreciated. (The class used is freely available, known as 'parsecsv', available at http://code.google.com/p/parsecsv-for-php/.) After calling the class and processing the CSV file:

<?php
ini_set('max_execution_time', 3000);
$init = array(0); // Initialize a dummy array, which holds the value '0'

// $csv->data is the array produced by the class; it holds the csv data
foreach ($csv->data as $key => $col) {
    $SAL = $col['SALARY'];   // the column that you want to process from the csv
    array_push($init, $SAL); // push the value into the dummy array created above
}

$total_rows = count($init) - 1; // '-1' to discount the initial dummy value
echo "Total # of rows: " . $total_rows . "\n";
echo "Total Sum: " . array_sum($init) . "\n";
?>
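parsecsv loads the whole file into $csv->data, which is what exhausts memory. For a sum and a row count you never need the whole file at once; fgetcsv() reads one row at a time. A sketch (the filename, the header row and the SALARY column name are taken from the code above; adjust to the real file):

<?php
$handle = fopen('data.csv', 'r');

// Read the header row once to find the SALARY column's position.
$header = fgetcsv($handle);
$salaryIdx = array_search('SALARY', $header);

$totalRows = 0;
$totalSum  = 0;

while (($row = fgetcsv($handle)) !== false) {
    $totalRows++;
    $totalSum += (float) $row[$salaryIdx];
}
fclose($handle);

echo "Total # of rows: $totalRows\n";
echo "Total Sum: $totalSum\n";
?>

Memory use stays constant regardless of file size, so the 40 MB file needs no memory_limit changes.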
Hi guys,
Been a while since I've been on here; I completely lost the login details for my old account, which is a shame because it was a well-established one, but hey ho, that's life.
Anyhow, I'm starting a project tomorrow and I was wondering if I could get some advice on the best methods/routes to take.
The Brief
+ 3 full websites
+ no individual admin panel on any site
+ one main central admin
The Plan
Once all 3 websites are complete, none of them will have the standard admin panel that the average website has. All 3 websites (and this number is going to grow in time, too) will be run from one central admin held on a separate, isolated server, which in essence will remotely administrate the websites connected to it.
The Question
I have never done a project where multiple websites run from a central admin. My questions, if anyone could shed some light, are these:
Should the admin panel administer the selected website by sending commands through an API (REST/CRUD endpoints), or simply by direct database access? (See the sketch after this list.)
What is the most secure way of building this central admin (bear in mind the admin panel will have a minimum of 10 different admin levels/permission sets)?
How should I lay out the databases for this (3 completely separate sites, a number that will grow, plus a central admin site)?
Last question (that I can think of): how much of a mammoth task is this, really?
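If the API route is chosen, one common pattern is for the central admin to sign each command with a per-site shared secret (HMAC), so a site only acts on requests that are provably from the admin server. A rough sketch, not a full design; the endpoint name, command shape and secret handling are all illustrative:

<?php
// On the central admin server: send a signed command to one site.
function sendCommand($siteUrl, $sharedSecret, array $command)
{
    $body = json_encode($command);
    $signature = hash_hmac('sha256', $body, $sharedSecret);

    $context = stream_context_create(array('http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/json\r\n" .
                     "X-Signature: $signature\r\n",
        'content' => $body,
    )));
    return file_get_contents($siteUrl . '/admin-api.php', false, $context);
}

// On each website (admin-api.php): verify before acting.
$body = file_get_contents('php://input');
$expected = hash_hmac('sha256', $body, $sharedSecret);
// hash_equals() is PHP 5.6+; use a constant-time compare on older versions.
if (!hash_equals($expected, $_SERVER['HTTP_X_SIGNATURE'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$command = json_decode($body, true);
// ...dispatch $command according to the caller's permission level...
?>

Direct DB access from the admin server can work too, but it means exposing every site's database over the network; an API keeps each site's credentials on its own box and gives you one place to enforce the permission sets.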
Any help would be greatly appreciated!
Thanks guys
James