PHP - Take a Screenshot of a Webpage Using PHP
Hi Guys,
Yet again, I was wondering if one of you geniuses could help me. I want to be able to take a screenshot of an entire webpage (from the header to the footer) using a PHP script. So if I crawled the webpage URL http://www.google.co.uk/search?q=php+sc ... =en&num=10, I would like to take a screenshot of that entire page, from the Google header to the Google footer, store it in a TEMPORARY folder on my server, and then be able to call it back later. Is this possible? I've been searching everywhere for a solution, but the closest thing I found was http://mistonline.in/wp/get-youtube-vid ... avascript/. I would be really grateful if one of you could please help me. Thank you in advance. M

Similar Tutorials

Hi all... I am facing problems taking a screenshot of the currently open page. What I want is that by clicking on a link like "Click to take a screenshot", a screenshot of that page gets generated, which can then be saved. I have tried using "imagegrabscreen" and "imagegrabwindow", but both of them produce a big black PNG image. I also searched the forums but could not find the right answer. If any expert is out there, then please do help me out. Thank you.

Hello there! I'm looking for a script that generates website screenshots. I want to run the script on my own server because I think it will work faster than using services like webshotspro.com, and it will have no restriction on size. I need a script that can cache the screenshot and refresh it every 2 weeks, for example. I hope you can help me with this. Thank you in advance!

I have the content of an email in a database, and I need to generate a screenshot of the email based on that. I'm thinking this won't be possible in PHP and I'll have to use some service? Anyone know where to begin?

Without using FFMPEG, that is, since my host refuses to allow it. I really don't want to change hosts just for this one thing, I'm happy otherwise, but I need this to work. I just need to grab a frame from a video when the user uploads the video. I can't use FFMPEG like I said, but I saw a suggestion someone had that said "have PHP open the video in fullscreen and take a screencap". That would work perfectly, I just don't have a clue how to do it. Does anyone have experience doing this? I just want it to grab a screenshot and have it saved as a JPG with the same filename as the video, for obvious reasons: call the video player and pass it the filename with .jpg attached for the screenshot parameter. Any help is appreciated! Thanks.

Code:
try {
    echo "<br>";
    foreach ($dbh->query("SELECT * FROM test_shot WHERE sold=1 ORDER BY year ASC") as $row) {
        if ($row['picture'] != "" && $row['picture'] != null) {
            echo "<div class='image-holder'><img src='" . $row['picture'] . "' width='300' /><br>";
        }
        if ($row['year'] != "" && $row['year'] != null) {
            echo $row['year'];
        }
        if ($row['description'] != "" && $row['description'] != null) {
            echo $row['description'];
        }
        if ($row['sold'] == 1) {
            echo "<img src='images/sold1.png'><br>"; // Add your image code here
        } elseif ($row['sold'] == 0) {
            echo "</div><br>";
        }
    }
} catch (PDOException $e) {
    print $e->getMessage();
}
?>
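Going back to the original screenshot question (and the caching request in the posts above): PHP on a typical Linux host cannot render a page by itself, which is also why imagegrabscreen() and imagegrabwindow() come back black there; they only capture anything on a Windows server where the Apache service is allowed to interact with the desktop. One workable route is to shell out to an external renderer. The following is only a minimal sketch, not a drop-in solution: it assumes the wkhtmltoimage binary is installed on the server, and the tmp_shots/ folder name and the 14-day cache window are placeholders.

Code:
<?php
// Minimal sketch: full-page screenshot of a URL, cached on disk.
// ASSUMPTIONS: the wkhtmltoimage binary is installed and on the PATH,
// and tmp_shots/ exists and is writable by the web server.
function cached_screenshot($url, $maxAgeDays = 14)
{
    $file = 'tmp_shots/' . md5($url) . '.png';

    // Reuse the cached image if it is newer than $maxAgeDays.
    if (file_exists($file) && (time() - filemtime($file)) < $maxAgeDays * 86400) {
        return $file;
    }

    // Render the whole page, header to footer, into a PNG.
    $cmd = 'wkhtmltoimage ' . escapeshellarg($url) . ' ' . escapeshellarg($file);
    exec($cmd, $output, $status);

    return $status === 0 ? $file : false;
}

$shot = cached_screenshot('http://www.google.co.uk/');
if ($shot !== false) {
    echo '<img src="' . htmlspecialchars($shot) . '" alt="page screenshot">';
}

If you cannot install a binary at all, a hosted thumbnail service is usually the fallback; there is no pure-PHP way to rasterise a modern page.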
Hi. I found this code for my website, and it works well. It's just that it outputs some text I want to delete.

Code:
<?php
/* index.php */
session_start();
if ($_SESSION['login'] == false) {
    header("Location: login.php");
}

$a = $_GET['a'];

$source = 'http://****.elementfx.com/test.php';
//$source = 'sample.txt';
$page_all = file_get_contents($source);

$div_array = array();
preg_match_all('#<div id="intro">(.*?)</div>#sim', $page_all, $div_array);
//print_r($div_array);
?>
<html>
<head>
<title>Home</title>
</head>
<body>
<center>
<p><b><font color="blue" size="20">*****</font></b> <font color="blue" size="2">version 0.9_01</font></p>
<br/>
<br/>
<br/>
<textarea cols="50" rows="10"><?php print_r($div_array[1]); ?></textarea>
</center>
</body>
</html>

The text it should get is:

Quote
Hello I'm Something!
<p>asdoasduiasdasnda</p>
asdasdaksdjas<br/>
sdffdsg

But the output is:

Quote
Array ( [0] => Hello I'm Something! <p>asdoasduiasdasnda</p> asdasdaksdjas<br/> sdffdsg )

I need to get rid of the Array ( ... thing.

Regards
Worqy

Hi all... I am looking for a way of saving the currently open webpage as a PDF file. I searched for that and even went to http://www.tcpdf.org, but there I also didn't find any perfect way of doing it. If somebody else has tried TCPDF for that, or done something like that, then please do help me. Thanks.

Just need a correction on this please; there is a parse error, I think I got the quote marks wrong somewhere?

echo "<li>" . <a href="/stock/$stock.htm"> .$Stock . ": " . $Name . "</a></li>\n";

Thanks

So we just started PHP after learning a month of HTML, and we need to create a website with a page that allows the admin to change the colors, font size, header, etc. on the webpage itself via forms. E.g. I'm on the site and I can use a drop-down menu to change the background to "blue", for example. No need to go edit the CSS file or anything. How can this be done? My second and final question is: how would I write code that allows my website to have users "log in" to their account so they can edit their page? Thanks a lot, I'm not good at PHP yet, I've just started, so please keep that in mind :) - Kranti

Hi, I want to save a webpage into my server location, as simple as that. At present, I am reading the content of the file using cURL, saving that content into a text file, and then saving that text file. I assume there must be some straightforward way to just save www.example.com/file.html into my server directory as file.html. Can anyone help me on this, please? Thanks, -Abd

Hey, again I have a problem: I need info from another webpage.

Code:
<?php
// get as ?
$url = $_GET['url'];

// value of script
$website = $url;
$referer = 'olar.eu';
$useragent = 'Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)';
$filename = basename($url);

// connect
$curl_handle = curl_init();
curl_setopt($curl_handle, CURLOPT_USERAGENT, $useragent);
curl_setopt($curl_handle, CURLOPT_AUTOREFERER, $referer);
curl_setopt($curl_handle, CURLOPT_URL, $website);
curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
$source = curl_exec($curl_handle);
curl_close($curl_handle);

$a = preg_match("/>Level:.*.\n.*.>([0-9]*)</", $source, $b);
$Level = $b[1];
echo "Level: $Level";
?>

I get only an empty value. url=http://www.gamersfirst.com/warrock/?q=Player&nickname=admin-b13r

I've used file_get_contents() before, but it seems like it takes a long time to load... What is the best or most efficient way to get certain content from a webpage? Like all the posts on a single forum page, for example, without loading any images or styles, just the "source".

There are different methods to catch links in <a> tags from a webpage, but surprisingly, with none of them was I able to catch all the links from, e.g., http://www.amazon.com/Notebooks-Laptop-Computers/b/ref=sa_menu_lapnet6?ie=UTF8&node=565108 All links are captured except a series in the main area, e.g.:

http://www.amazon.com/s/ref=amb_link_85318851_3?ie=UTF8&node=565108&field-availability=-1&brand=acer&emi=ATVPDKIKX0DER&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-6&pf_rd_r=1HET3VVW5SXMJ0SRKJVM&pf_rd_t=101&pf_rd_p=1291722382&pf_rd_i=565108
http://www.amazon.com/s/ref=amb_link_85318851_22?ie=UTF8&node=1232596011&brand=samsung&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-6&pf_rd_r=1HET3VVW5SXMJ0SRKJVM&pf_rd_t=101&pf_rd_p=1291722382&pf_rd_i=565108

Is there some way of checking the page name in PHP? For example, home.php. Also, on a side question, does anyone know how to autoplay an .avi video without showing anything but the actual video? (I mean I don't want to see the progress bar or the play, stop, pause buttons, etc.) All help would be great.

Hey all, what's the most efficient way to wait until a page on your own website is done being rendered, and then parse it for something specific? The reason I'm having to scrape it rather than just generate it myself is that the part being scraped is generated in an iframe on my site by another site, and the data inside of it is dynamic. Thanks.

Hello, I'm using the MyBB forum software, and I have another webpage that I want to access with the forum username and password. I don't know how to make it work.
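For Worqy's post above: print_r() dumps the whole array structure, which is where the Array ( ... ) wrapper comes from. The first captured group of the first match lives in $div_array[1][0], so echoing that instead removes the wrapper. A minimal change, assuming the pattern only ever matches once:

Code:
<textarea cols="50" rows="10"><?php
// Print just the first captured <div id="intro"> body, not the array dump.
echo isset($div_array[1][0]) ? $div_array[1][0] : '';
?></textarea>

If the page can contain several intro divs, implode("\n", $div_array[1]) joins all the captures without the Array ( ) wrapper.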
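For the parse-error post: the <a href=...> part sits outside the double quotes, so PHP tries to read it as code. Keeping the whole tag inside one quoted string (and escaping the inner double quotes) fixes it; this assumes $stock, $Stock and $Name are already defined:

Code:
echo "<li><a href=\"/stock/$stock.htm\">" . $Stock . ": " . $Name . "</a></li>\n";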
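For Kranti's question about letting an admin change colours from a form: one simple pattern is to keep the choices in a small settings file (or a database row), save the submitted value into it, and echo the stored value into a <style> block on every page. This is only a rough sketch; the settings.json filename, the option list and the missing admin check are all placeholders.

Code:
<?php
// Sketch: admin picks the background colour from a drop-down.
// ASSUMPTIONS: settings.json is writable; an "is this the admin?" check is omitted.
$settingsFile = __DIR__ . '/settings.json';
$settings = file_exists($settingsFile)
    ? json_decode(file_get_contents($settingsFile), true)
    : array('background' => 'white');

// Save the new choice when the form is submitted.
if (isset($_POST['background'])) {
    $settings['background'] = $_POST['background'];
    file_put_contents($settingsFile, json_encode($settings));
}
?>
<style>
body { background: <?php echo htmlspecialchars($settings['background']); ?>; }
</style>
<form method="post">
    <select name="background">
        <option value="white">White</option>
        <option value="blue">Blue</option>
        <option value="lightgrey">Light grey</option>
    </select>
    <input type="submit" value="Save">
</form>

The second half of the question, users logging in to edit their own page, is the usual session pattern: a users table, a login form that verifies the password, and the user id kept in $_SESSION deciding which rows that user may edit.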
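On pulling content or links out of a remote page (the get-certain-content question and the Amazon links question): file_get_contents() or cURL only downloads the HTML source, never the images or stylesheets, so one request is the whole cost; saving that source to disk is just file_put_contents('file.html', $html), which also covers Abd's post. For collecting every <a> href from the downloaded source, DOMDocument tends to be more reliable than a regex. A small sketch, using the Amazon URL from the post as the example:

Code:
<?php
// Sketch: fetch a page once and list every href found in its <a> tags.
$html = file_get_contents('http://www.amazon.com/Notebooks-Laptop-Computers/b/ref=sa_menu_lapnet6?ie=UTF8&node=565108');

$dom = new DOMDocument();
libxml_use_internal_errors(true);   // real-world HTML is rarely valid XML
$dom->loadHTML($html);
libxml_clear_errors();

$links = array();
foreach ($dom->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    if ($href !== '') {
        $links[] = $href;
    }
}

print_r($links);

Links that Amazon injects with JavaScript after the page loads will still be missing, because they are never part of the HTML that file_get_contents() receives; that is the most likely reason the centre-area links could not be captured by any method.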
I have a particular PHP file which is publicly located; however, I don't want anyone but me to be able to access it. Below are my thoughts on how to do so. Please comment.
Use an uncommon name, and definitely not index.php.
Either include a file called index.html in the same directory, or set up Apache not to show directory listings using Options -Indexes, or maybe both for good measure.
Require some variable to be set to a given value in either the GET or POST array and, if it is not set, send a 404 header and display the standard "404 Not Found" HTML (a sketch of this appears further down the page).

Hi, I know this is probably something simple, but I have two links to an RSS feed which I need to put up on a web page. Not just a link, but the actual feed rendered with a number of news headlines. One is a ".rss2.XML" link and the other is an HTML link. The HTML one I suppose is static, so probably not any good. Thanks in advance.

I want to convert a webpage from my system into a PDF file using PHP, just like http://www.web2pdfconvert.com/ does. However, I want to have a link that says 'Convert to PDF' which then automatically converts the webpage into a PDF. Is there any way to do this?

I was wondering how I could have a PHP script that would run in a special directory each day to see if there are any new entries in a MySQL database. If there were new entries, it would create a new web directory using the name of the new entry, with a default index.php inside. I have googled around and can't seem to find out how to do it.
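Back to the post about keeping a public PHP file private: the third idea (require a secret value, otherwise behave like a missing page) takes only a few lines at the top of the file. A sketch, where the parameter name k and the secret string are placeholders you would replace:

Code:
<?php
// Pretend this file does not exist unless the caller knows the secret.
// 'k' and 'my-long-random-secret' are placeholders - choose your own.
if (!isset($_GET['k']) || $_GET['k'] !== 'my-long-random-secret') {
    header('HTTP/1.0 404 Not Found');
    echo '<h1>404 Not Found</h1><p>The requested URL was not found on this server.</p>';
    exit;
}

// ... the real script continues below this check ...

HTTP Basic Auth via .htaccess, or a normal session login check, are the more conventional ways to achieve the same thing.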
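For the RSS post: an RSS 2.0 feed is plain XML, so SimpleXML can load it and the headlines can be printed straight into the page. A sketch, assuming the feed URL is reachable and uses the usual channel/item layout (the URL below is a placeholder):

Code:
<?php
// Sketch: render the latest headlines from an RSS 2.0 feed.
$feed = @simplexml_load_file('http://example.com/news.rss2.xml'); // placeholder URL

if ($feed !== false) {
    echo '<ul>';
    $shown = 0;
    foreach ($feed->channel->item as $item) {
        echo '<li><a href="' . htmlspecialchars((string) $item->link) . '">'
           . htmlspecialchars((string) $item->title) . '</a></li>';
        if (++$shown >= 5) {   // show at most 5 headlines
            break;
        }
    }
    echo '</ul>';
}

Caching the feed to a local file and only re-downloading it every few minutes keeps the page fast if the remote feed is slow.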
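And for the last question, creating a directory for every new database entry: a small script run once a day from cron can select the rows that have not been processed yet, mkdir() a folder named after each one, copy a template index.php into it, and mark the row as done. The table and column names (sites, name, has_dir), the credentials and the template path are all assumptions made for the sketch:

Code:
<?php
// Daily cron sketch: make a web directory for every new row.
// ASSUMED schema: sites(id, name, has_dir) - adjust to your own tables.
$dbh = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

foreach ($dbh->query("SELECT id, name FROM sites WHERE has_dir = 0") as $row) {
    // Keep only characters that are safe in a directory name.
    $dir = __DIR__ . '/' . preg_replace('/[^A-Za-z0-9_-]/', '', $row['name']);

    if (!is_dir($dir) && mkdir($dir, 0755)) {
        copy(__DIR__ . '/templates/index.php', $dir . '/index.php'); // template path is assumed
        $upd = $dbh->prepare("UPDATE sites SET has_dir = 1 WHERE id = ?");
        $upd->execute(array($row['id']));
    }
}

Registering it in crontab (for example 0 3 * * * php /path/to/make_dirs.php) runs it once a night instead of on a page load; the path and time are, again, just examples.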