PHP - Proxy Support On Multi Curl
I have an array of proxies and I need one to be chosen at random and applied to each curl handle. This script runs successfully when proxy support is removed, but as soon as I try to add a proxy it refuses to connect. My proxies are good: I have private proxies that validate by confirming the requester's IP address is on a whitelist I configured, and I've also tried working public proxies with no validation, with the same result. The error it returns is "cannot connect to google.com:3238". Why it's trying to connect on that port is beyond me. My proxies do listen on port 3238, but curl should connect to IP:PORT first and then to google.com:80, so I'm not sure why it's mixing the two up. Here's my code:
Code:

function multi($urls, $proxies, $agents, $referer) {
    // create the multi curl handle
    $mh = curl_multi_init();
    $handles = array();
    $i = 0;
    foreach ($urls as $url) {
        // create a new single curl handle
        $ch = curl_init();
        // set several options like url, timeout, returntransfer
        // simulate multithreading by calling the wait.php script and sleeping for $rand seconds
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 1);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_PROXY, $proxies[rand(0, count($proxies))]);
        curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);
        curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookies.txt');
        curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookies.txt');
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
        curl_setopt($ch, CURLOPT_REFERER, $referer);
        curl_setopt($ch, CURLOPT_USERAGENT, $agent);
        // add this handle to the multi handle
        curl_multi_add_handle($mh, $ch);
        // put the handles in an array to loop over later
        $handles[] = $ch;
        $i++;
    }
    // execute the multi handle
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        // usleep for 0.25 seconds to reduce load
        usleep(250000);
    } while ($running > 0);
    echo 'Curl error: ' . curl_error($ch) . "<br />" . print_r(curl_getinfo($ch));
    // get the content of the urls (if there is any)
    for ($i = 0; $i < count($handles); $i++) {
        // get the content of the handle
        $output[] = curl_multi_getcontent($handles[$i]);
        // remove the handle from the multi handle
        curl_multi_remove_handle($mh, $handles[$i]);
    }
    // close the multi curl handle to free system resources
    curl_multi_close($mh);
    // return the output
    return $output;
}

I only post on forums as a last resort. Typically a few Google searches solve the problem, but I've spent too much time researching this one with no progress. I'm assuming it's simply a parameter that needs to be added to the curl handles, but I've tried every proxy-related config option out there and still no dice.
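For reference, two things in the snippet above are worth double-checking independently of the proxies themselves: PHP's rand() is inclusive at both ends, so rand(0, count($proxies)) can return an index one past the end of the array (leaving CURLOPT_PROXY empty), and $agent is never defined inside the loop ($agents is the parameter). A minimal sketch of safer per-handle selection — the proxy and user-agent values below are placeholders:

Code:
<?php
// Hypothetical lists, in the same ip:port format the post describes.
$proxies = array('1.2.3.4:3238', '5.6.7.8:3238');
$agents  = array('Mozilla/5.0 (Windows NT 6.1)', 'Mozilla/5.0 (X11; Linux x86_64)');

$ch = curl_init('http://google.com/');
// array_rand() always returns a valid key, avoiding rand()'s inclusive upper bound.
curl_setopt($ch, CURLOPT_PROXY, $proxies[array_rand($proxies)]);
// equivalent: $proxies[rand(0, count($proxies) - 1)]
curl_setopt($ch, CURLOPT_USERAGENT, $agents[array_rand($agents)]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);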
Similar Tutorials

Good day dear community, I am working on a curl loop to fetch multiple pages. I have some examples and a question. Example: if we want to get information from 3 sites with curl, we can do it like so:

Code:
$list[1] = "http://www.example1.com";
$list[2] = "ftp://example.com";
$list[3] = "http://www.example2.com";

After creating the list of links we initialize the cURL multi handle and add the cURL handles:

Code:
$curlHandle = curl_multi_init();
for ($i = 1; $i <= 3; $i++)
    $curl[$i] = addHandle($curlHandle, $list[$i]);

Now we execute the cURL multi handle and retrieve the content from the sub-handles we added to it:

Code:
ExecHandle($curlHandle);
for ($i = 1; $i <= 3; $i++) {
    $text[$i] = curl_multi_getcontent($curl[$i]);
    echo $text[$i];
}

In the end we release the handles from the cURL multi handle by calling curl_multi_remove_handle and close the cURL multi handle. Now I want to do another fetch of sites with cURL multi, since this is the nicest way to do it. I am just not sure about the string concatenation. How do I do it? Note that I want to fetch several hundred pages from this target server (I have to create a loop over several hundred sites):

siteone.example/?show_subsite=9009
siteone.example/?show_subsite=9742
siteone.example/?show_subsite=9871

...and so on and so forth. Question: how do I apply this loop to the URL array of the curl-multi code? This is the template I want to adapt:

Code:
<?php
/************************************\
 * Multi interface in PHP with curl  *
 * Requires PHP 5.0, Apache 2.0 and  *
 * Curl                              *
 *************************************
 * Written By Cyborg 19671897        *
 * Bugfixed by Jeremy Ellman         *
\************************************/

$urls = array(
    "siteone",
    "sitetwo",
    "sitethree"
);

$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, 1); // return data as string
    curl_setopt($conn[$i], CURLOPT_FOLLOWLOCATION, 1); // follow redirects
    curl_setopt($conn[$i], CURLOPT_MAXREDIRS, 2);      // maximum redirects
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 10); // timeout
    curl_multi_add_handle($mh, $conn[$i]);
}

do {
    $n = curl_multi_exec($mh, $active);
} while ($active);

foreach ($urls as $i => $url) {
    $res[$i] = curl_multi_getcontent($conn[$i]);
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}

curl_multi_close($mh);
print_r($res);
?>

I look forward to your ideas.
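One way to answer the question above: build the URL list with a plain loop before touching curl at all, then feed the finished array into the same multi-handle code. A minimal sketch using the show_subsite pattern from the post (the ID range is made up for illustration):

Code:
<?php
// Hypothetical ID range; adjust to the real subsite IDs.
$urls = array();
for ($id = 9000; $id <= 9300; $id++) {
    $urls[$id] = sprintf("http://siteone.example/?show_subsite=%d", $id);
}

// $urls can now be passed straight into the
// foreach ($urls as $i => $url) loop of the multi-curl example above.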
I need a proxy that would enable me to use curl with another IP address. How do I find a paid proxy server that supports curl?

Hello, I need to use curl with a proxy, but I get an "undefined offset: 1" error. Here is the standard code I use. Please help me with it, thanks:

Code:
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, $header);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
curl_setopt($ch, CURLOPT_REFERER, $referer);
curl_setopt($ch, CURLOPT_USERAGENT, $agent);

Received HTTP code 403 from proxy after CONNECT:

Code:
<?php
function getPage($proxy, $url, $referer, $agent, $header, $timeout) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, $header);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_REFERER, $referer);
    curl_setopt($ch, CURLOPT_USERAGENT, $agent);

    $result['EXE'] = curl_exec($ch);
    $result['INF'] = curl_getinfo($ch);
    $result['ERR'] = curl_error($ch);

    curl_close($ch);
    return $result;
}

$result = getPage(
    '89.106.13.93:80', // use valid proxy
    'http://www.northplanet.co.uk',
    'http://www.youtexv.com/',
    'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.8',
    1,
    5);

print_r($result);
?>

I've been looking into methods of scraping data from pages and have found several examples using multi-curl to achieve this. I am not used to curl and am not completely sure how it works, and I need to find the fastest reliable method of getting the content of a number of pages (about 160; I do need all, or close to all, of the pages every run). Here is an example I found searching the web, which I managed to implement:

Code:
<?php
/**
 * @param $picsArr Array [0]=> [url],
 * @$picsArr Array will be filled with the image data; you can use the data as you want or just save it in the next step.
 **/
function getAllPics(&$picsArr) {
    if (count($picsArr) <= 0) return false;

    $hArr = array(); // handle array
    foreach ($picsArr as $k => $pic) {
        $h = curl_init();
        curl_setopt($h, CURLOPT_URL, $pic['url']);
        curl_setopt($h, CURLOPT_HEADER, 0);
        curl_setopt($h, CURLOPT_RETURNTRANSFER, 1); // return the image value
        array_push($hArr, $h);
    }

    $mh = curl_multi_init();
    foreach ($hArr as $k => $h) curl_multi_add_handle($mh, $h);

    $running = null;
    do {
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    // get the result and save it in the result ARRAY
    foreach ($hArr as $k => $h) {
        $picsArr[$k]['data'] = curl_multi_getcontent($h);
    }

    // close all the connections
    foreach ($hArr as $k => $h) {
        $info = curl_getinfo($h);
        preg_match("/^image\/(.*)$/", $info['content_type'], $matches);
        echo $tail = $matches[1];
        curl_multi_remove_handle($mh, $h);
    }
    curl_multi_close($mh);

    return true;
}
?>

Since time is critical in my script, I would ask whether you think this is a good implementation, or whether you can point me in the direction of one that will save me noticeable run-time.
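On the run-time question: the pattern above spins in a tight loop calling curl_multi_exec, which burns CPU without making the downloads finish any sooner. The usual refinement is to block in curl_multi_select() until one of the transfers has activity. A minimal sketch of that wait loop, as a drop-in for the do/while above (it assumes $mh already has the handles added):

Code:
<?php
$running = null;
do {
    $status = curl_multi_exec($mh, $running);
} while ($status === CURLM_CALL_MULTI_PERFORM);

while ($running > 0 && $status === CURLM_OK) {
    // block until at least one transfer has activity (1 second cap)
    if (curl_multi_select($mh, 1.0) === -1) {
        usleep(100000); // select failed; back off briefly instead of spinning
    }
    do {
        $status = curl_multi_exec($mh, $running);
    } while ($status === CURLM_CALL_MULTI_PERFORM);
}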
Hey guys, recently I bumped into a little coding problem. I'm trying to submit information to a multipart form (including an image upload). The receiving server couldn't read my input and figured I had a nil object somewhere (it's Rails). First I thought the problem was in my curl setopts, but I think they are fine; it looks like the problem is with the data I'm submitting. I read a bit about multipart posting, and every article I read advised building a postfield array, which curl would interpret and send out correctly. I did, and this was my resulting array:

Code:
$postfield = array(
    "social[comments]" => "Test",
    "avatar_file" => "",
    "friend_avatar_file" => "",
    "forum_avatar_file" => "",
    "social[signature]" => "",
    "forum_signature_file" => "",
    "save_profile.x" => "25",
    "save_profile.y" => "9");

That failed. So I tried making it into a URL-encoded string:

Code:
$postfield = "social%5Bcomments%5D=".stripslashes($_POST['textarea'])."&avatar_file=&friend_avatar_file=&forum_avatar_file=".$file."&social%5Bsignature%5D=&forum_signature_file=&save_profile.x=25&save_profile.y=9";

This finally worked, and I was able to submit the info I wanted. This time, though, I'd like to send a file with it, and I'd need to put this into a postfield array for that to work:

Code:
if (!$file = upload()) {
    $file = '';
} else {
    $file = '@' . $file;
}

function upload() {
    if (isset($_FILES['file']) && !empty($_FILES['file']['name'])) {
        $uploaddir = "temp";
        if (is_uploaded_file($_FILES['file']['tmp_name'])) {
            move_uploaded_file($_FILES['file']['tmp_name'], $uploaddir.'/'.$_FILES['file']['name']);
            $searchfile = $uploaddir.'/'.$_FILES['file']['name'];
            return $searchfile;
        }
    }
    return false;
}

So I've got the uploading covered; I just need to generate the postfield array. I have the feeling the problem is with the brackets [ ] I'm using. I tried urlencoding the keys in the array, but that didn't work either. Does anybody have an idea on how to handle brackets in POST (multipart) data? Thanks in advance!
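A note on the bracket question: when CURLOPT_POSTFIELDS is given an array, curl sends multipart/form-data and uses the array keys verbatim as field names, brackets included, so the keys themselves should not be urlencoded. A minimal sketch of the array form with a file attached — the field names are taken from the post above, the target URL is hypothetical, and on PHP 5.5+ CURLFile replaces the old '@' prefix:

Code:
<?php
$ch = curl_init('http://example.com/profile'); // hypothetical target URL

$postfield = array(
    "social[comments]"  => "Test", // bracketed key is sent as-is as the part name
    "social[signature]" => "",
    // CURLFile marks the entry as a file upload (PHP 5.5+);
    // older versions used the '@/path/to/file' string prefix instead.
    "forum_avatar_file" => new CURLFile('temp/avatar.png', 'image/png'),
    "save_profile.x"    => "25",
    "save_profile.y"    => "9",
);

curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postfield); // array => multipart/form-data
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response = curl_exec($ch);
curl_close($ch);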
Hi, good evening! Thanks for the answer and all the help so far. Once more, the fetching part for a little parser script. Here we go — the new target URLs; see the overview:

http://dms-schule.bildung.hessen.de/index.html
http://dms-schule.bildung.hessen.de/suchen/suche_schul_db.html

Search by pressing the "type" button and then select all schools with the mouse; the result is 2400 schools. Some details for this target server:

http://dms-schule.bildung.hessen.de/suchen/suche_schul_db.html?show_school=9009
http://dms-schule.bildung.hessen.de/suchen/suche_schul_db.html?show_school=9742
http://dms-schule.bildung.hessen.de/suchen/suche_schul_db.html?show_school=9871

As you can see, I have to iterate over the pages with a loop, from show_school=1000 to 10000. After fetching the pages I have to see which ones are empty — those do not need to be parsed. I want to do this with curl-multi, since this is the most advanced way to do it. I see I have an array that can be filled; I have to try it out. I'm not sure, but judging by my low level of PHP I think I may need a double array to define the URLs. Can I do it like so?

Code:
$urls = array(
    "siteone" => "suche_schul_db.html?show_school=%i",
    "sitetwo" => "suche_schul_db.html?show_school=%i",
    "sitethree" => "suche_schul_db.html?show_school=%i"
);

$params = array(
    for ($i = 1; $i <= 10000; $i++) {
        // body of loop
    }
    // well, I have to define the variables in an open form like above,
    // rather than doing it like this: "siteone" => array(9009, 9742, 9871),
);

and then pass them to curl-multi as:

Code:
foreach ($urls as $id => $url) {
    foreach ($params[$id] as $param) {
        $i = $id . $param;
        $finalurl = sprintf($url, $param);
        $conn[$i] = curl_init($finalurl);

I don't know if that fits what I'm looking for, but I hope so ;-) Can I do it like so [see above]? What do you think? I send you many greetings, martin

Hello dear PHP friends, I am currently working on a little parser project and have to find solutions for (a) the fetching part and (b) the parser part. The target URLs and the show_school details are the same as in the post above: I have to iterate from http://dms-schule.bildung.hessen.de/suchen/suche_schul_db.html?show_school=1000 to 10000, and after fetching each page see which ones are empty, since those do not need to be parsed. I want to do this with curl-multi, since this is the most advanced way to do it. I have an array that can be filled, but I have to think about the string concatenation — I guess I have to do some sophisticated string concatenation. This one does not fit:

Code:
for ($i = 1; $i <= $match[1]; $i++) {
    $url = "http://www.example.com/page?page={$i}";

Besides this, I have an array and I can fill it. Can you help me run this in a loop with:

Code:
<?php
/************************************\
 * Multi interface in PHP with curl  *
 * Requires PHP 5.0, Apache 2.0 and  *
 * Curl                              *
 *************************************
 * Written By Cyborg 19671897        *
 * Bugfixed by Jeremy Ellman         *
\************************************/

$urls = array(
    "http://www.google.com/",
    "http://www.altavista.com/",
    "http://www.yahoo.com/"
);

$mh = curl_multi_init();

foreach ($urls as $i => $url) {
    $conn[$i] = curl_init($url);
    curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, 1); // return data as string
    curl_setopt($conn[$i], CURLOPT_FOLLOWLOCATION, 1); // follow redirects
    curl_setopt($conn[$i], CURLOPT_MAXREDIRS, 2);      // maximum redirects
    curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 10); // timeout
    curl_multi_add_handle($mh, $conn[$i]);
}

do {
    $n = curl_multi_exec($mh, $active);
} while ($active);

foreach ($urls as $i => $url) {
    $res[$i] = curl_multi_getcontent($conn[$i]);
    curl_multi_remove_handle($mh, $conn[$i]);
    curl_close($conn[$i]);
}

curl_multi_close($mh);
print_r($res);
?>
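Since both posts above circle the same step, here is one hedged way to put it together: generate the show_school URLs with a plain for loop (no sprintf placeholders or double array needed), and process them in batches so there are never thousands of simultaneous transfers; empty pages can be skipped after fetching by checking the body. The batch size of 50 is an arbitrary example value:

Code:
<?php
// Build the full URL list with plain string concatenation.
$urls = array();
for ($i = 1000; $i <= 10000; $i++) {
    $urls[$i] = "http://dms-schule.bildung.hessen.de/suchen/suche_schul_db.html?show_school=" . $i;
}

// Fetch in batches so the multi handle stays manageable.
foreach (array_chunk($urls, 50, true) as $batch) {
    $mh = curl_multi_init();
    $conn = array();
    foreach ($batch as $i => $url) {
        $conn[$i] = curl_init($url);
        curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 10);
        curl_multi_add_handle($mh, $conn[$i]);
    }
    do {
        curl_multi_exec($mh, $active);
    } while ($active);
    foreach ($conn as $i => $ch) {
        $body = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
        if (trim($body) === '') continue; // empty page: nothing to parse
        // ... hand $body to the parser part here ...
    }
    curl_multi_close($mh);
}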
Hi, I need a script to hide my IP address with a proxy and read a web page:

Code:
$username="myuser";

The script doesn't work; it doesn't show me the page output. Any solution?

Using curl, I've managed to get my program to log me into a ProBoards site, and I can view the main forum page. The problem is, the links for viewing a page look something like href="index.cgi?board=general&thread=1111&page=45". I did a str_replace to change index.cgi, giving href="link_processor?board=general&thread=1111&page=45". The idea was that link_processor would receive the data "board=general&thread=1111&page=45". However, I now realise that PHP sees that as separate GET variables — board = general, thread = 1111, page = 45 — rather than one string. How could I make it all part of one link_processor variable? If I can keep the string intact, I can just pass it to a curl function and display the page easily!
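One standard way to keep a query string intact is to pass it as a single urlencoded parameter value and read it back on the other side. A sketch — link_processor is the post's own hypothetical script, and the parameter name "target" is made up for illustration:

Code:
<?php
// When rewriting the links:
$query = "board=general&thread=1111&page=45";
$link  = 'link_processor?target=' . urlencode($query);
// => link_processor?target=board%3Dgeneral%26thread%3D1111%26page%3D45

// Inside link_processor:
$target = $_GET['target']; // PHP decodes it back to board=general&thread=1111&page=45
// ... pass $target to the curl function ...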
Could someone give me a cross-domain proxy script? I am trying to post data to MySQL databases on two servers.

I have this PHP script to fetch the whois information for a domain. It works, but when I try to connect to the whois server via a proxy, it doesn't work. The proxy IP is taken from proxylist.hidemyass.com. What am I doing wrong? Thank you for your help.

Code:
$server = "whois.nic.cz";
$domain = "klikzone.cz";

function QueryWhoisServer($server, $domain) {
    $proxy = "85.111.25.189:8080";
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $server);
    curl_setopt($ch, CURLOPT_PORT, 43);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_HEADER, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $domain . "\r\n");
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}

Hi, I'm trying to understand how I can block all users trying to view my website through proxies. With the following code I have made a quick version in PHP (with headers and ports) rather than at the firewall, which isn't exactly the best way but still stops a lot of them:

Code:
<?php
$user_ip = $_SERVER['REMOTE_ADDR'];
$headers = array('CLIENT_IP','FORWARDED','FORWARDED_FOR','FORWARDED_FOR_IP','VIA','X_FORWARDED','X_FORWARDED_FOR','HTTP_CLIENT_IP','HTTP_FORWARDED','HTTP_FORWARDED_FOR','HTTP_FORWARDED_FOR_IP','HTTP_PROXY_CONNECTION','HTTP_VIA','HTTP_X_FORWARDED','HTTP_X_FORWARDED_FOR');

foreach ($headers as $header) {
    if (isset($_SERVER[$header])) {
        header("Location: /proxy-not-allowed/");
        die;
    }
}

$queryIP = "SELECT `user_ip_address` FROM `my_table` WHERE `user_ip_address` = :user_ip_address AND `user_blocked` = :user_blocked LIMIT 1";
$queryIP1 = $pdo->prepare($queryIP);
$queryIP1->execute(array(':user_ip_address' => $user_ip, ':user_blocked' => 'No'));
$queryIP2 = $queryIP1->rowCount();

if ($queryIP2 === 0) {
    $ports = array(80, 81, 553, 554, 1080, 3128, 4480, 6588, 8000, 8080);
    foreach ($ports as $port) {
        $connection = @fsockopen($user_ip, $port, $errno, $errstr, 0.1);
        if (is_resource($connection)) {
            header("Location: /proxy-not-allowed/");
            die;
        }
    }
}
?>

The headers check blocks any proxy sending those headers, while the ports check blocks visitors listening on any of the assigned ports I add. I have tested this and it seems good, though it won't block all proxies. Is this the best way to go about blocking proxies if I don't have access to the firewall? What I am trying to do is allow users to view my HTTPS website normally and block all proxies. Even if I have some users blocked, I do not want them to be cheeky and use a proxy, or even register on my website through a proxy. I was thinking of just using port 443 as my website is HTTPS (is that wise?). Any advice would be great.

Hi guys, I am creating a script to detect proxy server levels:

Code:
<?php
// proxy levels
// Level 3: elite proxy, connection looks like a regular client
// Level 2: anonymous proxy, no IP is forwarded but the target site can still tell it's a proxy
// Level 1: transparent proxy, the IP is forwarded and the target site can tell it's a proxy
if (empty($_SERVER['HTTP_X_FORWARDED_FOR']) && empty($_SERVER['HTTP_VIA']) && empty($_SERVER['HTTP_PROXY_CONNECTION'])) {
    echo '3';
} elseif (empty($_SERVER['HTTP_X_FORWARDED_FOR'])) {
    echo '2';
} else {
    echo '1';
}
?>

I want the script to also check whether the IP is a CoDeeN/PlanetLab or botnet proxy server, and if so place it on level one and flag whether it is safe or unsafe to use. I cannot find the code for this. Please help me! Thanks in advance.
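For the CoDeeN/PlanetLab part there is no official lookup; one common heuristic, offered here only as a sketch, is a reverse-DNS check, since PlanetLab nodes typically resolve to hostnames containing "planetlab" or "planet-lab":

Code:
<?php
// Heuristic only: relies on reverse DNS, which the proxy operator controls.
function looksLikePlanetLab($ip) {
    $host = gethostbyaddr($ip); // returns the IP itself when no PTR record exists
    if ($host === false || $host === $ip) {
        return false; // no reverse DNS; inconclusive
    }
    return stripos($host, 'planetlab') !== false
        || stripos($host, 'planet-lab') !== false;
}

// usage: if (looksLikePlanetLab($_SERVER['REMOTE_ADDR'])) echo '1';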
Guys, I recently came across the idea of using a Tor SOCKS proxy as the default proxy server on my local home network. So, on my centos-box I've set the service up with the default SOCKS port 9050; the local IP address of this machine is 10.10.1.5. Here's the relevant part of Tor's config file:
SocksPort 10.10.1.5:9050

[jazz@centos-box ~]$ top -u jazz | grep tor
 3413 jazz  20  0  76256  32m  9720 S  0.0  0.3  0:01.55 tor
[jazz@centos-box ~]$ nmap -Pn 10.10.1.5 | grep 9050
9050/tcp open tor-socks

Now, I'm completely able to use that SOCKS proxy from the centos-box itself with my default browser, curl, or whatever you like. But if I go to my laptop and point its browser at that proxy socket, I get the message "TOR is not an HTTP proxy", and half or more (not all) of my bookmarked web sites don't work. However, when I start the service it says:

Sep 12 13:16:01.769 [notice] You configured a non-loopback address '10.10.1.5:9050' for SocksPort. This allows everybody on your local network to use your machine as a proxy. Make sure this is what you wanted.
Sep 12 13:16:01.769 [notice] Opening Socks listener on 10.10.1.5:9050

Ideas?
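A side note that may explain the laptop's error: "Tor is not an HTTP proxy" is what Tor answers when a client speaks plain HTTP to its SOCKS port, i.e. the browser was configured to use it as an HTTP proxy instead of a SOCKS v5 proxy. From PHP, curl can talk to the SOCKS port directly; a minimal sketch against the address from the post:

Code:
<?php
$ch = curl_init('http://check.torproject.org/');
// Tor's SocksPort speaks SOCKS, not HTTP, so the proxy type must be set.
curl_setopt($ch, CURLOPT_PROXY, '10.10.1.5:9050');
curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_SOCKS5);
// CURLPROXY_SOCKS5_HOSTNAME would additionally resolve DNS through the proxy.
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$page = curl_exec($ch);
curl_close($ch);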
I was wondering if there is a way, and if it's even possible, to determine the type of a proxy using PHP. By type I mean HTTP, SOCKS4 or SOCKS5. Using cURL, I think it's safe to assume that if a proxy returns a code of 200 then that proxy is good and is HTTP, correct? However, how would I go about determining the type of the proxies I have in a list, assuming they are good and are SOCKS4 and/or SOCKS5?
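There is no way to ask a proxy what it is, but one workable approach is to probe it: attempt the same small request once per proxy type and see which one succeeds. A sketch using curl's proxy-type constants; the test URL is a placeholder:

Code:
<?php
// Returns 'http', 'socks4', 'socks5', or false if nothing worked.
function detectProxyType($proxy, $testUrl = 'http://example.com/') {
    $types = array(
        'http'   => CURLPROXY_HTTP,
        'socks4' => CURLPROXY_SOCKS4,
        'socks5' => CURLPROXY_SOCKS5,
    );
    foreach ($types as $name => $type) {
        $ch = curl_init($testUrl);
        curl_setopt($ch, CURLOPT_PROXY, $proxy);
        curl_setopt($ch, CURLOPT_PROXYTYPE, $type);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);
        if ($code === 200) {
            return $name; // first type that completes the request wins
        }
    }
    return false;
}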
Using Slim to route endpoints to my application. In addition, I have many endpoints (mostly accessed via XHR) which need to be forwarded to another server, and I am using Guzzle to do so. Not only do I have to transfer text/json, I also have to send and retrieve files (currently only CSV files, but I will later add PDF). Accomplishing this was easier than I expected, but I suspect I may still be doing certain portions wrong. Does anything look off, especially the multipart forms for file uploads as well as the downloading of files? Thank you.

Code:
<?php
$app = new \Slim\App($container);

// Local requests
$app->get('/settings', function (Request $request, Response $response) {
    return $this->view->render($response, 'somePage.html', $this->bla->getData());
});
// more local endpoints...

$proxyEndpoints = [
    '/bla' => ['put'],
    '/bla/bla/{id:[0-9]+}' => ['delete', 'put'],
    '/foo/{id:[0-9]+}' => ['get', 'put', 'delete', 'post'],
    // more proxy endpoints...
];
foreach ($proxyEndpoints as $route => $methods) {
    foreach ($methods as $method) {
        $app->$method($route, function (Request $request, Response $response) {
            return $this->remoteServer->proxy($request, $response); // add content type if desired
        });
    }
}

Code:
<?php
class RemoteServer
{
    protected $httpClient, $contentType;

    public function __construct(\GuzzleHttp\Client $httpClient, string $contentType = 'application/json')
    {
        $this->httpClient = $httpClient;
        $this->contentType = $contentType;
    }

    public function proxy(\Slim\Http\Request $request, \Slim\Http\Response $response, string $contentType = null, \Closure $callback = null): \Slim\Http\Response
    {
        $contentType = $contentType ?? $this->contentType;
        if ($contentType !== 'application/json' && $callback) {
            throw new \Exception('Callback can only be used with contentType application/json');
        }
        $method = $request->getMethod();
        $bodyParams = in_array($method, ['PUT', 'POST']) ? (array)$request->getParsedBody() : []; // ignore body for GET and DELETE methods
        $queryParams = $request->getQueryParams();
        // Would be better to write Slim's body to Guzzle's body so that GET
        // parameters are preserved and not overridden by body parameters.
        $data = array_merge($queryParams, $bodyParams);

        $path = $request->getUri()->getPath();
        $contentTypeHeader = $request->getContentType();
        if (substr($contentTypeHeader, 0, 19) === 'multipart/form-data') {
            syslog(LOG_INFO, 'contentType: ' . $contentTypeHeader);
            $files = $request->getUploadedFiles();
            $multiparts = [];
            $errors = [];
            foreach ($files as $name => $file) {
                if ($error = $file->getError()) {
                    $errors[] = [
                        'name' => $name,
                        'filename' => $file->getClientFilename(),
                        'error' => $this->getFileErrorMessage($error)
                    ];
                } else {
                    $multiparts[] = [
                        'name' => $name,
                        'filename' => $file->getClientFilename(),
                        'contents' => $file->getStream(),
                        'headers' => [ // Not needed, right?
                            'Size' => $file->getSize(),
                            'Content-Type' => $file->getClientMediaType()
                        ]
                    ];
                }
            }
            if ($errors) return $response->withJson($errors, 422);
            $multiparts[] = [
                'name' => 'data',
                'contents' => json_encode($data),
                'headers' => ['Content-Type' => 'application/json']
            ];
            $options = ['multipart' => $multiparts];
        } else {
            $options = in_array($method, ['PUT', 'POST']) ? ['json' => $data] : ['query' => $data];
        }

        try {
            $curlResponse = $this->httpClient->request($method, $path, $options);
        } catch (\GuzzleHttp\Exception\RequestException $e) {
            // Errors only return JSON.
            // Networking error which includes ConnectException and TooManyRedirectsException.
            syslog(LOG_ERR, 'Proxy error: ' . $e->getMessage());
            if ($e->hasResponse()) {
                $curlResponse = $e->getResponse();
                return $response->withJson(json_decode($curlResponse->getBody()), $curlResponse->getStatusCode());
            } else {
                return $response->withJson($e->getMessage(), $e->getMessage());
            }
        }

        $statusCode = $curlResponse->getStatusCode();
        switch ($contentType) {
            case 'application/json':
                // Application and server error messages will be returned. Consider hiding server errors.
                $content = json_decode($curlResponse->getBody());
                if ($callback) {
                    $content = $callback($content, $statusCode);
                }
                return $response->withJson($content, $statusCode);
            case 'text/html':
            case 'text/plain':
                // Application and server error messages will be returned. Consider hiding server errors.
                $response = $response->withStatus($statusCode);
                return $response->getBody()->write($curlResponse->getBody());
            case 'text/csv':
                foreach ($response->getHeaders() as $name => $values) {
                    syslog(LOG_INFO, "headers: $name: " . implode(', ', $values));
                }
                if ($statusCode === 200) {
                    return $response->withHeader('Content-Type', 'application/force-download')
                        ->withHeader('Content-Type', 'application/octet-stream')
                        ->withHeader('Content-Type', 'application/download')
                        ->withHeader('Content-Description', 'File Transfer')
                        ->withHeader('Content-Transfer-Encoding', 'binary')
                        ->withHeader('Content-Disposition', 'attachment; filename="data.csv"')
                        ->withHeader('Expires', '0')
                        ->withHeader('Cache-Control', 'must-revalidate, post-check=0, pre-check=0')
                        ->withHeader('Pragma', 'public')
                        ->withBody($curlResponse->getBody());
                } else {
                    return $response->withJson(json_decode($curlResponse->getBody()), $statusCode);
                }
                break;
            default:
                throw new \Exception("Invalid proxy contentType: $contentType");
        }
    }

    private function getFileErrorMessage($code)
    {
        switch ($code) {
            case UPLOAD_ERR_INI_SIZE:
                $message = "The uploaded file exceeds the upload_max_filesize directive in php.ini";
                break;
            case UPLOAD_ERR_FORM_SIZE:
                $message = "The uploaded file exceeds the MAX_FILE_SIZE directive that was specified in the HTML form";
                break;
            case UPLOAD_ERR_PARTIAL:
                $message = "The uploaded file was only partially uploaded";
                break;
            case UPLOAD_ERR_NO_FILE:
                $message = "No file was uploaded";
                break;
            case UPLOAD_ERR_NO_TMP_DIR:
                $message = "Missing a temporary folder";
                break;
            case UPLOAD_ERR_CANT_WRITE:
                $message = "Failed to write file to disk";
                break;
            case UPLOAD_ERR_EXTENSION:
                $message = "File upload stopped by extension";
                break;
            default:
                $message = "Unknown upload error";
                break;
        }
        return $message;
    }

    public function callApi(\GuzzleHttp\Psr7\Request $request, array $data = []): \GuzzleHttp\Psr7\Response
    {
        try {
            $response = $this->httpClient->send($request, $data);
        } catch (\GuzzleHttp\Exception\ClientException $e) {
            $response = $e->getResponse();
        } catch (\GuzzleHttp\Exception\RequestException $e) {
            // Networking error which includes ConnectException and TooManyRedirectsException.
            if ($e->hasResponse()) {
                $response = $e->getResponse();
            } else {
                $response = new \GuzzleHttp\Psr7\Response($e->getCode(), [], $e->getMessage());
            }
        } catch (\GuzzleHttp\Exception\ServerException $e) {
            // Consider not including all information back to client.
            $response = $e->getResponse();
        }
        return $response;
    }
}
Hi,
I want to use a proxy in PHP with curl for scraping content, but some proxies do not support POST requests.
Please tell me how to check, before using a proxy, whether it supports POST requests. I would also like to measure the proxy's speed in ms.
Please help me out.
Thanks.
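A hedged sketch of such a check: send a small POST through the proxy to a page that echoes it back, and read the elapsed time from curl itself. httpbin.org/post is used here purely as a convenient echo endpoint — any URL whose expected response you know would do:

Code:
<?php
// Returns array('post_ok' => bool, 'ms' => float) for one proxy.
function checkProxyPost($proxy, $timeout = 10) {
    $ch = curl_init('http://httpbin.org/post'); // echoes the POST body back as JSON
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('probe' => 'hello'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
    $body = curl_exec($ch);
    $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200
        && $body !== false
        && strpos($body, 'hello') !== false; // the echo should contain our field
    // total transfer time in seconds, converted to milliseconds
    $ms = curl_getinfo($ch, CURLINFO_TOTAL_TIME) * 1000;
    curl_close($ch);
    return array('post_ok' => $ok, 'ms' => $ms);
}

print_r(checkProxyPost('1.2.3.4:8080')); // hypothetical proxy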