PHP - Websites That Know Your Physical Location
How is it possible for a website to know my physical location?
Some websites I've seen will have ads such as "meet singles in HOMETOWNGOESHERE". I'm guessing this kind of info is found from my IP address? Has anyone ever done anything like this in PHP? I'm interested to know how this works.

Similar Tutorials

Okay, so I am scraping websites for their descriptions, keywords and titles. I noticed that a lot of websites use the same keywords and description on every page, so my idea is to scrape the index page, find all the links in it and scrape them all, then once they have been scraped, check all of the descriptions; if the descriptions match, pull some text unique to each page and use that instead. I can't seem to wrap my head around it. How would I accomplish this? I scrape with cURL, then find the keywords, description and title, then find all links on the site and scrape those. I was thinking of making an array of the descriptions and then checking and inserting into the db, but it doesn't seem like that would work. Any ideas? Also, how would I grab just the text from each page that is different from every other page?

Does anyone have an example of code that is used these days for combining JS and PHP to produce dynamic content on a webpage? I have looked at this from w3schools: https://www.w3schools.com/php/php_ajax_database.asp but I believe that XMLHttpRequest is really old. If I remember my reading right, it was replaced by fetch() a long time ago. I think I understand everything on that tutorial page, but I would love to see a video tutorial on this subject. Can anyone here point me to such a resource?

Here's a random example: macbundlebox.com/the-spring-bundle/ (I purposely excluded the http://www. so people won't think this is spam). Notice how there are 11 apps listed? So suppose someone buys the package; I'm assuming they get emailed a unique license/activation code to activate that particular software? Or they get a link to download the software? Or they email each customer a unique username/password to download it. I am not a PHP guru, but I am almost certain that this is done in PHP and works in one of three ways:
- Either the app developers give a giant list of licenses that are assigned by the webmaster per purchase
- A URL is pinged after each purchase, which sends back a license that is directly emailed to the customer
- A URL is pinged and the app developers send the license to the customer
So my question is, seeing as how most of you are very knowledgeable in PHP: is there any client management/billing software you know of that could accomplish the above and be easily managed by a non-PHP guru (perhaps even be integrated into a CMS like WordPress)? If not, how much would it cost (give or take) to have it built?

Is there a way I can scan one of my websites for directories and files if it is not on the same server? I can scan just fine on the same server, but not if it is on another server/website. Since it is not the same server, I tried to give the full path, http://www.site.com, but it would not work. Can this even be done?
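Back to the question in the title: yes, this is normally done by resolving the visitor's IP address against an IP-to-location database or web service (MaxMind GeoIP is a well-known example), then dropping the returned city into the ad copy. Below is a minimal sketch of the lookup side in PHP. The lookup URL and the JSON field names are placeholders made up for illustration, not any particular provider's API:
Code: [Select]
<?php
// Resolve the visitor's IP to a city name and use it in the ad text.
$ip = $_SERVER['REMOTE_ADDR'];

// Hypothetical JSON geolocation endpoint (substitute your real GeoIP service or local database)
$response = @file_get_contents('https://geoip.example.com/lookup?ip=' . urlencode($ip));

$city = 'your area'; // fallback when the lookup fails
if ($response !== false) {
    $data = json_decode($response, true);
    if (!empty($data['city'])) {
        $city = $data['city'];
    }
}

echo 'Meet singles in ' . htmlspecialchars($city) . '!';
Keep in mind that REMOTE_ADDR can be a proxy's address rather than the visitor's, and real sites usually query a local GeoIP database instead of making an HTTP call on every page view.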
Hi everybody,
I am one stressed person right now.
I had been away for a while only to return recently and find out that none of my websites (with Hostpapa) had been working.
I sent the technical support team an email and they told me that my websites were not PHP 5.4 compatible.
They also said:
"As register_globals is deprecated in PHP 5.4 and because of this you are receiving errors on your website. You will have to update your website code and remove register global references. If this can not be done then in that case the last option would be to upgrade to VPS hosting."
Why have things just suddenly changed?
How can I fix this problem?
It's affected many of the sites on my host - and in some cases - the submission forms are not working in connection with my Aweber account.
Please help as this has shocked me quite badly - and needs resolving asap.
Thank you in advance
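For what it's worth, the usual fix for code that breaks once register_globals is gone looks like the sketch below: any script that expected request or session values to appear as plain variables is changed to read the superglobals explicitly. The variable and field names here are made up for illustration; your forms will use different ones:
Code: [Select]
<?php
// Old style: with register_globals on, a request like page.php?name=Bob
// silently created $name. With it off (or removed, as in PHP 5.4), $name is undefined.

// Replacement: pull each value out of the superglobals yourself, with a default.
$name  = isset($_GET['name'])   ? $_GET['name']   : '';
$email = isset($_POST['email']) ? $_POST['email'] : '';

// The same applies to session values that old code read as bare variables.
session_start();
$member_id = isset($_SESSION['member_id']) ? $_SESSION['member_id'] : null;
The tedious part is finding every variable that is used without ever being assigned. Some people drop in a shim that calls extract($_REQUEST) to get old sites limping again, but that reintroduces exactly the security problem register_globals was removed for.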
I have an array of website links. I want to loop through each one and get a certain HTML id for each. What's wrong with what I did here?
Code: [Select]
foreach ($links as $page) {
    $phtml = file_get_html($page);
    // find all span tags with id="name"
    foreach ($phtml->find('span[id=name]') as $name) {
        echo $name->plaintext . '<br><br>';
    }
}

I've got a good one for you guys. I run 20 or so websites on my server, all with very similar coding structures. On every site I have a PHP script for a MySQL table that logs visitors, and I filter out bots (Google, Yahoo, etc., as many as I know of). Since new ones are discovered weekly or monthly, I hate manually going through all the sites and adding the new bot to the array of existing bots. I know there are a few ways to do this, but I'd like to know the most efficient way of doing one function/task/operation that automatically updates all websites to know the new bots, without manually modifying all 20 PHP files every time. The sites are all on the same server, but at different domains. A few ideas, which are probably not the best:
1) Have a universal text file located on one of my sites that just lists all the bots; each site would read from that text file to know which bots to filter.
2) Have one huge PHP file that holds the connection info/passwords for every site and runs a script to update each website's MySQL table, which would hold each bot as a row. The script would just add a new row with the new bot info, so when logging visitors each site would have to read the database every time.
3) Try to "include" a universal .php file located on one of my sites that has the array of all bots. But I've tried to include a file from another domain, and I know that's a security error; it won't include the file as .php but rather the output after it's been processed.
There's gotta be a better solution. Any ideas?

I have two websites: currentweb.com and copyweb.com (say). Using PHP, can I update the databases and upload files on both of these websites? (For both sites I have the user ID and password.) For example, if I am uploading a file (an image) on currentweb.com, can I upload it to copyweb.com at the same time? (I have the database and FTP passwords for it.) If yes, please help with code.
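On that last question (currentweb.com / copyweb.com): yes, in principle. As long as copyweb.com exposes FTP and its MySQL server accepts remote connections, the script running on currentweb.com can push to both in the same request. A rough sketch with placeholder hostnames, paths and credentials, and with most error handling left out:
Code: [Select]
<?php
$filename = 'photo.jpg';

// 1) Upload the freshly saved image to the second site over FTP.
$ftp = ftp_connect('ftp.copyweb.com');                        // placeholder host
if ($ftp && ftp_login($ftp, 'ftp_user', 'ftp_pass')) {
    ftp_pasv($ftp, true);
    ftp_put($ftp, '/public_html/uploads/' . $filename,        // remote path
                  '/home/currentweb/uploads/' . $filename,    // local path
                  FTP_BINARY);
    ftp_close($ftp);
}

// 2) Insert the same row into both databases.
$local  = new mysqli('localhost',      'user1', 'pass1', 'currentweb_db');
$remote = new mysqli('db.copyweb.com', 'user2', 'pass2', 'copyweb_db');   // remote MySQL must allow this

foreach (array($local, $remote) as $db) {
    $stmt = $db->prepare('INSERT INTO images (filename) VALUES (?)');
    $stmt->bind_param('s', $filename);
    $stmt->execute();
    $stmt->close();
}
If the second host blocks remote MySQL connections (many shared hosts do), the usual fallback is a small authenticated script on copyweb.com that currentweb.com posts the data to with cURL.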
I just want to know: whenever we download a file from the server to the desktop through FTP, there is extra whitespace appearing in the files.
With HTML websites we always remove the whitespace when we upload the file back to the server after editing it. Is this step necessary for PHP websites as well? If it is necessary, how do I do it?
I have developed a form for my website that directs the user to a different site and pre-populates the form on the other site with the information I want. I would like to take it one step further and make the script below fill out the form on the other site and then submit it. Currently it just fills out the form on the other site but does not submit it. Is there any way to do this?
Code: [Select]
<form action="http://www.privatedutyhomecare.org/sections/consumers/locator.php" method="POST">
  <input name="state" size="25" type="hidden" value="MI" />
  <input name="company" size="25" type="hidden" value="Affordable HomeCare" />
  <button type="submit" class="imgpx">
    <img src="assets/images/logos/npda.png" alt="submit" width="51" class="nothing" height="40" />
  </button>
</form>

Code: [Select]
if ($login_count == 1) {
    while ($row = mysql_fetch_array($login_sql, MYSQL_ASSOC)) {
        $_SESSION['MembersID'] = $row['members_id'];
        session_register('MembersID');
    }
}
This works fine and logs me in on my website with the script, and it displays the correct ID with:
Code: [Select]
if (isset($_SESSION['MembersID'])) {
    $session_id = $_SESSION['MembersID'];
    echo '<li><a href="https://members.selmgec.co.uk/">Welcome, ' . $session_id . '</a></li>';
}
But how can I pass it on to another website I own, using the same database connection?

Is it possible to collect data from another website and insert it into my db? Let's say, for example, http://www.imdb.com/title/tt0285331/episodes#season-1. Could I somehow get the episode name, e.g.:
Quote
Episode 1: 12:00 a.m.-1:00 a.m.
and the description:
Quote
Jack Bauer is called to his office because there's a threat on the life of a US Senator who's running for President; Jack also discovers that his daughter has skipped out her bedroom window.
and place them into a table in my db? Any help would be great.

Hi guys, I'm developing a health site using PHP and MySQL. I want to create graphs from the data in the database on a weekly basis, so please tell me what is the best method of creating graphs and charts in PHP. Thanks in advance.

Hello, I made a website (PHP) for a music company a few years back. They basically sell songs from their site. Now they've run into a problem: their site has been crawled by abmp3.com, which not only lets users download songs from our site but also gives the full path of each song to download. I wonder how this happened and how to stop them from crawling our site, and other sites are likely doing the same. So please, can anybody help me overcome this problem? Maybe some PHP code can do this? Thanks, watsmyname

I know that it is possible to see what is sent by looking at LiveHTTPHeaders, but how do you know how hidden variables are generated? I don't think that's possible to see from LiveHTTPHeaders.
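On that last point: LiveHTTPHeaders can only show what the browser sends, not how the server computed a hidden value, but for scraping purposes you usually don't need to know. The common approach is to fetch the page containing the form first, read whatever hidden inputs it holds, and echo them back in your own POST. A sketch with a placeholder URL and field names (some sites will still reject requests that lack a real session cookie):
Code: [Select]
<?php
// Fetch the page that contains the form.
$html = file_get_contents('http://www.example.com/form.php');

// Collect every <input type="hidden"> name/value pair.
$doc = new DOMDocument();
@$doc->loadHTML($html);                      // @ silences warnings about sloppy markup

$hidden = array();
foreach ($doc->getElementsByTagName('input') as $input) {
    if (strtolower($input->getAttribute('type')) === 'hidden') {
        $hidden[$input->getAttribute('name')] = $input->getAttribute('value');
    }
}

// Merge the hidden values with your own fields and POST the lot back.
$fields = array_merge($hidden, array('state' => 'MI', 'company' => 'Affordable HomeCare'));

$ch = curl_init('http://www.example.com/form.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
The same cURL POST idea also covers the earlier question in this batch about submitting the locator form automatically instead of waiting for the visitor to click the button.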
Hi guys,
What I'm trying to accomplish sounds like a fairly easy task, but due to my poor knowledge of PHP it has turned out to be quite a challenge.
What I'm trying to do is make a PHP script that will search for keywords on multiple websites.
The websites I will search are all web shops selling spare parts for home appliances, and the keywords used are usually original spare part codes.
When searching, the script uses each website's own search function, not Google or another search engine.
The input for this script should be a CSV file containing the list of keywords and the URLs of the web shops that need to be searched for all these keywords. Here is an example: http://prntscr.com/4ebhxh
The script should work like this: it picks up the 1st keyword, browses to URL1, uses its search and looks for the product. If it finds it, it copies its price and writes it back to the original input CSV. If it doesn't find a match (the search results come up empty), it should write "No match found" and continue to URL2, URL3 and so on. When all URLs from the list are checked for the 1st keyword, the script picks up the 2nd keyword and continues until all keywords are checked. This would be the resulting CSV file after the 1st keyword is checked: http://prntscr.com/4ebj52
After all data from the input CSV file is processed, the script should show a message and create a download link for the resulting CSV file. If there are multiple matches, in other words if for one keyword a website's search finds 2 or more products, something like "More than one match" should be written in the file. Example: http://prntscr.com/4ebkcx
Please note that none of the websites use SSL and none of them require a login in order to display the prices; this should make the script easier to build. It's not important for this script to run fast (I think it's better to run it with some timeouts because of server glitches and bottlenecks). What is more important is to make it automatic, so it can be started overnight or over the weekend. The number of URLs would be around 10, and the list of keywords anywhere from a few tens to a few hundred. If I can provide additional clarification and info, I'm available. Of course I would be willing to pay someone to help me accomplish this task. Cheers, Dean
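Nobody can write the whole thing in a forum post, but the skeleton of the loop described above might look like the sketch below. Everything specific (the CSV layout, how each shop's search URL is built from a keyword, and the pattern used to spot prices on a results page) is an assumption that would have to be adapted per site:
Code: [Select]
<?php
// For each keyword, query each shop's own search page and record the price.
// Assumes input.csv has a header row of search-URL templates (with %s where
// the keyword goes) and one keyword per following row in column 0.

function fetch_page($url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_TIMEOUT        => 30,
    ));
    $html = curl_exec($ch);
    curl_close($ch);
    return $html;
}

$in  = fopen('input.csv', 'r');
$out = fopen('results.csv', 'w');

$shops = fgetcsv($in);        // header row: keyword column + URL templates
array_shift($shops);          // drop the "keyword" column header

while (($row = fgetcsv($in)) !== false) {
    $keyword = $row[0];
    $result  = array($keyword);

    foreach ($shops as $template) {
        $html = fetch_page(sprintf($template, urlencode($keyword)));

        // Placeholder extraction: grab anything inside class="price" elements.
        // In practice you would use a per-site selector (DOMXPath, simple_html_dom, ...).
        preg_match_all('/class="price"[^>]*>([^<]+)</', $html, $m);

        if (count($m[1]) === 1) {
            $result[] = trim($m[1][0]);
        } elseif (count($m[1]) > 1) {
            $result[] = 'More than one match';
        } else {
            $result[] = 'No match found';
        }
        sleep(2);             // the timeouts you mentioned, to go easy on the shops
    }
    fputcsv($out, $result);
}
fclose($in);
fclose($out);
Run from the command line (php scraper.php) it can churn away overnight; the download link at the end is then just a matter of pointing the browser at results.csv.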
I have created 5 websites under 5 different domains. The contents of these websites are similar and they all use the same template. Now I need to create an admin panel to control these websites; I will use PHP and MySQL for the backend.
My problem is: how can I manage these five websites with one backend? The reason I ask is that the backend will live on a different domain, and all 5 of my clients will use this same backend system to manage their own websites.
So can I ask the professionals here: is it possible to display MySQL data on these 5 websites this way? If it is possible, then how?
Any links to articles or tutorials would be welcome and appreciated.
NOTE: I checked the MySQL FEDERATED storage engine, but I have no idea whether that is the way I need to go. Thank you.

I need to know if this is even possible; I have found conflicting information on this online. Anyway, here is what I want to do. I have two websites that reside on the same dedicated server. Both have different IPs: www.mywebsite1.com and www.mywebsite2.com. I need a user that logs in at www.mywebsite1.com to be able to pass the $_SESSION from www.mywebsite1.com to www.mywebsite2.com. Any help on this would be cool. Thanks in advance. Ryan
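For the two-sites question just above: the browser will not send mywebsite1.com's session cookie to mywebsite2.com, even though both sites live on the same server, so you cannot literally share $_SESSION through the cookie. Two common workarounds are storing sessions in a database both sites read, or handing the user across with a short-lived signed token. A sketch of the token handoff (the shared secret and URLs are placeholders):
Code: [Select]
<?php
// On www.mywebsite1.com, after a successful login: redirect across with a signed token.
$secret  = 'shared-secret-known-to-both-sites';     // placeholder, keep it out of public view
$payload = $_SESSION['MembersID'] . '|' . time();
$sig     = hash_hmac('sha256', $payload, $secret);
header('Location: http://www.mywebsite2.com/sso.php?p=' . urlencode($payload) . '&s=' . $sig);
exit;
And on the receiving side:
Code: [Select]
<?php
// www.mywebsite2.com/sso.php: verify the token and start a local session.
session_start();
$secret = 'shared-secret-known-to-both-sites';

list($memberId, $issuedAt) = explode('|', $_GET['p'], 2);
$expected = hash_hmac('sha256', $_GET['p'], $secret);

if ($expected === $_GET['s'] && (time() - (int) $issuedAt) < 60) {
    $_SESSION['MembersID'] = $memberId;             // the user is now logged in here too
    header('Location: /');
} else {
    header('HTTP/1.1 403 Forbidden');
    echo 'Invalid or expired token';
}
exit;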
Hello!
I have five websites on a single server, and each website uses the same geographic data, for instance US zip codes and city/state latitudes and longitudes. Each website currently uses its own copy of these "large" tables.
I would like to avoid having to maintain the same data in 5 databases, but I am more concerned with database performance, i.e., the speed at which the data is retrieved for each website.
Question: should I set up a new, separate database that contains the shared tables and allow each website to connect to it for the shared data? Or is it better for each website to have its own copy of the duplicate tables?
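One arrangement worth sketching: keep a single shared database (call it shared_geo, a made-up name) on the server and let every site query it alongside its own database. Because one MySQL connection can read from any database its user has rights on, each site can even join its own tables against the shared ones, so there is no extra connection overhead. Table and column names below are placeholders:
Code: [Select]
<?php
// Each site keeps its own database but reads geographic data from shared_geo.
$db  = new mysqli('localhost', 'site_user', 'site_pass', 'site1_db');
$zip = '90210';

// Plain lookup against the shared zip table.
$stmt = $db->prepare('SELECT city, state, latitude, longitude
                        FROM shared_geo.zip_codes
                       WHERE zip = ?');
$stmt->bind_param('s', $zip);
$stmt->execute();
$row = $stmt->get_result()->fetch_assoc();   // get_result() needs the mysqlnd driver
$stmt->close();

// Cross-database joins work the same way, e.g. members in a given zip code:
// SELECT m.name, z.city
//   FROM site1_db.members AS m
//   JOIN shared_geo.zip_codes AS z ON z.zip = m.zip;
Performance should be a wash compared with per-site copies as long as everything stays on the same MySQL server; what you gain is not having to keep five copies of the zip table in sync.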
Thanks for any info!
Cheers