PHP - Some Sort Of Auto-updater / Sync Files Server-to-server?
Hi folks,
I'm curious whether I can, for example, save a file on my server and have it save to all the other servers as well - obviously only if they accepted the connection first. It's for a piece of software I developed that is almost complete, and I know there will be frequent updates to it. Instead of users downloading updates themselves, I want the update files on my server to somehow synchronize to their servers automatically. Is there a name for this? Thanks for any info.
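For illustration, one common shape for this is a pull-style updater: each client server periodically asks the vendor server for the current version number and downloads the update package if it is newer. A minimal sketch, with hypothetical URLs and paths, assuming each client server runs it from cron and has allow_url_fopen enabled:

<?php
// update_check.php - hedged sketch of a pull-style updater (hypothetical URLs and paths).
// Run from cron on each client server, e.g. hourly.

$remoteVersionUrl = 'https://update.example.com/latest_version.txt'; // assumption
$remotePackageUrl = 'https://update.example.com/latest_package.zip'; // assumption
$localVersionFile = __DIR__ . '/version.txt';

$remoteVersion = trim(file_get_contents($remoteVersionUrl));
$localVersion  = is_file($localVersionFile) ? trim(file_get_contents($localVersionFile)) : '0';

if ($remoteVersion !== '' && version_compare($remoteVersion, $localVersion, '>')) {
    $zipPath = __DIR__ . '/update.zip';
    file_put_contents($zipPath, file_get_contents($remotePackageUrl));

    $zip = new ZipArchive();
    if ($zip->open($zipPath) === true) {
        $zip->extractTo(__DIR__);          // overwrite the application files in place
        $zip->close();
        file_put_contents($localVersionFile, $remoteVersion);
    }
    unlink($zipPath);
}

A cron entry on each client server (for example, once an hour) is all that is needed to make the sync "automatic"; the vendor server only has to publish the version file and the zip.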
Similar Tutorials

What I'm trying to do is copy all files from one server to another folder on another server. Here is what I have so far:

<?php
// connection settings
$ftp_server = "server";
$ftp_user_name = "user";
$ftp_user_pass = "pass";
$dir = "/var/test/";
$destination_file = "/test/";

// set up basic connection
$conn_id = ftp_connect($ftp_server);

// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

// check connection
if ((!$conn_id) || (!$login_result)) {
    echo "FTP connection has failed!";
    echo "Attempted to connect to $ftp_server for user $ftp_user_name";
    exit;
} else {
    echo "Connected to $ftp_server, for user $ftp_user_name";
}

if (ftp_chdir($conn_id, $dir)) {
    echo " <br/>Current directory is now: " . ftp_pwd($conn_id) . "\n<p/>";
} else {
    echo "Couldn't change directory\n<p/>";
}

$buff = ftp_rawlist($conn_id, $dir);
foreach ($buff as $files) {
    echo $files . "<br/>";
    if (ftp_get($conn_id, $destination_file . "test.file", $dir . "test.txt", FTP_BINARY)) {
        echo "<br/>Successfully written to $destination_file\n";
    } else {
        echo "There was a problem\n";
    }
}
?>

That doesn't work. Any ideas? Thanks, Sean

Hey All, first post so please be gentle! I have a website, www.antiquesattic.co.uk, and I have been left in the lurch without it being fully finished. I want to add Google Analytics code to every page and I am a bit stuck. I have access to the server (Apache 2.2.19) and have been able to add site maps etc., but I am not sure where to paste the Analytics code. Which folder would I look in, and whereabouts do I paste it? I would be extremely grateful if someone could give me a quick step-by-step guide. Also, if you would like to have a look at the website and give me some feedback, that would be great! One final thing: I can't add content to the homepage, just auctions. Would it be an easy job to get someone to edit the site so I can add, say, 1000 words of text at the bottom of the homepage from my admin area? Apologies for the noob questions, but my expertise is in sales and marketing, not coding. Thanks, Chris

I have a PHP script that I wrote that enters information into a database and uploads images to the server. What I need to do now is delete the files from the server when the corresponding database records are deleted. How would I go about doing that?

Just thinking last night about speeding some things up: how hard is it to have a text file on the server that can only be edited by PHP? Or is there a way of making a folder unreadable to anyone? That way I could keep all the text files in the non-readable (non-viewable) folder so no user could edit them. James

Code: [Select]
<?php
exec('java -jar/MicroChatServer.jar', $output);
print_r($output);
?>

Hello friends. I have a chat program's .jar file and I have to integrate this chat into a PHP website that uses WampServer. I tried the above code, but it displays "Array()" instead of executing the jar file. What should I do? Any ideas?

EDIT: When starting this post, I thought it was causing the browser to make an extra request to the server. I've since found this wasn't the case; however, I didn't change the title of this post, and I can't seem to change it to something like "Critique of file caching script". I am trying to cache a file, and put together the following script. I put the following in the browser:

https://test.sites.example.com/administrator/index.php?cid=2&controller=sell&id=643341356

... and Apache will rewrite it as:

https://test.sites.example.com/index.php?admin=administrator&cid=2&controller=sell&id=643341356

index.php includes the following line:

<script src="/lib/js/clientConstants.php?b=1" type="text/javascript">

I'm trying to find a way to back up an entire server's files through PHP. I have a script that can put selected files into a zip file, but I'm not sure how to make sure that when I loop through directories and files I get every single file. I was thinking of just nesting foreach (glob('*') as $file) up to 10 levels deep, but I'm hoping there's something more definite than that method... Thanks
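For what it's worth, a recursive directory walk avoids guessing at the nesting depth. A minimal sketch using RecursiveDirectoryIterator with ZipArchive (the paths are hypothetical):

<?php
// backup.php - hedged sketch: zip everything under a directory tree.
$source  = '/var/www/html';            // assumption: the tree to back up
$zipFile = '/var/backups/site_' . date('Ymd_His') . '.zip';

$zip = new ZipArchive();
$zip->open($zipFile, ZipArchive::CREATE | ZipArchive::OVERWRITE);

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::LEAVES_ONLY
);

foreach ($iterator as $file) {
    if ($file->isFile()) {
        // store the path relative to $source so the archive has a clean layout
        $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($source) + 1));
    }
}

$zip->close();
echo "Wrote $zipFile\n";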
I set up functionality for a client that was moved to their server after I wrote and tested it. It does not work on their server, and I am trying to troubleshoot the problem as I do not have FTP access to the server. I believe I have narrowed it down to one of two things: either I do not know how to access the temporary upload files on their server (Windows IIS), or the server's security is not allowing uploads to the temporary directory.
Here is the form used to upload to the server.
<form enctype="multipart/form-data" action="#" method="post">
    <input type="hidden" name="upload_csv" value="TRUE" />
    <p><label for="userfile">Upload CSV:</label>
    <input type="hidden" name="MAX_FILE_SIZE" value="45000000" />
    <input type="file" id="userfile" name="userfile" title="userfile" value="" /></p>
    <p><input type="submit" name="upload_csv" value="Upload CSV" /></p>
</form>

Here is the test code:

<?php
$csvfile = $_FILES['userfile']['tmp_name'];
$size = $_FILES['userfile']['size'];

if ($_POST['upload_csv']) {
    //--------------------------------- validation code here
    if ($problem) {
        echo '<div class="error">Did not pass validation</div>';
        include("upload_csv.inc.php");
    } else {
        if (($handle = fopen($csvfile, "r")) !== FALSE) {
            //--------------------------------- this is where the code that imports the data from the CSV into the database goes
            echo '<div class="message"><p>CSV has been opened!</p></div>';
        } elseif (!$csvfile) {
            echo '<div class="error">Problem #1</div>';
        } else {
            echo '<div class="error">Problem #2</div>';
        }
    }
} else {
    include("upload_csv.inc.php");
}
?>

This code generated the error message "Problem #2". One more bit of information: the permissions on the server are set so that nothing can be uploaded by an external script. I do not know much about servers, but it seems to me that because the security is so tight on the server, that is the reason this does not work...? I would appreciate help with this; I am just not knowledgeable about how $_FILES['userfile']['tmp_name'] actually works.
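One way to narrow this down is to look at the error code PHP records for the upload before touching tmp_name. A hedged diagnostic sketch (drop it in above the fopen() call); on IIS, codes 6 and 7 usually point at upload_tmp_dir not existing or not being writable by the web user:

<?php
// Hedged diagnostic: report why the upload failed before using tmp_name.
if (!isset($_FILES['userfile'])) {
    die('No file field arrived at all - check post_max_size and the form markup.');
}

$err = $_FILES['userfile']['error'];
if ($err !== UPLOAD_ERR_OK) {
    // Common codes: 1/2 = file too big, 4 = no file chosen,
    // 6 = no temp dir, 7 = cannot write to the temp dir (permissions).
    die('Upload failed with error code ' . $err);
}

if (!is_uploaded_file($_FILES['userfile']['tmp_name'])) {
    die('tmp_name is not a valid uploaded file.');
}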
include("common/navigation.php"); ?> <div id="content-frame"> <div id="content"> <h1>Welcome</h1> <p>This is the content area. This is the content area. This is the content area. </p> </div><!-- end content --> </div><!-- end content-frame --> </div><!-- end right-frame --> <div class="clearFloats"></div> </div><!-- end main --></div><!-- end main-frame --> <? include("common/footer.php"); ?> </body> </html> Any help would be greatly appreciated. I have spent many hours on this. Regards I'm trying to make a simple website where people register to my website. When the user doesn't fill anything inside the boxes they get a message "Please fill all required fields" on the register.php page On my local host require_once works good. It shows up.
I'm trying to make a simple website where people register. When the user doesn't fill anything in the boxes, they should get the message "Please fill all required fields" on the register.php page. On my localhost the require_once works fine and the message shows up. But when I upload the files to my server, the require_once output does not show up on register.php. The page just refreshes and I don't get the message "Please fill all required fields".
This is the code that works on localhost but not on the live server: <?php require_once 'messages.php'; ?>
Here is my full code:
Register page:

<html>
<?php require_once 'messages.php'; ?>
<br><br>
<form action="register-clicked.php" method="POST">
    Username:<br>
    <input type="text" name="usernamebox" placeholder="Enter Username Here">
    <br><br>
    Email:<br>
    <input type="text" name="emailbox" placeholder="Enter email here">
    <br><br>
    Password:<br>
    <input type="password" name="passwordbox" placeholder="Enter password here">
    <br><br>
    Confirm Password:<br>
    <input type="password" name="passwordconfirmbox" placeholder="Re-enter password here">
    <br><br>
    <input type="submit" name="submitbox" value="Press to submit">
    <br><br>
</form>
</html>
Register clicked (register-clicked.php):

<?php
session_start();
$data = $_POST;

if (empty($data['usernamebox']) || empty($data['emailbox'])
    || empty($data['passwordbox']) || empty($data['passwordconfirmbox'])) {
    $_SESSION['messages'][] = 'Please fill all required fields';
    header('Location: register.php');
    exit;
}

if ($data['passwordbox'] !== $data['passwordconfirmbox']) {
    $_SESSION['messages'][] = 'Passwords do not match';
    header('Location: register.php');
    exit;
}

$dsn = 'mysql:dbname=mydatabase;host=localhost';
$dbUser = 'myuser';
$dbPassword = 'password';

try {
    $connection = new PDO($dsn, $dbUser, $dbPassword);
} catch (PDOException $exception) {
    $_SESSION['messages'][] = 'Connection failed: ' . $exception->getMessage();
    header('Location: register.php');
    exit;
}
messages.php:

<?php
session_start();
if (empty($_SESSION['messages'])) {
    return;
}
$messages = $_SESSION['messages'];
unset($_SESSION['messages']);
?>
<ul>
<?php foreach ($messages as $message): ?>
    <li><?php echo $message; ?></li>
<?php endforeach; ?>
</ul>
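A hedged debugging note, since nothing above is obviously wrong: on a live server the usual culprits are sessions that never persist (session.save_path not writable, or output sent before session_start()) and errors that are simply hidden. Temporarily turning error display on and checking that the session actually works often reveals the real message:

<?php
// Temporary debugging only - remove once the cause is found.
ini_set('display_errors', '1');
error_reporting(E_ALL);

session_start();                       // must run before any output is sent
$_SESSION['messages'][] = 'session test';
var_dump(session_id(), $_SESSION);     // an empty id or empty array points to a session problem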
I have PHP on the server side and C++ on the client side. What I am trying to do is constantly look for files in a folder on the server; if there is a file in the folder, download it on the client side and then delete it from the server folder.
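As a rough illustration of the server half, a small PHP script could list the folder, stream the oldest file and remove it once it has been sent; the C++ client then only has to poll that URL. File and folder names here are hypothetical:

<?php
// next_file.php - hedged sketch: hand out one queued file per request, then delete it.
$queueDir = '/var/www/queue/';                 // assumption: folder the server drops files into

$files = array_values(array_diff(scandir($queueDir), array('.', '..')));
if (empty($files)) {
    http_response_code(204);                   // nothing waiting (http_response_code needs PHP 5.4+)
    exit;
}

$name = $files[0];
$path = $queueDir . $name;

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
unlink($path);                                 // remove it only after it has been sent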
I need to count the number of image files on a remote server that is not on my network. I've come across a suggestion to use @getimagesize($img_url); that works but is really slow. Is anyone able to suggest a better method?

Hi, I've recently created a gallery website and I'm happy with the way everything currently works. However, the main drawback is that the site uploads using an HTML web form, which is great for remote users or the odd image. As I want to mass upload my existing collection, I need the ability to read a selected folder and then carry out all the same processes as the existing HTML form upload. I'm also using the GD library and checking file types to ensure they are within my allowed list, but I'm wondering if there are any other common security concerns I should be aware of to keep things a little safer if/when I publish outside of my LAN. So, in a nutshell, I need some assistance with changing my upload process to work for more than one file at a time, ideally by reading a folder, or at least reading X files at a time, processing them, then moving on to the next batch in the list. The next part I need help with is checking/improving the basic security of the system.

We are using a FileZilla server as our upload FTP server and we have encountered a problem uploading files. When we upload files of 1KB - 1.5MB it succeeds, but when we upload a larger file we get the error: "Filename cannot be empty in C:\wamp\www\gankgame\myaccount.php on line 53". Furthermore, when we upload a much larger file, 30MB+, it does not upload but shows no errors. Is there any way to configure the maximum upload size, or to fix these errors? Help us please. Our upload code is attached; please see the attachment.
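A hedged note on those symptoms: they match PHP's own upload limits rather than FileZilla. When a file exceeds upload_max_filesize, tmp_name comes through empty (hence "Filename cannot be empty"), and when the whole POST exceeds post_max_size the upload is discarded silently. Raising the limits in php.ini and restarting Apache is the usual fix; the values below are only examples:

; php.ini - example values, adjust to what you actually need
upload_max_filesize = 64M
post_max_size = 64M        ; must be at least as large as upload_max_filesize
max_execution_time = 300
memory_limit = 128M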
Following a tutorial on Udemy, I tried to learn the very basics of MVC structure. I built the project on my local server and it worked without giving me any errors, but when I tried it on a live server it does not work as it should, and shows no error. I tried to figure out the problem and found that, on every page load, it stops at the same line in my main.php file: <?php require($view); ?>. Starting from that line, it stops. I came here to share my problem but I am unable to upload my files here; if there is a way to upload and share my files, please guide me. The zip file of the whole project is 31.6 KB.
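A hedged guess at where to look: on most live Linux hosts display_errors is off, so a failed require dies silently, and Linux paths are case-sensitive where a local Windows setup is not. A small check just before the failing line in main.php will usually name the problem:

<?php
// Temporary diagnostic around the failing line - not part of the tutorial's code.
ini_set('display_errors', '1');
error_reporting(E_ALL);

if (!is_file($view)) {
    // Most common causes: a relative path that no longer resolves on the live
    // server, or a case mismatch (views/Home.php vs views/home.php).
    die('View file not found: ' . htmlspecialchars($view));
}

require $view;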
This isn't exactly an application design question, but rather a system design one. I am about to install an Inventory Control System in the store I work in. The store itself also owns a Linode VPS running CentOS 6.4, which hosts our website.
This new Inventory System will come with a built-in Microsoft SQL Server, and supposedly it is a SQL Anywhere database, but I'm not too sure what that means.
I need to make this database accessible from outside, but only via the Linode VPS. Setting restrictions for that is surely easy enough to address; that isn't my question.
My first idea is to put this server in the DMZ - easy, but it doesn't exactly sound safe. So my next idea was to put a middleman server in the DMZ; the Linode can send queries to that middleman server, and it will relay the data to the SQL Server and back. This is very vaguely described, I know, but I don't want to get too deep into details. Rather, I want to understand how I can create that middleman server, and what I could install onto it that would allow me to securely process queries.
My first thought was to install a web service that accepts an XML/JSON request and returns an XML/JSON response.
Then I realized that I don't have any experience setting up a web service like that.
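For the sake of illustration only, the middleman could be as small as a single PHP endpoint on the DMZ box that accepts a JSON request over HTTPS, runs a whitelisted, parameterized query against the inventory database, and returns JSON. Everything below is an assumption - the ODBC DSN, the shared secret, the table name - but it shows the shape of such a service:

<?php
// api.php on the middleman box - hedged sketch, not a production service.
header('Content-Type: application/json');

// Assumption: the Linode authenticates with a pre-shared key and only talks over HTTPS.
$headers = function_exists('getallheaders') ? getallheaders() : array();
if (!isset($headers['X-Api-Key']) || $headers['X-Api-Key'] !== 'replace-with-a-long-random-secret') {
    http_response_code(403);
    echo json_encode(array('error' => 'forbidden'));
    exit;
}

$request = json_decode(file_get_contents('php://input'), true);

// Assumption: an ODBC DSN named InventoryDSN points at the inventory database.
$pdo = new PDO('odbc:InventoryDSN', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Only expose named operations - never raw SQL from the caller.
switch (isset($request['op']) ? $request['op'] : '') {
    case 'stock_level':
        $stmt = $pdo->prepare('SELECT sku, qty_on_hand FROM inventory WHERE sku = ?'); // hypothetical table
        $stmt->execute(array($request['sku']));
        echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
        break;
    default:
        http_response_code(400);
        echo json_encode(array('error' => 'unknown operation'));
}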
What kind of options are out there? Ultimately, my question is: should I just put the server in the DMZ, or should I create the middleman? And if the latter, can someone point me in the right direction for getting a web service set up?

I am developing a CMS for my clients where the login to administer their website is through my own website. This unified back-end allows me to develop the functionality and user experience without having to update each site manually with the new files, or run a complicated macro and then upload them all individually. I would also like to share as much of the "client side" CMS code as possible, i.e. as many of the PHP files stored on the clients' websites as possible. All of the websites (and the back-end website) are located on the same VPS, running PHP 5.3. allow_url_include is not available; I could have it turned on, but that causes security issues if I understand correctly. I figure my main options are:
- Turn on allow_url_include and just include via IP/domain.
- Use an FTP system similar to the way WordPress allows remote installs of plugins and system updates.
- Include the files using a different path, e.g. /home/username/cmsincludes/.
Is it possible to include from another VPS account? I have tested it and it gives me a "failed to open stream" error. Example:
- Client xyz website - located: /home/xyz/public_html/ - CMS included: /home/mainsite/cms/
- Client 123 website - located: /home/123/public_html/ - CMS included: /home/mainsite/cms/
- Client xxx website - located: /home/xxx/public_html/ - CMS included: /home/mainsite/cms/
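As an illustration of the third option (purely a sketch; the file names and the open_basedir remark are assumptions about a typical per-account VPS setup): a "failed to open stream" error when including across accounts is usually a filesystem permission or open_basedir restriction rather than a PHP limitation, so the shared code directory has to be readable by each site's user and inside each account's allowed path list. The include itself is just an absolute path:

<?php
// At the top of each client site - hedged sketch with hypothetical file names.
define('CMS_PATH', '/home/mainsite/cms/');

// open_basedir (and directory permissions) must allow each account to read CMS_PATH;
// otherwise this is exactly where "failed to open stream" appears.
require CMS_PATH . 'bootstrap.php';
require CMS_PATH . 'functions.php';

If open_basedir cannot be relaxed, the WordPress-style approach of pushing the files to each account over FTP is the usual fallback.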
My server is Linux/Apache/PHP. When a file is uploaded, I use PHP's finfo_open to confirm that the file's content matches its extension, and I delete the file if it doesn't match. I also restrict which file MIME types and sizes can be uploaded.
Things I do with the files include:
- Upload users' files and store them in a public directory (/var/www/html/users_public_directory/), and allow other users to download them directly.
- Upload users' files and store them in a private directory (/var/www/users_private_directory/), and allow other users to download them using X-Sendfile (see the sketch after this list).
- Upload users' ZIP files and convert them to PDF files (unzip the ZIP file, then use LibreOffice and ImageMagick's convert to produce the PDFs).
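For reference, the X-Sendfile download mentioned above might look like the sketch below. It assumes Apache's mod_xsendfile is installed and that XSendFilePath covers the private directory; the script and parameter names are made up:

<?php
// download.php - hedged sketch: hand a private file to Apache via mod_xsendfile.
$base = '/var/www/users_private_directory/';
$name = basename(isset($_GET['file']) ? $_GET['file'] : '');   // strip any path component
$path = $base . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

// Access checks (who may download what) would go here.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('X-Sendfile: ' . $path);                                // Apache streams the file itself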
From the server's perspective, what are the risks of allowing users to upload files? Are some file types more dangerous to the server than others? Could they be executed on the server, and if so, how can this be prevented?
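On that last point, one widely used precaution - sketched here for an Apache setup like the one described, with the directory name as an assumption - is to make sure nothing in the public upload directory is ever handed to the PHP interpreter, so even a script that slips past the finfo check is served as an inert download:

# /var/www/html/users_public_directory/.htaccess  (Apache 2.4 syntax)
# Refuse to serve anything that looks like a PHP script from this directory.
# (AllowOverride must permit these directives; otherwise put the block in the vhost config.)
<FilesMatch "\.(?:php\d*|phtml|phar)$">
    Require all denied
</FilesMatch>

# Belt and braces: make sure nothing here is mapped to a script handler.
RemoveHandler .php .phtml .phar
RemoveType .php .phtml .phar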