
How can I transfer 12GB of data to a new server?


I'm in the middle of changing our server. We bought server space at a different location and need to transfer all the content from our current server to the new one. I need to move 12GB over to the new server. What would be the most convenient, fastest, and easiest way to transfer all that data to the new server... Doing a "physical" transfer is not an option.


I know it's going to be a loooong procedure. I'm testing with some data and it's taking a long, long time.


I'm actually hosting about 10 websites on our local server, so I would like to move all that data to the new server... I back up all the important files (/var/www, /home/, /var/lib/mysql, and others) with tar -czf, and it results in a 12GB tarball.

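For reference, the backup step might look something like the command below; the archive name and exact paths are illustrative, and copying /var/lib/mysql is only consistent if MySQL is stopped first (or a mysqldump is used instead):

# Stop MySQL (or use mysqldump) first so the database files are consistent
tar -czf /root/server-backup.tar.gz /var/www /home /var/lib/mysql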

I don't have a big upload pipe, so I need to do a reliable transfer during off-peak hours, 1am to 6am...


Any ideas?

4 Solutions

#1



rsync

rsync -av --progress localpath username@remote:remotepath

I transferred 300GB of data last night with this command.


No resume option is needed; if the transfer is interrupted, just run the command again and it will pick up where it left off.


Actually, rsync is incremental.

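As a minimal sketch of how that fits the off-peak requirement: the user, host, paths, and bandwidth cap below are placeholders; --partial keeps partially transferred files so a rerun resumes them, --bwlimit keeps the pipe from saturating, and timeout ends the run before the off-peak window closes.

rsync -av --partial --progress --bwlimit=500 /root/server-backup.tar.gz user@newserver:/root/

# Example cron entry: start at 1am, stop after 5 hours; rerunning each night simply resumes
0 1 * * * timeout 5h rsync -a --partial --bwlimit=500 /root/server-backup.tar.gz user@newserver:/root/ >> /var/log/offpeak-transfer.log 2>&1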

#2



If you have space on your drive, create a zip or rar archive with everything you need. Move the archive to the web root of one of your sites. Then use a "multi-threaded" HTTP download accelerator to fetch the file onto the new server. Such tools open multiple connections to the server and should therefore improve your download speed noticeably.


Make several smaller archives with your archive tool to get started faster and for better protection against transfer errors.


There are also FTP clients and servers with similar features, but I think the HTTP approach is easier.

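Assuming the archive sits in a web root on the old server, a hedged example of pulling it with parallel connections is aria2c (the URL below is a placeholder); plain wget -c also works, single connection but resumable.

# Fetch with up to 8 connections; -c continues an interrupted download
aria2c -x 8 -c http://oldserver.example.com/downloads/server-backup.tar.gz

# Or, resumable single-connection download
wget -c http://oldserver.example.com/downloads/server-backup.tar.gz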

#3



Since you're on a system with tar, you probably also have split, which can take your massive 12GB file and turn it into a number of smaller files.


Then transfer those "reliably" with FTP (run md5sum or cksum at both ends to check that every file copied correctly) and put them back together at the destination.


With pipelines, you shouldn't have to worry about any extra storage space required above and beyond the tarred-up file.

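A rough sketch of the split-and-verify workflow; chunk size and file names are placeholders.

# On the source server: split into 1GB pieces and record checksums
split -b 1G server-backup.tar.gz server-backup.part-
md5sum server-backup.part-* > server-backup.md5

# On the destination, after transferring the pieces and the checksum file
md5sum -c server-backup.md5      # confirms every piece arrived intact
cat server-backup.part-* > server-backup.tar.gz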

#4



Look into nc or netcat. You can set it up to listen on the new server and then push the 12GB file to it from the machine that holds it. If you're after security, though, scp is another option, but it will be far slower than nc (or so I remember reading).

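A bare-bones sketch of the netcat approach; host, port, and file names are placeholders, and the listener syntax differs between the traditional and OpenBSD nc variants (the latter drops -p).

# On the new server: listen on a port and write incoming data to disk
nc -l -p 9000 > server-backup.tar.gz

# On the old server: push the tarball to that port
nc newserver.example.com 9000 < server-backup.tar.gz

# Compare checksums on both ends afterwards
md5sum server-backup.tar.gz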

