{"id":207,"date":"2011-03-23T09:20:54","date_gmt":"2011-03-23T01:20:54","guid":{"rendered":"http:\/\/play.datalude.com\/blog\/?p=207"},"modified":"2011-03-23T09:24:39","modified_gmt":"2011-03-23T01:24:39","slug":"bandwidth-saving-apt-get-upgrade","status":"publish","type":"post","link":"https:\/\/play.datalude.com\/blog\/2011\/03\/bandwidth-saving-apt-get-upgrade\/","title":{"rendered":"Bandwidth Saving apt-get upgrade"},"content":{"rendered":"<p>I have a main desktop on my home LAN and a few notebooks, all running Ubuntu. I have a pretty slow Internet connection, so when a kernel update comes out it means running a 50 MB update on every machine. It struck me that this isn't the most efficient way of doing things. I experimented with the apt-cacher package, but it had two problems: first, it didn't work reliably and often crashed on the main desktop; second, it didn't work at all whenever I was outside my home LAN.<\/p>\n<p>So I did the Linux thing and made a quick and dirty script that works for me &#8230;<\/p>\n<p><!--more--><\/p>\n<p>First I tried to figure out how to mount a remote directory over the network and then add a line to apt's sources.list file to include that directory. I didn't get very far with that. Then I realised I could use rsync over ssh to update the local cache directory and run apt-get after that.<\/p>\n<p>OK, so my main machine is called \"desktop\". That's where I'll run the normal apt-get update commands and Update Manager. 
So I need to sync all those changes to \"laptop1\", \"laptop2\", etc.<\/p>\n<p>The first step isn't essential, but it makes things easier: setting up password-less login to the main machine.<\/p>\n<p>On the main machine, install openssh-server if you don't already have it.<\/p>\n<pre>sudo apt-get install openssh-server<\/pre>\n<p>On each of the laptop clients, generate an SSH key pair and copy the public key to the main machine.<\/p>\n<pre>ssh-keygen\r\nssh-copy-id desktop\r\n<\/pre>\n<p>More detailed instructions can be found by searching for ssh-keygen on the Internet.<\/p>\n<p>Assuming you can now log in to the main machine with no password, you can use the following script to sync the \/var\/cache\/apt\/archives directory (which is where all the downloaded packages are stored).<\/p>\n<pre>#!\/bin\/bash\r\n# Script to log into the main desktop, sync the deb package archive and run an upgrade.\r\n\r\n# Sync the package cache from the desktop to this machine\r\nsudo rsync -avz -e \"ssh -i \/home\/laptopusername\/.ssh\/id_rsa\" desktopusername@desktop:\/var\/cache\/apt\/archives\/ \/var\/cache\/apt\/archives\/\r\n\r\n# Upgrade using the freshly synced cache\r\nsudo apt-get update\r\nsudo apt-get -y dist-upgrade\r\n<\/pre>\n<p>So when I'm on my home LAN I can use this script, and when I'm elsewhere I can just use the normal apt-get \/ Update Manager method. If you need other ssh options, e.g. a non-standard port number, be sure to add them to the ssh command between the two double quotes.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I have a main desktop on my home LAN and a few notebooks, all running Ubuntu. I have a pretty slow Internet connection, so when a kernel update comes out it means running a 50 MB update on every machine. It struck me that this isn't the most efficient way of doing things. 
I &#8230; <a title=\"Bandwidth Saving apt-get upgrade\" class=\"read-more\" href=\"https:\/\/play.datalude.com\/blog\/2011\/03\/bandwidth-saving-apt-get-upgrade\/\" aria-label=\"Read more about Bandwidth Saving apt-get upgrade\">Read more<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_crdt_document":"","footnotes":""},"categories":[1,4],"tags":[],"class_list":["post-207","post","type-post","status-publish","format-standard","hentry","category-it","category-linux"],"_links":{"self":[{"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/posts\/207","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/comments?post=207"}],"version-history":[{"count":0,"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/posts\/207\/revisions"}],"wp:attachment":[{"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/media?parent=207"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/categories?post=207"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/play.datalude.com\/blog\/wp-json\/wp\/v2\/tags?post=207"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}