Years ago (not sure about current builds), I was in a similar situation
where my deployments had to remain offline, and I can highly recommend
apt-offline as an excellent solution to this problem. If I recall correctly,
all that is required is a portable drive, Python (any OS), and occasional
access to a reliable connection for updates.
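
From memory, the round trip looked roughly like this (a sketch only, not
verified against current builds; `/media/usb` and the file names are just
placeholders for wherever the portable drive is mounted):

```shell
# On the offline box: record what apt would need, as a signature file
# on the portable drive (needs root).
sudo apt-offline set /media/usb/offline.sig --update --upgrade

# On any machine with a connection: download everything the signature
# file asks for into a single bundle.
apt-offline get /media/usb/offline.sig --bundle /media/usb/bundle.zip

# Back on the offline box: feed the bundle into apt's local database,
# then upgrade as usual -- apt now reads from the local cache.
sudo apt-offline install /media/usb/bundle.zip
sudo apt-get upgrade
```

The nice part is that the offline machine never needs a connection at all;
only the signature file and the bundle travel on the drive.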

On Sat, May 6, 2017 at 7:01 AM, Michael . <keltoi...@gmail.com> wrote:

> Without some access to a repository you will not be able to obtain the
> packages you want to install. If they are available online as updates, it is
> recommended that you install them, especially security updates, at the time
> they are obtained. Updates are updates for a reason: they either close a
> security flaw or they fix a bug or functionality.
>
> So you have 3 options:
> 1. Connect to the net.
> 2. Obtain up-to-date discs each and every time an update is rolled out
> (highly impractical).
> 3. Use something like apt-offline.
>
> Cheers
>
> On 6 May 2017 at 21:31, Albretch Mueller <lbrt...@gmail.com> wrote:
>
>>  For more than one good reason (among them an at-times unreliable
>> Internet connection, or simply not wanting to go online),
>>
>>  I would like to run apt-get locally (or be able to do the same thing
>> functionally using dpkg or whatever). This is what I have in mind:
>>
>>  1) use apt-get in simulate mode to know which files I need to install
>> and in what order
>>
>>  2) fetch those files and keep them locally
>>
>>  3) install them locally whenever I need to
>>
>>  Most (all?) people simply run "sudo apt-get" under the assumption that
>> the back-end repositories will be reachable, etc.
>>
>>  Yes, I am trying to install stuff when I need it without having to
>> connect to the Internet.
>>
>>  How do you do this? What would be the pros and cons of doing things this
>> way?
>>
>>  lbrtchx
>>
>>
>
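
P.S. For what it's worth, the three-step plan in the quoted message maps
fairly directly onto stock apt-get/dpkg flags. An untested sketch, with
`some-package` standing in for whatever you actually want:

```shell
# 1) Simulate: show what would be installed, and in what order,
#    without touching the system.
apt-get -s install some-package

# 2) Fetch only: download the needed .deb files (package plus
#    dependencies) into /var/cache/apt/archives/ without installing.
sudo apt-get install --download-only some-package

# 3) Later, fully offline: install the cached .debs directly.
sudo dpkg -i /var/cache/apt/archives/*.deb
```

The catch is that dpkg will not fetch anything on its own, so the cache
(or the copy you carry on a drive) has to contain every dependency; the
simulate output from step 1 is your checklist.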
