Re: systemd-fstab-generator failed to create mount

2017-05-07 Thread Dominique Dumont
On Friday, 5 May 2017 18:32:09 CEST Charles Kroeger wrote:
> the whole error message is: systemd-fstab-generator failed to create mount
> unit file /run/systemd/generator/-.mount as it already exists. Possible
> duplicate entry in /etc/fstab?

I believe this error message is correct. All the HDDs in your fstab file are
mounted on '/'.

The three disks should be mounted on different mount points.
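For illustration, a corrected fstab might look like this (the UUIDs and the
/home and /data mount points below are made up; the point is that '/' appears
only once):

```
# /etc/fstab -- one line per filesystem, each with its own mount point
# (UUIDs are placeholders)
UUID=aaaa-1111  /      ext4  errors=remount-ro  0  1
UUID=bbbb-2222  /home  ext4  defaults           0  2
UUID=cccc-3333  /data  ext4  defaults           0  2
```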

> if systemd-fstab-generator is now making its own fstab file, should we
> delete our old one?

Neither. From the systemd-fstab-generator man page: "systemd-fstab-generator is a
generator that translates /etc/fstab (see fstab(5) for details) into native
systemd units"

You must fix your fstab file.

All the best

-- 
 https://github.com/dod38fr/   -o- http://search.cpan.org/~ddumont/
http://ddumont.wordpress.com/  -o-   irc: dod at irc.debian.org



Re: need a tutorial on setuid

2017-05-07 Thread tomas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On Sat, May 06, 2017 at 10:40:59PM -0500, Michael Milliman wrote:
> I can't see how dd could have been the culprit [...]

Definitely. The file system's inner structure isn't known to dd. One
possibility is that the subsequent mount is suppressing the suid bit
(a security feature: imagine I give you a USB stick with a suid
binary which you invoke without noticing...).

Check mount's suid option.
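As a sketch, you can inspect the mount options and see how the suid bit shows
up in a file's mode string (the /media/usb path is hypothetical; adjust to
wherever the stick is mounted):

```shell
# Show the options a filesystem was mounted with (path is hypothetical;
# look for "nosuid" in the output):
findmnt -no OPTIONS /media/usb 2>/dev/null || true

# How the suid bit appears in a file's mode string:
touch demo
chmod u+s demo
stat -c '%A' demo   # capital 'S' in the user-execute slot: suid set, no exec
chmod u+x demo
stat -c '%A' demo   # lowercase 's': suid plus execute
rm demo
```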

Cheers
- -- tomás
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.12 (GNU/Linux)

iEYEARECAAYFAlkO3V8ACgkQBcgs9XrR2kaClgCeNhEyP5uCVlTecd/62KUjujld
AAMAn00R6ZKHq0BH3KWjemcDdzgi/L5u
=WoU5
-END PGP SIGNATURE-



Re: Live File System Backup

2017-05-07 Thread tomas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On Sun, May 07, 2017 at 10:53:51AM +1200, Ben Caradoc-Davies wrote:

[also a reply to Henrique, elsewhere in this thread]

> If a file is updated while it is being copied, it may contain only
> half a change set and be in an internally inconsistent state,
> perhaps making it unusable as a backup. Writes are typically not
> atomic. The same problem applies to collections of files that
> reference each other.
> 
> Kind regards,

Ben, Henrique -- no question. This is what I subsumed under
"skew": application state may be dispersed across different
places in a file, across different files or even partly not
in files at all (e.g. in RAM: imagine a BTree with just parts
of its pointer structure not yet committed to disk).

Of course you can't ever win unless you collaborate with the
application in those cases (even with magic file systems like
ZFS or btrfs).

Then there is this subtle "file data" and "file metadata"
thing, which is an issue even with carefully designed applications
and file systems. It's even difficult to reach a consensus on
what is "right", remember the ext3/ext4 data loss episode?

This is where snapshotting magic, be it built-in (zfs, btrfs)
or bolted-on (overlayfs, lvm) might help a bit: freeze a snapshot,
back up that (in the first case, the file systems provide a native
way to do that, in the second case, rsync is a pretty viable
way of doing things). I said "might help a bit" because the
ultimate consistency criterion is the application! A consistent
file system view might just be this truncated-to-zero file,
only the application "knows" at that point (e.g. by keeping
its data in an already unlinked file which is still open,
or somewhere in RAM, or...).

So your choices are

 - for the applications you really care about, look into
   what they are doing. Grown up apps will support you in
   that (I gave the PostgreSQL example above). Typically
   you can wrap the backup process in guards like ("keep
   your on-disk state consistent"[1]..."now you can relax").
   Note that to avoid races this structure is more or less
   necessary. The only real difference to the "magic
   snapshot" thing is that the latter happens very quickly.

 - for all the others... just relax.

Otherwise, "on line" backup is simply not an option.

cheers

[1] That doesn't mean necessarily frozen. PostgreSQL, for example,
   continues writing to the WAL, it just eats through its storage
   at a higher pace.

- -- tomás
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.12 (GNU/Linux)

iEYEARECAAYFAlkO7SgACgkQBcgs9XrR2kYf9ACeM2njgrSttOUPRk4D6fJqJtjQ
qmkAn38VbkKiOlADe+33teN8uzcbLa2C
=uKNk
-END PGP SIGNATURE-



Re: Console fonts SOLVED :-)

2017-05-07 Thread GiaThnYgeia
That is great news!
I am now wondering whether, instead of 7-10" screens, there is a tiny
battery-powered projector out there that you can use with it, as my tired
old eyes find even my 21" tiring :)

Now Debian can add the UDOO to its architecture list from 9 onwards.

2 thumbs up

Larry Dighera:
> I gave up on Jessie, and installed Stretch as you and others in the 
> debian-users mailing list advised. 
> 
> Installing Debian Stretch (testing) from this link: 
> 
>  on the Udoo X86 required placing the rtl8168g-2.fw driver in the root 
> directory of the USB ISO installation medium to support the Edimax Nano 
> 150Mbps Wireless 802.11b/g/n USB Adapter. 
> The driver was downloaded from this page:
>  https://packages.debian.org/source/stretch/firmware-nonfree
>   
> http://http.debian.net/debian/pool/non-free/f/firmware-nonfree/firmware-nonfree_20161130.orig.tar.xz
> 
> The specific 'rtl8168g-2.fw' driver required was extracted from the 
> compressed tar archive with 7-Zip.
> 
> Stretch runs great! 
> 

-- 
 "The most violent element in society is ignorance" rEG

"Who died and made you the superuser?"  Brooklinux

"keep rocking in the non-free world" Neilznotyoung



Re: Live File System Backup

2017-05-07 Thread rhkramer
On Sunday, May 07, 2017 05:47:20 AM to...@tuxteam.de wrote:



> Otherwise, "on line" backup is simply not an option.

Thanks!  I've been paying at least peripheral attention to this thread (as I
do to many), and got to thinking / wondering about how to deal with backup in
an "enterprise" type of setting (by that, I mean an "application" that is
distributed and run over multiple computers / servers to serve a large
number of (possibly transient) users).

I guess the way that comes to mind, maybe somewhat implied by what you've 
written (or at least inferred by me ;-) would be to take down one or more of 
the servers at a time, make static backups of those, then restore those to 
service and repeat until all have been backed up, very likely repeating that 
continuously.

I'm sure there are some subtleties that I'm not even thinking about (not that 
I'd expect to be able to do so).

I guess I sort of wanted to write that down and see if anyone had any insights 
into possibly better ways.

Oh, OK, I guess there is also something like RAID (well, I know RAID isn't
really for backup, it's for uptime, as is often stated on this and other
lists), where relevant data is written to more than one disk and/or storage
"farm": one being the "in service" device (to serve data to the app in "real
time") and one or more just recording data as backup.

Anyway, I'd be interested in any comments.

I was going to mark this OT, but then I re-read the Subject, and it still 
seems OT (on topic) ;-)



Re: Console fonts SOLVED :-)

2017-05-07 Thread rhkramer


On Sunday, May 07, 2017 06:23:00 AM GiaThnYgeia wrote:
> That is great news
> I am now wondering instead of 7-10" screens if there is a tiny battery
> powered projector out there that you can use with it, as my tired old
> eyes find even my 21" tiring :)

Well, I've never had one, but I have seen small portable projectors advertised 
at what seemed to be ridiculously low prices (and fairly low lumens)--I don't 
really remember lumens or prices, but I'm guessing that, because I thought 
they were ridiculously low they might have been around $50 or less and 400 
lumens or less (and maybe 640x480 or less).

At the time(s), I didn't pay any attention to whether they were battery 
operated or not, but a typical advertised "use case" was to take to a customer 
site to show a video to one or a small group of (potential) customers.

ATM, I don't plan to try googling for them, but, having said that, after I 
send this, I just might do that, as my curiosity is re-aroused.  If I come 
across something, soon or further in the future, I'll try to post it here.



Re: Console fonts SOLVED :-)

2017-05-07 Thread rhkramer
On Sunday, May 07, 2017 08:49:13 AM rhkra...@gmail.com wrote:
> ATM, I don't plan to try googling for them, but, having said that, after I
> send this, I just might do that, as my curiosity is re-aroused.  If I come
> across something, soon or further in the future, I'll try to post it here.

Oh, try googling and/or searching ebay for ["video projector" cellphone]--you 
will get plenty of links to explore.



Re: Console fonts SOLVED :-)

2017-05-07 Thread rhkramer
On Sunday, May 07, 2017 08:54:26 AM rhkra...@gmail.com wrote:
> On Sunday, May 07, 2017 08:49:13 AM rhkra...@gmail.com wrote:
> > ATM, I don't plan to try googling for them, but, having said that, after
> > I send this, I just might do that, as my curiosity is re-aroused.  If I
> > come across something, soon or further in the future, I'll try to post
> > it here.
> 
> Oh, try googling and/or searching ebay for ["video projector"
> cellphone]--you will get plenty of links to explore.

Oh, two things:

   * on ebay, I found a lot of things that I'll call "passive" projectors, 
which apparently consist of a cardboard box with a lens on the front (and 
maybe one or more mirrors)--designed to have a cellphone placed in the unit 
and then the image projected through the lens.  I would expect the image to be 
pretty dim.

   * here's a "non-passive" one that seems to have lots of inputs, can use 12 
VDC, etc.  (I've copied the specifications, below, and note that I picked this 
one randomly from ebay--it was one of the ones near the top of the search 
results):

60'' Multimedia Projector Home Cinema Theater HDMI VGA TV USB For PC Cellphone

http://www.ebay.com/itm/60-Multimedia-Projector-Home-Cinema-Theater-HDMI-VGA-TV-USB-For-PC-Cellphone-/201423728262

I guess the fact that the native resolution is 640x480 but it can accept
1920x1080 input means that the image is downscaled to 640x480. I suspect
most of them work this way, but there may be exceptions.


Projection Technology   LCD
Native Resolution   VGA (640x480)
Supported Resolution1080P (1920x1080)
Input Voltage(V)100-240V,12V, 2000mA
Brightness(Lumens)  500
Brightness Range100 to 999 Lumens
Aspect Ratio4:3 and 16:9
Contrast Ratio  400:1
OSD Languages   Chinese, Russian, Portuguese, Italian, Spanish, German, 
French, English
Projection Screen Size (inch)   20-60 Inch
Projection Distance (m) 1.25-4M
Video Formats   3GP, MP4, H.264, MPEG, VOB, MOV, AVI
Audio Formats   ACC/ACC+, WAV, WMA, MP3
Picture Formats JPG, PNG, GIF, BMP, JPEG
Connectors  HDMI Input, SD Card Slot, USB, VGA Port, 3-in-1 AV In
Speakers Included   Yes, built-in




[SOLVED] attempt to read or write outside of disk 'hd0'

2017-05-07 Thread Joe
On Sat, 6 May 2017 18:12:03 +0200
Pascal Hambourg  wrote:

> Le 06/05/2017 à 17:19, Joe a écrit :
> >>
> >> However, the ls command I suggested may still be useful to check
> >> GRUB's idea of the sizes.  
> >
> > ls (hd0)
> > (hd0): Filesystem is unknown.
> >
> > ls (hd0,1)
> > (hd0,1): Filesystem is ext2. (after several seconds' pause)
> >
> > I'm only getting the grub rescue> prompt, not the grub> prompt.  
> 
> I expected that "ls" would be the same in normal and rescue mode. 
> Obviously I was wrong.
> 
> If the GRUB version on the internal disk is the same, you could boot 
> from it while the USB disk is connected, and if the BIOS exposes the
> USB disk even though it is not the boot disk (some BIOS do, others
> don't), then you could use the "ls" command in normal mode to check
> (hd1) and (hd1,1).

Yes, that works, and looks right:

Device hd1: No known filesystem detected - Sector size 512B - Total
size 117220824KiB

Partition hd1,1: Filesystem type ext* - (Last mod time, UUID)
Partition start at 1024KiB - Total size 10485760KiB

However, (still from the host machine's grub):

grub> ls (hd1,1)/boot
< reasonable listing >

grub> set root=(hd1,1)
grub> linux /boot/vmlinuz-4.6.0-1-686-pae root=/dev/sdb1

gets me: error: "attempt to read or write outside of disk 'hd1'."

This result, from a second grub installation, suggests a broken
filesystem, despite the partition being readable and writeable when
mounted. I had already copied /etc from the drive, and was trying one
last attempt to boot to obtain a dpkg --get-selections in preparation
for a reinstall. A quick fsck /dev/sdb1 from the host confirms this,
and a couple of fsck -y runs (what have I to lose?) gets it booting
again. So --get-selections saved, and now to try the long-delayed
dist-upgrade...
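For reference, the repair-and-save sequence described above, roughly (device
name as in the thread; fsck must only be run on an unmounted filesystem):

```shell
#!/bin/sh
# First pass reports errors; -y answers "yes" to every repair prompt.
fsck /dev/sdb1
fsck -y /dev/sdb1
# After booting the repaired system, save the package list for reinstall:
dpkg --get-selections > selections.txt
```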

Thanks again.

-- 
Joe



Re: Live File System Backup

2017-05-07 Thread tomas
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On Sun, May 07, 2017 at 08:42:36AM -0400, rhkra...@gmail.com wrote:
> On Sunday, May 07, 2017 05:47:20 AM to...@tuxteam.de wrote:
> 
> 
> 
> > Otherwise, "on line" backup is simply not an option.

[...]

> I guess the way that comes to mind, maybe somewhat implied by what you've 
> written (or at least inferred by me ;-) would be to take down one or more of 
> the servers at a time, make static backups of those, then restore those to 
> service and repeat until all have been backed up, very likely repeating that 
> continuously.

"Taking down" as in "stopping all relevant applications" (shutting down
the whole thing would be a superset of that, of course ;)

[...]

> Oh, ok, I guess (well, I know RAID isn't really for backup, it's for uptime 
> as 
> is often stated on this and other lists), but something like RAID where 
> relevant data is written to more than one disk and / or storage "farm", one 
> being the "in service" device (to serve data to the app in "real time") and 
> one or more just recording data as backup.

This is a variant on the snapshot pattern. You take a snapshot of the
file system and back up that. You are guaranteed that the *file system*
backup is consistent (well, modulo bugs and glitches), but since the
applications don't have a clue what's going on, application state
might still be inconsistent, unless they go out of their way to keep
a consistent state on-disk all of the time (in the above scenario
of shutdown, we assumed that the application leaves a consistent
state after being shut down, which is a reasonable assumption, I'd
say).

That said, and as you can see in the PostgreSQL example, with a little
help of your application, the snapshot thing works (in the case of
PostgreSQL things are much more relaxed and you can usually go with
plain rsync, for example, without resorting to snapshot).
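For PostgreSQL specifically, the application-aware route can be as simple as
its own online-backup tool (host, user, and target directory below are
hypothetical):

```shell
# pg_basebackup takes a consistent base backup of a running cluster,
# coordinating with the WAL, so no file-system "freeze" is needed:
# -Ft = tar format, -z = gzip, -P = show progress.
pg_basebackup -h db.example.org -U backup -D /backup/pg -Ft -z -P
```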

All in all I'm pretty happy with a plain, straight rsync-style backup
for my workstation. The probability of a busted backup is so low that
I fear more for my backup medium going sour.

On a high-churn machine with very valuable data, things might look
different, though...

regards
- -- tomás
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.12 (GNU/Linux)

iEYEARECAAYFAlkPNaIACgkQBcgs9XrR2kYGWgCfcR14ZU9wA9kKDampxUY7MCHb
IjUAnj0zjx2U5yFy7TIIO/rP5UO7E6QI
=bD6R
-END PGP SIGNATURE-



Download Manager

2017-05-07 Thread Ashok Inder
For the past 7 years, Linux has not had a single decent download manager.
It's entirely possible that none of you have ever faced this issue, but
it's a somewhat major issue for me...

First, with wget: I'm not at all comfortable with the CLI, and on top of
that, even with effort, when I try to use wget the downloaded file most of
the time ends up corrupt. E.g., I remember back in 2010 downloading a
single 5-6 MB file (an mp3, to be precise): wget ended up downloading a
total of 11 MB and, obviously, a corrupt file. Or yesterday, downloading a
90 MB office file, wget downloaded a total of 105 MB, again a corrupt
file. Downloading complex things from the internet via wget is not my cup
of tea; it is so difficult at my level of technology understanding.

I had this issue on Ubuntu, Mint, openSUSE, Fedora and now on Debian as
well. I have used mobile internet, LAN and WiFi, and the issue remained
the same.

I even wonder why the hell I face this problem, given that most of the
web services I use rely on wget somewhere in the background. Even for
simple browsing on a Windows system, the browser's first request goes via
wget in the backend. The world runs on wget; my Android runs wget
services for all downloading and uploading in the backend, and in fact
I'm confident that Play Store apps are updated via a wget service
only... so why do I face such a weird wget issue when wget is so
successful elsewhere and for everyone?

Most GUI download managers (which I call wget frontends), while easing
the hassle of downloading from the CLI, end up with the same set of
problems as wget: the majority of the time, corrupt files get
downloaded. And it's not just corrupt downloads; I don't have much
control over the file and the bandwidth I'm downloading with.

I ended up using Free Download Manager on Windows to download most of my
files in the end.

Torrents are not an issue; they work even better on Linux than on
Windows, with better security (encryption and blocking of bad IP ranges,
which is not that easy to do on Windows).

I'm a heavy downloader. I download a lot of files of a miscellaneous
category and nature.

Browser download managers are bad: their resume option is the worst
thing, where I end up re-downloading the whole file; moreover, they block
my ability to control bandwidth, and for many other reasons I totally
dislike browser (add-on) download managers.

This is one area where I have totally failed with Linux.


A point to make: in earlier days, the internet in India used to be quite
an erratically behaved network; the speed was like half an AC wave cycle,
rising from 0 to X and then falling back down to 0. Windows and its
download managers used to cope easily with such an erratic internet, but
on Linux they did not. Today India's internet service is quite well
managed and the erratic behaviour is at an acceptable level, but the wget
issue for me remains the same.

Regards,
Ashok Kumar





Re: Download Manager

2017-05-07 Thread Thomas Schmitt
Hi,

Ashok Inder wrote:
> First with WGET, I'm not at all comfortable with cli, the other being
> that even with effort when I try to use wget, the file most of the time
> downloaded ends up a corrupt file.

wget works for me where a web browser would work too.
I quite often download ISOs with a few hundred MB or a few GB of size.
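One way to tell a corrupt download from a good one is to verify the published
checksum; Debian ISO mirrors ship SHA256SUMS files for exactly this (the URLs
and file names below are hypothetical):

```shell
# Download the image and its checksum file, then verify the image:
wget https://example.org/image.iso
wget https://example.org/SHA256SUMS
sha256sum -c --ignore-missing SHA256SUMS   # prints "image.iso: OK" on success
```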


> I even wonder why the hell I even face this problem.

Maybe too much of negative thoughts towards a shell command line ?


> I don't have much control over the file and
> bandwidth that I'm downloading.

Did you try options --output-document and --limit-rate ?
(See shell command "man wget", use "/bandwidth" to search for the
 word "bandwidth". The second hit is in the explanation of --limit-rate.)
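For instance (the URL, file name, and rate cap are made up for illustration):

```shell
# Cap the download rate, choose the output file name, and resume a
# partially downloaded file if one exists:
wget --limit-rate=200k --output-document=big.iso --continue \
     https://example.org/big.iso
```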


Have a nice day :)

Thomas



RE: Download Manager

2017-05-07 Thread Ashok Kumar
Hi,

> wget works for me where a web browser would work too.
As I stated that this issue may not be faced by many.

> Maybe too much of negative thoughts towards a shell command line ?
Not negative thoughts about the shell; I'm just not comfortable with it.
I cannot memorise/recall so many commands/scripts all the time; I even
lose track of all my aliases.

> Did you try options --output-document and --limit-rate ?
>(See shell command "man wget", use "/bandwidth" to search for the 
>word "bandwidth". The second hit is in the explanation of --limit-rate.)
This was not about wget in the shell, where I agree that control to my
heart's content is available; it was about the GUI frontends of wget, i.e.
the download managers on Linux. The control over the file being downloaded
is somewhat limited compared to wget itself.

Regards,
Ashok Kumar

-Original Message-
From: Thomas Schmitt [mailto:scdbac...@gmx.net]
Sent: 07 May 2017 11:59 PM
To: debian-user@lists.debian.org
Subject: Re: Download Manager

[...]


Re: Download Manager

2017-05-07 Thread Brian
On Sun 07 May 2017 at 23:37:53 +0530, Ashok Inder wrote:

> For past 7 years, Linux does not have a single descent Download Manager.
> Its totally possible that none of you may have faced this issue ever but
> its a somewhat major issue for me...

It is indeed totally possible not to have experienced the problems you
have met. Issuing challenges will only put you on a hiding to nothing.
> First with WGET, I'm not at all comfortable with cli, the other being
> that even with effort when I try to use wget, the file most of the time
> downloaded ends up a corrupt file. For eg: I remember back in 2010,
> downloading a 5-6mb of a single file (mp3 to be precise), the wget ended
> up downloading packet of total size with 11mb and obviously corrupt file
> or yesterday downloading a office file of 90 mb were wget downloaded a
> total of 105mb file and again a corrupt file. Downloading complex things
> from internet via wget is not even my cup of tea. It so much difficult
> at my level of technology understanding.

2010, eh? It was a bad year for getting uncorrupted files.
Something to do with solar flares.
> 
> I had this issue in Ubuntu, Mint, Opensuse, Fedora and now on Debian as
> well. I had used from mobile internet,  LAN to WiFi and issue remained
> the same.

A plethora of problems after trying so hard.

> I even wonder why the hell I even face this problem. Given that most of
> the web services I use in the background use wget somewhere. Even a
> simple browsing on Windows system, the browser's first request is via
> wget in the backend. The world runs on wget, my android runs wget
> services for all downloading and uploading in the backend and infact I'm
> confident that play store apps are updated via wget service only...then
> why I face such a weird wget issue which is so successful elsewhere and
> for all.

That is the essential question. An example from you, rather than
generalisations, would help. Just one; something that could be tested.
 
> Most GUI download manager (which I call WGET frontend software's) while
> eases my download hassle from cli, ends up with same set of problem as
> wget. Majority of the time corrupt file are getting downloaded.  And its
> not just corrupt download, I don't have much control over the file and
> bandwidth that I'm downloading.

It's a hard life.

> I ended up using Free Download Manager on windows to download most of my
> file in the end.

Stick to it.

> Torrent is not an issue, it works even better than windows in Linux with
> better security (encrypted and blocking bad ip range which is not that
> easy to do in windows).
> 
> I'm a heavy downloader. I download a lot of file of miscellaneous
> category and nature.

Wink, wink.

> Browser download managers are bad, there resume option is the worst
> thing were I have to end-up re-downloading the file and moreover it
> blocks my ability to control bandwidth and for many other reason, I
> totally dislike browser (with addon) download manager.
> 
> This is one area where I have totally failed with Linux.

So it seems. You are alone.

> A point to make, in previous days, Internet in India used to be quite
> erratic behaved network, the speed was like a half AC wave cycle, rising
> from 0 to X speed and then falling down and going back to 0. Windows and
> its download manager used to easily work with such erratic internet, but
> in linux it was not. Today its quite well managed internet service in
> India the erratic behavior is under acceptable level but still wget
> issue for me remains the same.

Thank you for the history.



Re: Download Manager

2017-05-07 Thread Thomas Schmitt
Hi,

Ashok Kumar wrote:
> This is not with respect to wget in shell, where I agree that multiple
> control to heart content is available but was in reference to GUI frontend
> of wget

Then consider creating one or more shell scripts for your various
use cases and triggering them via desktop icons.
If that is not enough, consider using some GUI builder kit to create
a graphical frontend to wget (or to your scripts) that fulfils your wishes.

In any case, problems with the correctness of wget's downloads should be
investigated independently of any frontend programs or scripts.
In particular, it should be tested whether other download facilities
on the same system yield better results.


Have a nice day :)

Thomas



Re: Download Manager

2017-05-07 Thread Charlie Kravetz
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA256

On Sun, 7 May 2017 23:37:53 +0530
Ashok Inder  wrote:

>For past 7 years, Linux does not have a single descent Download Manager.
>Its totally possible that none of you may have faced this issue ever but
>its a somewhat major issue for me...
>
>[...]

I find Download Manager extension in Firefox works very well for me.

- -- 
Charlie Kravetz
Linux Registered User Number 425914
[http://linuxcounter.net/user/425914.html]
Never let anyone steal your DREAM.   [http://keepingdreams.com]
-BEGIN PGP SIGNATURE-

iQEzBAEBCAAdFiEEG5QK93YKrQMH22ZTiq6LjqbJ0IAFAlkPaOEACgkQiq6LjqbJ
0IBhmwf9EiI57EcQWwh38Hd/oKk9FbN6yRLYfC3deBeVi5EQWHSrek4vNKFP0KyO
5d/CTk3bs915UmkYueniTQlmISVkWdgzGjyZrp6hTOxxtZWkQSHCuwd8yYXiGwiN
fxb6GfBnE8PiAREusjrAeULWf1N/mdb7iPlZ6xuFsYzAs3EVI3K+BpyDzMLWFukM
mJGsVmjFs2efWUn3tLBRDQYCyj2tY/VHPrsb5O71J1XoxEViXnl1x74ehsFsh+Pa
bR5MIup/kMkjcB/FrcECYLYDwpKGiXb6aODjyH4M2Rn5tV9xqTc58KBLSUAe00CT
iLY0l3/g91NjuQsvvFWgWiEKhW5o8A==
=Hqr0
-END PGP SIGNATURE-


Re: debian-installer preseeding over https

2017-05-07 Thread Yvan Masson
Le 06/05/2017 à 01:03, Mario Abajo a écrit :
> Hello,
> Playing with unattended deployments of debian using foreman
> (https://theforeman.org/) i found out that debian-installer doesn't
> support loading the preseeding file from a https server. It do it well
> from a http url but using ssl never works. I have found an old question
> in stackoverflow about this
> (https://serverfault.com/questions/320019/how-to-use-debug-debian-preseed-with-ssl-using-startssl-certs)
> explaining that the problem comes from the wget in busybox not compiled
> with SSL support, it's old, but it's still true with the actual stable
> and testing releases. I would like to know how to fill a bug (wishlist)
> for this, also, i would like to hear some opinions about it; other
> distros have this support even with the fact that it's not perfect
> (because you trust all certificates, and that's not good) but at least
> you avoid simple sniffers for tacking your installation data (and hash
> passwords).
> 
> Thanks in advance,
>   Mario Abajo

Hi Mario,

It seems there is an open bug report already:
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=698528

Preseeding over HTTPS would be a very interesting feature, but if you do
just a minimal installation and then use The Foreman for everything else
(I have never used it), avoiding sniffers does not seem crucial to me.
Just use The Foreman to:
- check that the important installation steps were properly done (correct
partitioning, only required packages installed, correct sources.list,
correct time zone…)
- change the password
- configure your machine

But unfortunately you are right that running over HTTP can be a problem:
if an attacker is able to modify the preseed.cfg, they could run any
command (see the bottom of the example preseed file). Checking the
installation log might not even be sufficient…
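For context, network preseeding is normally enabled with kernel boot
parameters like the following (the URL is hypothetical); with the stock
debian-installer only http:// works here, because busybox's wget lacks TLS
support:

```
auto=true priority=critical url=http://example.org/d-i/stretch/preseed.cfg
```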

Best regards,
Yvan





Re: How stable is the frozen stretch?

2017-05-07 Thread RavenLX

On 05/07/2017 04:33 PM, cbannis...@kinect.co.nz wrote:

> By the way, the words "unstable" "stable" as used in the distribution names
> don't mean likely to crash, --- it refers to the amount of changes
> occurring, i.e. 'stable' has no new packages entering it, and supposedly only
> security updates, whereas "unstable" is unstable because there are many
> changes occurring on a constant basis.


Thank you for this info. I admit I always thought "unstable" meant it 
might still have bugs or still be in beta. I don't mind when things 
change frequently because sometimes this is how one can get new features 
in a newer version of a program.





Re: How stable is the frozen stretch?

2017-05-07 Thread Cindy-Sue Causey
On 5/6/17, Michael Milliman  wrote:
>
> On 05/06/2017 04:55 PM, RavenLX wrote:
>> I am thinking about trying out Stretch (Debian 9) in either a spare
>> laptop or a virtual machine. If I like it I might just point my sources
>> list to that repo on both laptops if it's stable enough.
>>
> I can't speak categorically, but I installed Stretch a couple of months
> ago on my older laptop.  It has been running without a hitch 24/7 since
> then.


That's me, too. I can't remember exactly when, but I mentioned
something about it on here regarding Chromium several months ago. Sid
Unstable and I are on a break. It was just too much to keep up
with the FABULOUSLY active updates that are occurring.

So I skipped over Stretch and went with Jessie. Jessie lasted a grand
total of maybe about 3 days, I think it was. Websites kept complaining
that my Chromium was out of date. Unfortunately, my Chromium was as
current as the repos had at that moment.

Tinkering to stay on Jessie was not a cognitively friendly option so I
stepped over to Stretch. If there has even been a tiny burp of a
problem, it was so small or was fixed so quickly that I don't remember
it.

#ThankYou, Developers! I'm about to do something extremely #Life
changing in a few minutes. I *literally* could not do it without all
the well-functioning Debian packages I'm about to spend the entire
rest of the evening buried in...

Cindy :)

-- 
Cindy-Sue Causey
Talking Rock, Pickens County, Georgia, USA

* aumix, mtp, inkscape, gimp, openshot, thunar, xine, notes > YOU'RE ON DECK! *



Re: How stable is the frozen stretch?

2017-05-07 Thread Michael Milliman


On 05/07/2017 04:19 PM, RavenLX wrote:
> On 05/07/2017 04:33 PM, cbannis...@kinect.co.nz wrote:
>> By the way, the words "unstable" "stable" as used in the distribution
>> names
>> don't mean likely to crash, --- it refers to the amount of changes
>> occurring, i.e. 'stable' has no new packages entering it, and
>> supposedly only
>> security updates, whereas "unstable" is unstable because there are many
>> changes occurring on a constant basis.
> 
> Thank you for this info. I admit I always thought "unstable" meant it
> might still have bugs or still be in beta. I don't mind when things
> change frequently because sometimes this is how one can get new features
> in a newer version of a program.
> 
Yeah, this is one of the main things cited as a drawback of the Debian
distribution... packages are sometimes a little older than in other
distributions.  But this is because the Debian developers spend so much
time making sure that packages work properly in the distribution before
they are released into the repositories.  As a result, things change a
lot less frequently.  The benefit of this is that Debian is 'stable' in
all senses of the word... few serious bugs, little system instability,
and little or no churn in what is part of the distribution.  For many
people, especially businesses, this stability is important.  Others,
like myself, can afford a little more instability, and so can deal with
the churn in Testing for the benefit of getting newer versions of the
packages, and run Testing (Stretch).  Many people also run Unstable
(Sid) for the benefit of bleeding-edge versions of software, at the
cost of a lot of instability (in all senses of the word).
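For reference, the suites map directly onto /etc/apt/sources.list entries. A minimal sketch (the mirror URL is a placeholder, and you would normally pick just one suite rather than mixing them):

```
# /etc/apt/sources.list sketch -- choose ONE suite; mixing them is its own topic
deb http://deb.debian.org/debian stable   main   # current stable release
deb http://deb.debian.org/debian testing  main   # what Stretch is before release
deb http://deb.debian.org/debian unstable main   # Sid
```

Using the codename (e.g. "stretch") instead of "testing" keeps you on that release when it becomes stable, rather than rolling on to the next testing.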

-- 
73's,
WB5VQX -- The Very Quick X-ray



evince is missing support for djvu

2017-05-07 Thread Sergey Fukanchik
When I try to open a DjVu file with Evince, I get the following
message:

Unable to open document "file.djvu"
File type DjVu document (image/vnd.djvu+multipage) is not supported

libdjvulibre21 is installed on my Debian GNU/Linux 9.0

How can this be solved? Do I need to compile evince myself?
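In case it helps while waiting for an answer: a quick sketch for checking whether the installed Evince even advertises DjVu support, by inspecting its .desktop entry (the file's path varies between versions, so both common locations are tried):

```shell
#!/bin/sh
# Sketch: does the installed Evince advertise a DjVu MIME type?
# Both common .desktop locations are tried; adjust paths as needed.
for f in /usr/share/applications/org.gnome.Evince.desktop \
         /usr/share/applications/evince.desktop; do
    [ -r "$f" ] && grep -o 'vnd\.djvu[^;]*' "$f" && exit 0
done
echo "no DjVu MIME type advertised (or Evince not installed)"
```

If DjVu is not listed, installing a dedicated viewer such as djview4 may be simpler than rebuilding Evince.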

-- 
Sergey


Re: Download Manager

2017-05-07 Thread David Wright
On Mon 08 May 2017 at 00:12:18 (+0530), Ashok Kumar wrote:

> > Maybe too much of negative thoughts towards a shell command line ?
> Not negative thoughts about the shell, I'm just not comfortable with
> it. It's that I cannot memorise/recall so many commands/scripts all
> the time; I even lose track of all the aliases.

You shouldn't need to memorise them. From your rant, you're obviously
going to remember "wget", so you just start each alias with "wget-"
and the rest of the name explains that alias's features, or which
site it's for, or whatever sort of memory jog you need:
 wget-whichever-sites-it-is-for
 wget-whichever-features-this-one-has
 wget-whatever-else-bla-bla
etc. I assume you're familiar with command completion, so typing
$ wget-<Tab><Tab> at the prompt will list all the various aliases (or
functions) that you've written.
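As a concrete sketch of that naming scheme, a ~/.bashrc fragment might look like the following; the option combinations are illustrative examples, not recommendations (see wget(1) for what each flag does):

```shell
# ~/.bashrc fragment -- the "wget-" prefix makes Tab completion list them all.
alias wget-resume='wget --continue --tries=5'
alias wget-polite-mirror='wget --mirror --convert-links --wait=2'
alias wget-rate-limited='wget --limit-rate=500k'
```

Shell functions work the same way when you need to rearrange arguments, and both show up under Tab completion.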

Cheers,
David.



Re: Download Manager

2017-05-07 Thread Patrick Bartek
On Sun, 7 May 2017 23:37:53 +0530 Ashok Inder
 wrote:

> For the past 7 years, Linux has not had a single decent download
> manager. It's totally possible that none of you may have faced this
> issue ever, but it's a somewhat major issue for me...
> 
> First with WGET, I'm not at all comfortable with cli, the other being
> [big rant snipped]

I guess it depends on what you mean by "decent."  Take a look at
uget.  It's GUI-based and besides its other features, has Mozilla
Firefox integration.  Or load synaptic and do some searches.  No
commandline knowledge needed.

B  



Re: Download Manager

2017-05-07 Thread Cindy-Sue Causey
On 5/7/17, David Wright  wrote:
> On Mon 08 May 2017 at 00:12:18 (+0530), Ashok Kumar wrote:
>
>> > Maybe too much of negative thoughts towards a shell command line ?
>> Not negative thoughts about the shell, I'm just not comfortable with
>> it. It's that I cannot memorise/recall so many commands/scripts all
>> the time; I even lose track of all the aliases.
>
> You shouldn't need to memorise them. From your rant, you're obviously
> going to remember "wget", so you just start each alias with "wget-"
> and the rest of the name explains that alias's features, or which
> site it's for, or whatever sort of memory jog you need:
>  wget-whichever-sites-it-is-for
>  wget-whichever-features-this-one-has
>  wget-whatever-else-bla-bla
> etc. I assume you're familiar with command completion, so typing
> $ wget-<Tab><Tab> at the prompt will list all the various aliases (or
> functions) that you've written.


I think I've posted before that I arrow up and down in my terminal,
too. It's not 100% reliable, because the history kept in memory depends
on the order in which you shut down multiple terminal tabs AND windows.
The recent history presented is also specific to each logged-in user.

A while back I decided I also needed something beyond a text editor
file to keep snippets I simply then copied and pasted into a terminal
window and such(ly). I specifically went looking for "sticky notes".

Several different packages come up for that kind of search. I tried a
few then found xfce4-notes (xfce4-popup-notes) possibly already
installed in my xfce4 desktop environment. I use those A LOT on a
daily basis... :)

There are "clipit" type packages out there, too, that save recently
copied snippets from our "clipboards". I "use" it occasionally, but
the sticky notes dealy fits more in line with what I need...

Cindy :)

-- 
Cindy-Sue Causey
Talking Rock, Pickens County, Georgia, USA

* I comment, therefore I am.. (procrastinating elsewhere). *