Backtracing, Invalidated Bugs and Quality

2008-08-20 Thread Null Ack
Evening Devs,

Tonight I was running part of my test suite when tracker-preferences
crashed unexpectedly during a routine workflow, viewing (not changing)
preferences. Apport came through and I ended up at an existing bug
from 2007, marked invalid because the user had not submitted debugging
symbols. This has happened to me before, and my mind has been busy
since, thinking about how this detracts from quality and what to do
about it. These are real bugs, some of them in production, that are
not being fixed.

I'm not convinced that the strategy of asking users to install
specialised debugging packages is the right way to go. I see a very
low hit rate with this working in practice. I have professional
experience in managing testing projects and consulting in related
fields, so with Ubuntu being close to my heart I often think about how
we approach testing and what processes could be improved. Can I please
offer some thoughts:

1. The Debug By Default Build. This would be where the entire
operating system is built using debug packages. This could be at a
targeted point of the lifecycle, such as during Alpha, where Apport
would deliver all debug symbols by default. We could still distribute
a non-debug build for users who must have that type of build, but it
could be hidden away so that the most common type is the debug build.
We engage some community evangelists who promote its importance so it
gets readily brought into practice.

2. The Hybrid Debug Build. Similar, but for technical reasons only
some packages are debug builds.

3. Extending Investment at the Canonical Test Lab. There are sound and
proven arguments I could help present that demonstrate how the cost to
fix a defect escalates the further it progresses through the
lifecycle, in monetary terms as well as in costs to things like image,
future sales and so forth. A business case could be built that looks
at extending whatever Canonical Test Lab exists now with the mission
of capturing the higher-priority backtracing bugs and replicating
them in house under controlled conditions. My consulting career has
been based in Australia and I only have knowledge of the rates for
various testing roles in my country. I do, though, understand that
some other countries have far lower rates. I'm not suggesting
exploitation of cheap labour, but I am suggesting that labour costs
could be reduced considerably by choosing a location for the lab at
fair market prices. It might additionally be possible to set up a
large-scale test lab using hardware donated to Canonical by partners.
A more aggressive, multi-phase strategy could also be planned for the
future, such as another team building the automated test harness up to
the point where most function points are tested all night, every
night, in every build before it gets posted to the daily ISOs. The
results could be data-mined by automatic processes that then engage
Ubuntu developers or upstream projects automatically.

4. Extending The Ubuntu Entry Criteria. At least from my perspective
(which may be insular), the Ubuntu release methodology's eligibility
criteria for new code entering existing packages amount to something
like "has Debian accepted it, and does it compile?". I fully
understand that upstream projects lack the person power for runtime
testing and need their code included in pre-release distros to be
tested. One thing that has gotten results for me in projects I've
managed is not just focusing on runtime-level tests. Static testing
tools really can be useful and can be quite specific. It's possible to
set arbitrary benchmarks for release entry criteria as a minimum
standard. You can set levels of compliance, such as mandatory, where
certain code problems are specifically banned, and others with an
allowed number of warnings and so on (a sketch of such a gate follows
below). I realise this would need careful implementation, but I think
chipping away at it piece by piece could realistically, over time,
become an accepted part of what upstream projects do in a standard way
to demonstrate their new code changes are ready for distros to look at.
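
To make the compliance-levels idea concrete, here is a minimal sketch
of what such a gate might look like, assuming cppcheck as the
analyser; the banned finding IDs, the warning budget and the script
itself are illustrative only, not an existing Ubuntu process:

#!/usr/bin/env python3
# Hypothetical entry-criteria gate: run a static analyser over a source
# tree and enforce two compliance levels -- "mandatory" findings are
# banned outright, ordinary warnings get a budget. Assumes cppcheck is
# installed; IDs and thresholds are illustrative, not an Ubuntu standard.
import subprocess
import sys

MANDATORY = ("nullPointer", "doubleFree", "bufferAccessOutOfBounds")
WARNING_BUDGET = 25

def run_cppcheck(source_dir):
    # --template prints one finding per line as "id:severity";
    # cppcheck writes its findings to stderr.
    result = subprocess.run(
        ["cppcheck", "--enable=warning", "--template={id}:{severity}",
         "--quiet", source_dir],
        capture_output=True, text=True)
    return [line.split(":", 1)[0]
            for line in result.stderr.splitlines() if ":" in line]

def gate(source_dir):
    findings = run_cppcheck(source_dir)
    banned = [f for f in findings if f in MANDATORY]
    if banned:
        sys.exit("REJECT: mandatory-level findings: " + ", ".join(banned))
    if len(findings) > WARNING_BUDGET:
        sys.exit("REJECT: %d warnings exceed the budget of %d"
                 % (len(findings), WARNING_BUDGET))
    print("PASS: %d warnings, within budget" % len(findings))

if __name__ == "__main__":
    gate(sys.argv[1] if len(sys.argv) > 1 else ".")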

These are just some ideas I had. Anyway, sincere thanks for Ubuntu
and all the work that goes into it.

Regards

Nullack



Re: Backtracing, Invalidated Bugs and Quality

2008-08-20 Thread Null Ack
2008/8/21 Markus Hitter <[EMAIL PROTECTED]>:
>
> On 20.08.2008 at 11:42, Null Ack wrote:
>
>> I'm not convinced that the strategy of asking users to install
>> specialised debugging packages is the right way to go. I see a very
>> low hit rate with this working in practice.
>
> How about getting this even more automated? Apport would have three buttons:
>
>  [ Abort ]   [ Submit Report only ]  [ Allow getting bug fixed ]
>
> The third button would not only send the bug report, but replace (apt-get)
> the standard package with a symbol-equipped equivalent as well. Having a
> debug version of a package among standard packages hurts only negligibly and
> most users won't even notice.
>
> Voilà, next crash time Apport will come along with a backtrace.
>
Markus, I particularly like your suggestion here. If there are certain
types of bugs that cannot be fixed without backtraces with debugging
symbols, we must come up with easy tools on the desktop that create
those conditions.
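
As a minimal sketch of what that third button might trigger, something
along these lines could reinstall the crashing package with symbols
and arm the system for the next crash. The -dbgsym naming matches the
ddebs archive convention, but the helper itself is hypothetical, not
real Apport code:

#!/usr/bin/env python3
# Hypothetical sketch: after the user clicks "Allow getting bug fixed",
# install the -dbgsym companion of the crashed package so the *next*
# crash produces a symbol-rich backtrace. Assumes the ddebs repository
# is enabled in APT; this is not actual Apport code.
import subprocess

def install_debug_variant(package):
    """Install the debug-symbol companion of `package` via apt-get."""
    subprocess.check_call(["apt-get", "install", "--yes",
                           package + "-dbgsym"])

def on_allow_getting_bug_fixed(crash_package, send_report):
    send_report()                          # file the symbol-less report now
    install_debug_variant(crash_package)   # next crash will carry symbols

if __name__ == "__main__":
    on_allow_getting_bug_fixed("tracker",
                               send_report=lambda: print("report sent"))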


>
>> 1. The Debug By Default Build.
>
> Good idea, but the distro won't fit on the CD any longer. Don't know if this
> is an issue for developers.
>

Personally I don't care about the size; I'd just burn a DVD.


@Bryce - I don't think it matters what processes other projects use.
To my way of thinking it is about process improvement and having
processes that are all geared to delivering the outcomes. Outcomes
that show Ubuntu to have rock-solid stability, to be easy to use, to
have a quality user experience and so on.

@Emmet - I think it's unhealthy to treat the difficulty or time a bug
takes the developer to fix as the criterion for what gets looked at. A
quality user experience should be the primary factor, and any
developer in my book who's committed to Ubuntu quality would be
tenacious about chasing it.

Back to Markus:

>
>> 4. Extending The Ubuntu Entry Criteria.
>
> This would hobble invention of new packages immediately. As seen with the
> recent Empathy discussion, new packages don't go straight from the
> developer's alpha release into the distribution CD anyways.
>

I'm not so sure it would hobble open source software projects. Can I
please explain more fully? I am talking about packages that the Ubuntu
architects have already allowed into the distro. In this case, for
example, we might be considering allowing a new revision of gedit into
the alpha repos. I'm not talking about new packages altogether.

Best practices on commercial projects that I've seen would involve
something along the lines of:

* Devs come up with the new code
* It is fully code reviewed by a human and made to meet certain benchmarks
* Static testing on the code occurs using static testing tools and is
made to meet certain benchmarks
* and so on

In the case of Ubuntu, with our example new version of gedit:

* Has any code review been done?
* Has any static testing tool looked at the code?

As to the implementation, as I said, it would have to be done
carefully. Can I summarise please:

The core basis of my argument for extending the entry criteria: the
earlier problems are fixed, the smaller the compounding multiplier
effect of time and money that goes into fixing them.

I'm suggesting a staggered implementation. There are many ways this
could be done, one might be:

1. The Ubuntu security team start a proactive security initiative
that uses a static test tool to identify memory-management problems
that are security problems. The security team contact the upstream
projects, saying something along the lines of "we're using this code
analysis tool and it suggests your code has security problems".

2. Case studies and outcomes are shared on the web. Promotion of the
benefits occurs over time and open source interest rises.

3. Ubuntu makes the leading step in showing its commitment to quality
by requiring that all upstream projects run the static security test
tool before code will be accepted into the repos. Tools are built to
make this pretty easy for upstream.

4. As time goes on, this becomes second nature. More people get
interested in it, and add-ons are written that expand what the static
test tool looks at and the rules regarding acceptance of new code into
existing repo packages.

Skip to Step 22: Imagine my ultimate vision, where every upstream
project is required to and does perform extensive static testing on
its code, and there are pages of standards about criteria for Ubuntu
entry. Imagine a teenager with a killer idea for a really cool app who
comes along to IRC and says "Oh, what the heck, why do I have to deal
with this crap?" And the cowboy developer is responded to by a
seasoned open source dev guru who replies "because it results in
better code, with better quality, with better user experiences,
without encumbering you with doing it all yourself".



Correct Process for Package Update Requests

2008-09-01 Thread Null Ack
Gday folks,

Can I please be clarified on what the correct process is for package
update requests?

On the 27th of June I asked the MOTU mailing list what it was and was
advised: "The correct way to do this is to file a bug against the
package and tag it "upgrade"." Since then I've done this for 6+ bugs.

Yesterday I filed two bugs complying with this; one of them was a MOTU
package, not a core dev one, and I was advised in the bug comments
that "there is no need to open new version update request". Also, when
I asked a dev who's interested in this functional area (video) on IRC
if he could confirm the bug, he felt it wasn't useful to file bugs for
upgrade requests.

I think we need general agreement on the process here.

Regards,

Nullack



Re: Bugs for NM 0.7

2008-09-05 Thread Null Ack
To add to this, we have some serious regressions: problems with not
being able to consistently apply static IPs as well as custom MTU
values:

https://bugs.launchpad.net/ubuntu/+source/network-manager/+bug/258743

https://bugs.launchpad.net/ubuntu/+source/network-manager/+bug/256054

http://bugzilla.gnome.org/show_bug.cgi?id=548114

I'm concerned that GNOME seems to be pushing through beta 1, beta 2
and onwards without resolving these bugs.



The Case For Re-Evaluating Our Release Approach To FFMPEG

2008-09-08 Thread Null Ack
Gday everyone,

It was suggested to me on IRC that I should discuss this matter on
this mail list.

Summary: I think we need regular snapshots of SVN ffmpeg, libavcodec
and so forth released both in the current development build and as
backports to production builds. Users expect video experiences at
least as good as on Windows and Mac, and this is necessary for
actually delivering that.

My argument :

To be honest, my original approach to meeting my video needs on Ubuntu
was to turf out the default apps and do my own custom compiles of
mplayer, mencoder and gnome-mplayer. This continues to work well and
frankly is still superior to what I can do under gstreamer and totem
(such as deinterlacing and other video filters). However, I felt
guilty about doing this because I was not supporting the Ubuntu
principle of having one standard method for doing things, and I was
restricting the value of the testing work I do on Ubuntu by not using
default applications in all circumstances. So some time ago I bit the
bullet and committed myself to using default apps, leaving mplayer for
any related tests.

I am thankful for Sebastien's updates to the gstreamer good and ugly
plugins recently, as well as the updates Intrepid has received with
Totem.

However, the ffmpeg gstreamer plugin is a key plugin for most users'
multimedia experiences. It provides to gstreamer:

* 256 elements
* 39 types

Of particular note among these many features is that some very common
video formats are handled through it, such as AVC / H.264 decoding.
AVC is one of the formats gaining much momentum, being widely used in
Blu-ray, HD DVD and some digital video broadcasters, and as an
efficient backup format for personal media. As a subscriber to the
ffmpeg commit mailing list, I know that in the past months there has
been substantial improvement to the code for AVC decoding and the
resolution of many related bugs.

AVC is just one of the many decoders ffmpeg handles that have had many
bug fixes in the past months.

Since gstreamer released a new ffmpeg plugin, I have been enthusiastic
to see it arrive in Ubuntu and have Intrepid enjoy the more reliable
video experience it would offer our users. I'm advised, though, that
what is needed is to upgrade ffmpeg and related libraries across the
board to deliver the new gstreamer plugin. Upgrading ffmpeg across the
board would also benefit more advanced Ubuntu users, who may, for
example, be conducting video transcoding via libavcodec. They won't
need to suffer known bugs in old ffmpeg builds.

I want to note how the FFMPEG project manages releases:

* They don't do them
* Their standard response to bug reports is to compile SVN and retest.

What seems to happen in practice is that FFMPEG in Ubuntu is rarely
updated - Intrepid's packages are currently seven months old, for an
upstream project that has numerous commits daily.

I feel bad for our users because I see bug reports on Launchpad that I
know are never going to go anywhere, because ffmpeg currently isn't
kept up to date and is not backported for their build.

Anyone with even a passing view of the situation has to agree this is
not ideal. I contend that the risk of having old binaries in the
repos, with all the poor user experiences that brings, outweighs the
risk that new code will bring new problems. My practical experience of
doing my own compiles of SVN head has consistently been that things
are fixed and enhanced. On one occasion I had a problem where the code
would not compile, and on another a bad commit occurred which affected
functionality, but that was fixed in half a day and I simply
recompiled. Upstream strive for the SVN build to be fully functional,
and in my experience that's met on nearly all occasions.

My skills are not in packaging, but I can certainly assist with
testing and helping construct a freeze exception rationale for
Intrepid. Please consider.



Re: The Case For Re-Evaluating Our Release Approach To FFMPEG

2008-09-12 Thread Null Ack
2008/9/10 Reinhard Tartler <[EMAIL PROTECTED]>:
> "Null Ack" <[EMAIL PROTECTED]> writes:
>
>> Summary: I think we need regular snapshots of SVN ffmpeg, libavcodec
>> and so forth released both in the current development build and as
>> backports to production builds. Users expect video experiences at
>> least as good as on Windows and Mac, and this is necessary for
>> actually delivering that.
>
> The main problem is lack of manpower. Every time ffmpeg is updated, we
> can more or less expect applications and libraries that use them to
> break.
>
> FWIW, the next upstream snapshot that I'm preparing for
> debian/experimental right now is going to drop nearly all
> patches. Packaging new snapshots should become pretty easy then.
>
Thanks for the responses guys.

Reinhard, I'm excited to hear about the progress with dropping many
patches and streamlining the process for syncing from SVN. I'm also
thankful for your interest in bug 263153, which I think is likely
fixed in the latest gstreamer ffmpeg plugin release.

I understand about person power, and I will commit to helping you with
testing new ffmpeg releases and related applications. I have a test
library that covers many different containers, compression types and
other features. I'm somewhat new to gstreamer, but I've got a pretty
solid understanding of digital media technologies and practices.



Re: Backtracing, Invalidated Bugs and Quality

2008-09-12 Thread Null Ack
Thanks for all the discussion on this folks. :)

Just now I had a crash in totem, with Apport leading me to 9
previously reported bugs that are either invalid or incomplete because
the bug reporter did not do a backtrace to help fix the problem. Now I
have the same issue that was originally reported in the first bug
report all the way back in May 2007, with no concrete progress since.

On top of this, people have said that it's a recurring discussion that
comes up every six months or so, so let's fix this, eh?

To recap, I've suggested that all Alpha builds could be debug-by-default builds.

Others, such as Markus, have what I frankly think is a better idea:
Apport tells the user the situation, downloads a debug version of the
package and waits for the crash to occur again. Then it sends the
backtrace to the right bug for analysis.

Krzysztof seemed to have a promising idea, similar apparently to what
MS do: "The information about debugging symbols is only needed on the
server; the client only sends (in the simplest version) the MD5 sum of
the library and the address offset, which is transformed into the
symbol by a symbol server."

Can we focus on a debate about what the best approach is? That in turn
can lead to the details of implementation.

Thanks

Nullack



Re: Backtracing, Invalidated Bugs and Quality

2008-09-13 Thread Null Ack
Gday everyone. As part of my work with the QA Team I want to
contribute to fixing the process gaps in this area. Can I summarise
what I see as the problem:

Problem situation: I'm increasingly noticing that certain types of
bugs are being marked invalid or incomplete with boilerplate-type
messages instructing the bug reporter to conduct a backtrace. The
engagement of the end user is poor, the user experience is
non-intuitive, the documentation walking a user through how to do this
is poor, and the net effect is that the "hit rate" of users actually
fulfilling the request is very low. The net result is usually that the
bug stagnates and duplicate bugs pile up. It may, or may not, then get
filed upstream if the number of duplicates gets high enough for
somebody to notice it. I have a high regard for all the Ubuntu
developers, and I say this carefully, but I think if we're all honest
about the situation, as some developers have been with me, there is an
element of "gee, I'm so busy and this bug report looks non-trivial...
I might just copy and paste my backtrace wording into this, move the
status to get it out of the way, and if it's a real problem eventually
a bunch of users will report it too, and then I might send it
upstream, because upstream have the time to look at it."

There was a discussion on IRC, which I've summarised here for the
list along with some proposed action items:

1. The Tool Chain For Debugging is Not Robust

The point was made that the debugging toolchain is complex and will
not consistently provide the needed debugging information on all
occasions. Sometimes the retracing will fail for some reason. I simply
see this as a longer-term challenge for the FOSS community: to work on
bugs in the toolchain, since obviously having a reliable and
repeatable method for getting right into the guts of the registers and
stack is important for fixing the more curly bugs. The toolchain being
imperfect, however, is not an excuse for failing to implement best
practices in Ubuntu for debugging in the meantime. We can make
progress.

2. The Volume Of Bugs Coming Through Makes The Hard Ones Too Hard

It was suggested that the number of bugs coming through is so high
that trying to fix the more tricky ones isn't worth the time, given
the available amounts of person power. I made the point, and I'd like
to highlight it again, that the complexity of fixing a bug should not
be the criterion for which bugs get developer attention. The best
practice for building quality in Ubuntu, in my view, is that the
determinants should be how seriously a bug affects the user experience
and how common that user experience is. When I've got stuck in my
testing work on Ubuntu I've appealed for help in the testing section
of the Ubuntu forum or on IRC, and I've been greatly encouraged by
good, helpful responses. I'm sure bug squadders and Ubuntu testers
would be happy to respond to developers with unit testing, feedback,
etc. I recently helped Alexander Sack with performance feedback on a
web browsing item and unit testing - it was fun!

3. The Need For Improving Apport

A developer suggested that there is no gap with Apport as it exists
now. I disagree, and I cited the example of a package compiled with
optimised compiler flags, where a debug package will need to be
installed to get a meaningful trace. I know from experience that
automated ways of doing this, or at least an easier and more intuitive
user workflow, work better than documentation. I really like the ideas
for improving Apport's functionality that Markus shared earlier.
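
To make the gap concrete, here is a minimal heuristic sketch
(hypothetical, not Apport code) of how a tool could detect that a
trace is useless without symbols and trigger the debug-package
workflow instead of bouncing the bug back to the reporter:

# Minimal heuristic sketch: decide whether a backtrace is useful enough
# to act on, or whether symbols are missing and a retrace is needed.
# The threshold and the sample trace are invented for illustration.
def trace_needs_symbols(backtrace_text, threshold=0.5):
    """Return True if too many frames resolved to '??' (no symbols)."""
    frames = [l for l in backtrace_text.splitlines() if l.startswith("#")]
    if not frames:
        return True
    unresolved = sum(1 for f in frames if "??" in f)
    return unresolved / len(frames) > threshold

sample = """#0  0x00002b3c in ?? () from /usr/lib/libtracker.so.0
#1  0x00002f10 in ?? () from /usr/lib/libtracker.so.0
#2  0x0804a1b2 in main ()"""

if trace_needs_symbols(sample):
    print("Trace mostly unresolved: offer debug packages, don't close the bug")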

Action Item 1: I'm not a developer, but I can help any developers with
testing and feedback for enhancements to Apport. I might also be able
to assist with design / blueprints / discussing possible features. Or,
someone could come up with compelling reasons why Apport is fine the
way it is and the workflow issues can be resolved another way.

Another thing that came up in the talks was that the backtrace
boilerplate copy-and-paste isn't always accurate in the circumstances
in which it's being used. Sometimes the real issue is being able to
replicate the problem, not the backtrace. Or a backtrace on a debug
build is truly needed, but the user doesn't know how to help in detail
and bug squadders can't replicate the problem at will on their
configurations. Or, since there is considerable obsolete info hanging
around, there is confusion among bug squadders about what exactly to
do and human error has occurred.

Action Item 2: A review of the documentation on both the user side and
the bug squadder / developer side to more fully explain and walk
people through the situation. I can help here too, but again I'm not a
developer, so especially the more technical aspects of the backtrace -
why it sometimes fails, how to do it manually - will need other
people's involvement. Basically, to improve the hit rate.

That seems to be what the IRC logs touched on, thanks.


OpenAL Regressions In Intrepid

2008-09-23 Thread Null Ack
Gday everyone,

The Linux Standard Base is surely a good thing. I don't know if OpenAL
is included in the LSB or not. What I do know is that someone decided
to change naming for OpenAL in Intrepid and this is causing many
regressions in other apps that now can't find OpenAL.

Can I please refer people to this bug:

https://bugs.launchpad.net/ubuntu/+source/openal-soft/+bug/273558

Some questions that come to mind are:

1. Why did we change the naming?
2. What is the best solution in the long term here for us?
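
As an aside, a quick way to triage this class of regression is to ask
the loader which library names it can actually find. A tiny sketch
(the two candidate names are illustrative of an old/new naming split,
not the exact sonames involved):

# Ask the dynamic loader which OpenAL library names it can resolve;
# handy when triaging "app can't find OpenAL" regressions. The two
# candidate names below are illustrative only.
import ctypes.util

for name in ("openal", "openal-soft"):
    print(name, "->", ctypes.util.find_library(name))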

Regards

Nullack



Re: Any news on skype+pulseaudio+intel_hda_realtek ?

2009-02-10 Thread Null Ack
So in essence, Scott, due to what you've highlighted as a lack of
testing input during the pre-production lifecycle phases, you're
suggesting that end users should endure the brunt of testing? That
Ubuntu needs to move forward rapidly, being cutting edge, and can't be
so highly concerned with the risk of regressions?

Yes, PulseAudio was implemented. Yes, it was a disaster, and even the
upstream lead developer more or less said so about Ubuntu's
implementation. It was half baked. So too is Compiz, with all its
incompatibilities with things like 3D OpenGL, which Ubuntu decided to
enable by default even though we all know that key architectural items
like GEM are missing. Lots of new users clambered onto the "look at my
cool wobbly windows" Linux bandwagon, then were disheartened when they
realised it didn't work properly and that there are many other visible
bugs in the Ubuntu desktop experience. A bug in NM that I reported way
back in the alpha still isn't fixed; that, for my user experience, is
a nuisance. Cruft Remover was poorly tested and entered production in
a problematic state. I could go on, but I won't.

If Ubuntu and Canonical are truly serious about quality, clearly the
professionals amongst us who sport big cowboy spurs and a good ol'
wild-west release philosophy need to be tamed. Otherwise, we might as
well all join Fedora. That's not the Ubuntu I want to be involved in.
I want to contribute towards a robust system that provides a quality
desktop user experience. I'd like to reinforce Andrew Morton's
comments when he observed that too many kernel developers focus on new
features without resolving existing problems.

We are far better off focusing on improving the testing phases than
dumping them on end users. We will only alienate new users and limit
the strategic growth of Ubuntu if we go all cowboy.

Regards

Nullack



Reasons Why Jaunty Will Not Ship With 2.6.29

2009-02-11 Thread Null Ack
Can I please be advised why Jaunty will not ship with 2.6.29 and why
the kernel team has elected to ship .28?

I'm sure the kernel team are aware of the many driver changes in .29,
but I'm not clear whether they propose to backport those into .28.
What about features, or any patches that for one reason or another
have not made it into .28 but really should have gone in as fixes? I
would appreciate being better informed of how it's proposed this will
be managed.

Regards
Nullack



RE: Reasons Why Jaunty Will Not Ship With 2.6.29

2009-02-11 Thread Null Ack
As do I, Scott, but I am careful to distinguish between features and
fixes. I'd like to know whether .29 fixes will be backported into
Ubuntu's .28 and how the whole thing will be managed.



Ubuntu Desktop Security Defaults

2009-03-15 Thread Null Ack
Gday folks :)

There is a difference between what I foresee as sensible security
defaults for our desktop build and what is currently being delivered.
It may very well be that there are aspects of the current setup that I
am not fully aware of, and if so I'd like to better understand the
reasoning behind the current situation. Otherwise, perhaps I could
suggest some possible enhancements:

* Enabling UFW or some other firewall by default
* Having AppArmor actually protect the desktop build, rather than
what currently seems a false illusion of coverage, with just CUPS
being protected

In my view, users want to feel secure in knowing that should a
zero-day exploit be identified, AppArmor or SELinux or whatever will
contain the damage the exploited service can do, beyond the standard
user-is-not-root UNIX setup.
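
As a quick way to see how thin the default coverage is, here is a
sketch that reads the kernel's AppArmor interface (the standard
securityfs path on Ubuntu; typically needs root). On a stock desktop
it often shows little beyond CUPS:

# Sketch: list which AppArmor profiles are actually loaded, and in
# which mode, by reading the standard securityfs interface on Ubuntu.
PROFILES = "/sys/kernel/security/apparmor/profiles"

try:
    with open(PROFILES) as f:
        lines = f.read().splitlines()
    print("%d profile(s) loaded:" % len(lines))
    for line in lines:               # format: "<profile name> (<mode>)"
        print(" ", line)
except OSError as e:
    print("Could not read %s: %s" % (PROFILES, e))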

Thanks and regards,
Nullack



Re: Ubuntu Desktop Security Defaults

2009-03-17 Thread Null Ack
Gday John,

Good to see another Aussie on the list and contributing some top info :)

I've looked into Plash and I think your suggestion is excellent.

I was thinking of a two-pronged approach:

1. AppArmor / SELinux or some other static, central policy to contain
daemons, as these services typically have fixed functions and can be
locked down in a static way. I note here that Microsoft did this
locking down for Vista services, where they went through all the
services and implemented a least-privilege model. We could exceed
Windows by doing least privilege while also protecting it through
mandatory access control policies.

2. A longer-term secondary phase of securing X. Again we find
ourselves behind Windows: for Vista, the system was made more
resilient against shatter attacks with a number of changes that make
them far more difficult. Depending on the specifics of how X is
secured, sandboxes like Plash could be considered too.

I do disagree with you on enabling a firewall by default. What you say
is well informed - yes, you can use injection attacks to bypass
firewalls. But a firewall is a basic level of protection that Windows
and OS X enable by default, and attacks have to be more sophisticated
to circumvent a firewall, using injection attacks for example.

Regards,

Nullack



Re: Ubuntu Desktop Security Defaults

2009-04-14 Thread Null Ack
Considering the noise happening in the blogosphere over a Linux
Magazine article about security problems with Ubuntu Server, I think
we should revisit this topic. The article is at:

http://www.linux-mag.com/id/7297/2/

The key criticisms of Ubuntu Server raised by Linux Magazine are:

1. Users' home dirs having open permissions by default (see the sketch
after this list)
2. The installer allowing a blank MySQL root password
3. System accounts being allowed unnecessary shell sessions
4. Nonsensical daemons listening on the network despite other
configurations servicing those needs
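
For the first criticism, a tiny audit sketch (the scope and permission
test are illustrative only):

# Flag home directories whose permissions grant any access to "other";
# a quick check for criticism 1 above. Illustrative only.
import os
import stat

for entry in os.scandir("/home"):
    if entry.is_dir():
        mode = stat.S_IMODE(entry.stat().st_mode)
        if mode & 0o007:
            print("%s is world-accessible (mode %o)" % (entry.path, mode))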

In our previous discussion of this topic here, I introduced some
personal concerns I have with Ubuntu desktop security:

1. No firewall enabled by default
2. AppArmor providing a false sense of safety for users in controlling
the damage zero-day exploits could potentially do. AppArmor only
protects one daemon, CUPS. By default it does very little.

The reality is that other desktop distros, such as Fedora, have a far
stronger set of security features than our beloved Ubuntu.

I think we need to make progress on these issues. I think John
previously made an excellent suggestion about using something like
Plash with hooks into GTK.



Re: Ubuntu Desktop Security Defaults

2009-04-14 Thread Null Ack
Thanks Mathias. I note that discussion is limited to the Server build,
whereas this discussion covers both desktop and server build topics.



Re: Ubuntu Desktop Security Defaults

2009-04-14 Thread Null Ack
> I guess I was hallucinating working on the apparmor profile for
> clamav-daemon and freshclam (also run as a daemon) today.
>

That's great, though Scott, please don't make the mistake of taking a
strawman approach. What I said was about AppArmor defaults. I don't
see my current dev build of the desktop having any profiles loaded by
default other than CUPS.

If the considered opinion is to continue with AppArmor then clearly
getting more profiles into it is the way to go.

However, if you look back through this discussion thread, I think John
made a very sound set of points about the limitations of AppArmor /
SELinux-type approaches for a desktop system and the weaknesses of X
security. He makes what seems to be a very sound suggestion about
Plash hooking into GTK, thus overcoming both the problem of needing to
determine in advance what a desktop user might do and the X security
problems.

Regards
Nullack



Re: shameful censoring of mono opposition

2009-06-07 Thread Null Ack
I only have a passing interest in this, but it's sufficient to ask:

* What actually is the rational, logical problem with Mono being in Ubuntu?

What I see on that Boycott Novell website is lots of fear-mongering
and little hard truth.

1. It's only the Windows-compatibility stuff, like Windows Forms, that
*could ever be considered a patent problem*. Mono isn't really
complete in this area anyway. .NET has open standards behind it; the
issue is the parts that relate to the Windows platform specifically.
2. MS wants to defeat Java, and hence wants cross-platform
interoperability. Isn't it good to have multiple cross-platform
technologies, Java competing against .NET and so on?

3. MS has not sued anyone for Mono-related stuff, and the suggestion
that they will, as if it's a foregone conclusion, is based on poorly
considered fears. How can MS expect to match Java if they trap the Mac
and *nix platforms with patent lawsuits over Windows-compatibility
layers?

4. Not everywhere in the world recognises software patents. The
foregone conclusion that fear-mongers are touting conveniently ignores
this fact.

5. I don't see Wine in court with MS, and on the basis of "ripping off
MS IP" it is undoubtedly more of an infringer than Mono could ever be
considered to be.

I'm fully prepared to listen to well constructed rational arguments to
the contrary.



Re: Pulse audio

2009-10-07 Thread Null Ack
Hi Lukas,

I don't think the ordinary user cares about PulseAudio or other
internal components of their desktop. They just want audio to work.

The problem that I see is not so much about internal components, but
is about failures in:

1. Not delivering reliable audio experiences in production releases of Ubuntu
2. A breakdown in the development process where my bug reports and
others' remain unresolved, and largely unanswered, except for bug spam
messages like "I have this too" and "I think this might be related to
bug XYZ".

Right now my Audigy 2 card crackles and pops unless I disable mixers
in alsamixer, whereupon I lose some sounds. I accept that in a dev
cycle these things will happen, and that's the reason I contribute my
time to testing, but from previous cycles' experience I find the audio
bugs go on without resolution. When I try to use the card in
applications, say warzone2100, I find the sound a garbled, inaudible
mess. Since bug reports don't seem to be effective, I've tried to get
discussion going on whether I could take the problems upstream or
whether they were Ubuntu-specific, but that was also left unanswered.

Other bugs in other internal components are actively resolved during
the dev cycle, so I think the issue is a lack of capability in the
audio space for Ubuntu.
