Yes, change the code of the Parser so that it does not use the old code.
As I pointed out, the Parser is a large and complex package, and I am not
the author, so this is easier said than done.
I'm dealing daily with code having the same characteristics.
It seems a bit odd to dismiss
code which worked OK in Moose 4.9, but not in Moose 5.0, as 'old code.'
I do not know what happened between 4.9 and 5.0, but if this is between
Pharo 3.0 and Pharo 4.0, that is one full year of effort (you know,
many people improving things and putting energy into the system). And be
ready, because we will remove the old Url class.
Now the good aspect is that you are lucky (we are all lucky) that Sven
is constantly improving Zinc.
But there is no world in which unmaintained packages keep working
indefinitely while the world around them changes.
It is code which has no functional equivalent in the new code, as Sven has
confirmed.
Then it means that you have to cooperate with Sven to get Zinc improved.
And feel lucky that you can.

The overall process may look frustrating, but a system that is not changing is a dying one.
See Lehman's laws of software evolution:

(1974) "Continuing Change": an E-type system must be continually adapted or it becomes progressively less satisfactory.
http://en.wikipedia.org/wiki/Lehman's_laws_of_software_evolution
Thanks again

Peter Kenny

-----Original Message-----
From: Pharo-users [mailto:pharo-users-boun...@lists.pharo.org] On Behalf Of
Sven Van Caekenberghe
Sent: 10 January 2015 16:36
To: Any question about pharo is welcome
Subject: Re: [Pharo-users] Problem due to deprecation of class Url in Pharo
3 and Moose 5.0


On 10 Jan 2015, at 16:45, stepharo <steph...@free.fr> wrote:


On 10/1/15 11:41, PBKResearch wrote:
Hello
I have run into a problem in moving some existing work from earlier
versions of Pharo/Moose. I have found a workaround, but I wonder if there
is a tidier way of handling it.
I make frequent use of Todd Blanchard’s HTML parser and validator, HTMCSS
(http://smalltalkhub.com/#!/~ToddBlanchard/HTMCSSValidatingParser), which
was originally written for Squeak but has performed without trouble on
earlier versions of Pharo. When I try to use it on Moose 5.0, I get frequent
warning messages about Url being deprecated. I have made the problem go away
by commenting out the deprecation warning in Url class>>new, but I wonder
what should be done on a more permanent basis.
Yes, change the code of the Parser so that it does not use the old code.
The problem arises because HTMCSS does not just parse the original HTML
file; it also loads and parses any referenced CSS files. (This is a function
I could do without, but I don’t fancy trying surgery on a complex package
where I only partly understand the workings!) It constructs the full address
of the CSS file by combining the root address of the HTML with the relative
address of the CSS, using Url class>>combine:withRelative:, and in the
course of this it invokes Url class>>new; hence the deprecation message.
Use ZnUrl.
The deprecation message says Url has been replaced by ZnUrl, but this is
clearly not a simple replacement of one message by an equivalent; there is
no ZnUrl class>>combine:withRelative:, for instance.
Sven will certainly comment, but I guess that the same behavior is
available there.

ZnUrl>>#inContextOf: is the selector you are looking for, but path merging
is not supported (yet).

'readme.txt' asZnUrl inContextOf: 'http://www.host.com:8080' asZnUrl.

Maybe we should add path merging; I'll think about it.
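
For what it's worth, here is a hedged sketch of the CSS case (the host and
file names below are invented); because path merging is not supported, the
relative path is resolved against the host root rather than against the
page's directory:

'css/style.css' asZnUrl inContextOf: 'http://www.example.com/docs/page.html' asZnUrl.
"expected result: http://www.example.com/css/style.css, not .../docs/css/style.css"

That is why this is not yet a drop-in replacement for Url class>>combine:withRelative:.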

The tidiest solution would no doubt be to find an equivalent method in
Zinc, which I am sure exists, and then modify the HTMCSS code to use it. I
have tried to find an equivalent, but Zinc is a large and complex system and
I rapidly got lost.
It should be in ZnUrl.

Is it not addPathSegment:?


From the class comment:

   ZnUrl new
     scheme: #https;
     host: 'encrypted.google.com';
     addPathSegment: 'search';
     queryAt: 'q' put: 'Smalltalk';
     yourself.
host: looks like the root, and addPathSegment: looks like it builds
https://encrypted.google.com/search
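
To make that concrete for the CSS case, here is a hedged sketch using the same
messages (the host and path segments are invented, and the printed form is my
expectation, not a tested result):

   ZnUrl new
     scheme: #http;
     host: 'www.example.com';
     addPathSegment: 'css';
     addPathSegment: 'style.css';
     yourself.
   "should print as http://www.example.com/css/style.css"

Note that this builds an absolute URL from scratch; it still does not merge a
relative path into an existing base URL.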

I wonder more broadly about the strategy of deprecating functions which
are required by legacy packages, as in this case. Should there at least be a
way of overriding deprecations? (I suppose that is what I have done by
commenting it out, but it seems crude.)

This is how you can work around Deprecation warnings:

[ Url combine: 'http://www.foo.com/one/two/' withRelative: 'bar/readme.txt' ]
   on: Deprecation do: [ :exception | exception resume ]

but that is a temporary hack.
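
If the warnings come from deep inside HTMCSS, the same handler can wrap
whatever entry point you normally call; the selector and variables below are
hypothetical placeholders for your actual parsing code:

[ "hypothetical stand-in for your usual HTMCSS invocation"
  parser parseDocument: htmlString ]
   on: Deprecation do: [ :exception | exception resume ]

That resumes every deprecated send inside the block, but again only until the
Url class is actually removed.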

Thanks in advance for any help or suggestions.
Peter Kenny
Sven





