Hi Thiago,

> Well, the idea of automated testing is to write the test before the tested
> feature, or at least just after you implement the feature.
> This quick outdating is quite strange: it seems the team is not writing
> tests in a safe enough manner, or the markup is changing very fast. Am I right?
Okay, you got me... We rely mostly on our unit tests and have continuous
integration only for those, so the Selenium test was only used for black-box
testing at the end of our release cycle. It was not part of the test suite, so
it was kind of an afterthought. The code changed so fast that it defeated the
purpose of having the test, so we abandoned it.

We are under-resourced and have tight deadlines, so we have limited time to
write both unit and integration tests... but you tend to pay for sloppiness
down the line, which is why I said I would have to rethink that: unit tests
are great, but they are not always enough.

Abandoning the Selenium test means we no longer test Pages, Mixins or
JavaScript, but I plan to change this soon. One thing we have done right,
though, is ensuring very clean code separation, so tests are easy to write
without many changes. I will first look into Inge's suggestion, and if we are
not covering enough then we may consider a few smaller, targeted Selenium
tests rather than one large one as we did in the past.
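As a rough illustration of what one of those smaller, targeted tests might look like, here is a sketch using Selenium WebDriver with JUnit 4. The page URL, element ids, and expected message are all made-up placeholders, not taken from our application; the point is only the shape of the test: one page, one behaviour, one assertion.

```java
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import static org.junit.Assert.assertEquals;

// One small, focused Selenium test instead of a single broad one.
public class LoginPageTest {

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new FirefoxDriver();
    }

    @Test
    public void showsErrorWhenCredentialsAreEmpty() {
        // Hypothetical URL and element ids for illustration only.
        driver.get("http://localhost:8080/login");
        driver.findElement(By.id("submit")).click();
        assertEquals("Credentials are required.",
                driver.findElement(By.id("error")).getText());
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }
}
```

Because each test covers one behaviour on one page, a markup change should break only the tests that touch that page, which keeps maintenance cost proportional to the change. (Running this requires a browser and a deployed application, so it is a sketch rather than something runnable in isolation.)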

Cheers,
Peter   




----- Original Message -----
From: "Thiago H. de Paula Figueiredo" <thiag...@gmail.com>
To: "Tapestry users" <users@tapestry.apache.org>, "P Stavrinides" 
<p.stavrini...@albourne.com>
Sent: Monday, 20 June, 2011 14:52:06 GMT +02:00 Athens, Beirut, Bucharest, 
Istanbul
Subject: Re: Unit testing mixins

On Mon, 20 Jun 2011 04:22:27 -0300, <p.stavrini...@albourne.com> wrote:

> Hi Inge, Thiago,

Hi!

> Firstly, Thanks guys for your replies!
> I was aware of Jasmine, but not JsTestDriver... together they look very  
> promising (powerful) for unit tests. I think I will give them a go.

You're welcome!

> As for Selenium, we had used it in the past for integration testing, but  
> we wrote only one larger scale test with broad coverage... It was pretty  
> good as well, but we found the test got outdated almost as quick as we  
> could write it.

Well, the idea of automated testing is to write the test before the tested  
feature, or at least just after you implement the feature. This quick  
outdating is quite strange: it seems the team is not writing tests in a  
safe enough manner, or the markup is changing very fast. Am I right? I work  
on projects which forbid commits without tests; almost all the tests are  
Selenium ones, and we don't have this quick outdating you're experiencing.  
We do need to fix some tests after the markup they're based on changes, but  
this doesn't happen frequently.

> After a time we were not maintaining it because it was quite time  
> consuming to do. Worst case I will revisit Selenium and have a rethink  
> about better maintenance this time.

If your tests are failing, the recommended approach is to stop development  
and fix bugs (code or tests) until all tests pass.

-- 
Thiago H. de Paula Figueiredo
Independent Java, Apache Tapestry 5 and Hibernate consultant, developer,  
and instructor
Owner, Ars Machina Tecnologia da Informação Ltda.
http://www.arsmachina.com.br

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscr...@tapestry.apache.org
For additional commands, e-mail: users-h...@tapestry.apache.org
