11
jorgebarrero
Re: Action Item #1 : let's start !

I have many comments on this smoketest. I am trying to produce a full-featured document aiming to:

Keep it simple
Incorporate the previously discussed issues
(readme info by Jen, consistency in language usage, and the way we evaluate)

I tried to do a first run of this smoketest, but I ran into some questions:

1. Where do I put info about the environment I am running the test on?

2. How do I rate a single aspect (let's say D1, which has three issues)?

3. I imagine I can run a test the same way others will, and then we should compare. I think it is a good idea to consolidate several evaluations and average them (a small sketch follows this list).

4. I would separate out the "formal" evaluation, meaning everything that cannot be appreciated by the common standard user (we have to define what that is, and it is not exactly a newbie).

5. I have run into modules where it is not even easy to tell what the module is for. A clear description (say, in the readme) would help and should be considered.

6. General impression, consistency, and look and feel should also be added.
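
For point 3, something like this is what I have in mind (just a sketch; the test IDs, testers and 0/1 scores are invented for illustration, nothing here is part of the smoketest spec):

Code:

<?php
// Hypothetical: each tester's run maps a test ID to a score (0 = fail, 1 = pass).
$runs = array(
    'tester1' => array('D1' => 1, 'A6' => 0, 'P2' => 1),
    'tester2' => array('D1' => 1, 'A6' => 1, 'P2' => 1),
    'tester3' => array('D1' => 0, 'A6' => 0, 'P2' => 1),
);

// Consolidate: collect every tester's score per test.
$totals = array();
foreach ($runs as $run) {
    foreach ($run as $test => $score) {
        $totals[$test][] = $score;
    }
}

// Average each test's score across all testers.
foreach ($totals as $test => $scores) {
    printf("%s: %.2f\n", $test, array_sum($scores) / count($scores));
}
?>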

My next step is to test a module with this tool and give some feedback.


Jorge Barrero

12
jensclas
Re: Action Item #1 : let's start !

I have looked at the spreadsheet - the Doc tests look good.

In addition to what Mith wrote, we need a how-to on providing adequate docs for a module - people unfamiliar with writing texts and instructions need a model to follow (a rough skeleton is sketched below). I am happy to do this, but I will need input from the community about what should go into this how-to. Perhaps after you have tested a few modules you will have a list of things that should be there.
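
To make this concrete, here is the rough kind of skeleton I mean (just a sketch - the exact headings are up for discussion, not a decided standard):

Code:

Module name and version
Author(s) and credits
License statement
Short description: what the module is for
Requirements (XOOPS version, PHP version)
Install / upgrade instructions (step by step)
Where to get support
Changelog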

Cheers

13
Marco
Re: Action Item #1 : let's start !
  • 2005/3/2 21:21



Quote:

jensclas wrote:
Under module design you have "install process - described in the read me file". I think you also need "read me file - is included and covers 'required' details", e.g. title, version, author, license statement, install instructions, and credits. Read me files also need to be 'readable' and reasonably correct in their use of language.

There are many modules that do not have a read me file at all - and there are tons of variations in the content of a read me file. There is a need to set up some basic expectations of what should be in a read me file. Look at my article on read me files here - it is still in draft form, but you will get the picture.

Other than that, I think it looks good - but then I am not up on the rest of it - just looking out for newbies and readability.

off my soap box now!


Hmm, I've just opened this nice document. 15 pages!
OK, I will have a look into it.
We should create a rule for docs, as you're proposing.
Give me some days to read it!
marco

14
Marco
Re: Action Item #1 : let's start !
  • 2005/3/2 21:28



Quote:

Mithrandir wrote:
Looking better

What will the next step be, once this specification is finished? Will you (with the Docs team, perhaps?) make a "Guide to Approval", explaining how to accomplish each task? I think that would be a tremendous guide.

I think we should improve this doc to make it more understandable... giving some examples.
Perhaps for the coding aspects we should list all the criteria.
Our goal is to make this test usable for newbies...

Quote:

You say yourselves in "S4 - Textsanitizer usage" that instructions on correct usage should be available, and I think other tests could use the same (e.g. what constitutes "Smarty compliant"?), so you might as well make a section for each test, explaining
a) its purpose
b) how to make a module pass this test and
c) how the test is performed

You're right.
But perhaps we should not go too far, because of XOOPS V3. I'm waiting for the V3 roadmap!
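
As a starting point on the "Smarty compliant" question: in XOOPS 2.x it essentially means the module keeps HTML out of its PHP and renders through the template engine. A minimal sketch (the file, template and variable names are invented for illustration):

Code:

<?php
// mymodule/index.php (hypothetical file)
// Declare which template renders this page, then assign data to it
// instead of echoing HTML directly.
$xoopsOption['template_main'] = 'mymodule_index.html';
include XOOPS_ROOT_PATH . '/header.php';

// In templates/mymodule_index.html the value appears as <{$greeting}>
// (XOOPS uses <{ and }> as its Smarty delimiters).
$xoopsTpl->assign('greeting', 'Hello from the template');

include XOOPS_ROOT_PATH . '/footer.php';
?>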

Quote:

I could also suggest a new category - C (for Code Standards) - where you take the P2 test about object orientation as well as general coding standards and core usage. I suggest that this new category has the following tests:
P2, A5 (Note that there are TWO A5's, I mean the one about permissions), A6, A7 and A11. An added use of core Notification (where applicable) could perhaps also be a new test?

I will have a look into it. Good idea.

Quote:

Only one piece of feedback left now: B1 - it doesn't really matter to me, when I install a module, whether it has been in Beta and/or RC before being released. It is a nice reassurance for me to know that it will probably have fewer bugs than a module that was released without it, but since you ARE testing the module for bugs etc. I think it will not be fair for a module that passes ALL tests apart from this one to be noted "Did not pass this test"... when everything is working, the need for a Beta/RC is limited, I think.

I think we should not test for a beta or RC history as such, but it's important to make people understand that publishing an RC has quality value.

Quote:

It may serve as an explanation for why the module fails other tests, and you should be able to conclude after performing the tests that it should have been released in beta/RC versions prior to final launch, but it should not be a test in itself.

Should I add a comments section to each test?

Quote:

Use what you can of this and I am of course available for working with you on the coding standard and core usage tests.

Sure, I need some help from others, especially on this part.
But we must take care about the V3 impact!

15
Mithrandir
Re: Action Item #1 : let's start !

I think the Excel document is excellent as a checklist for the tester to note the test results down, but the test should result in a report.

I imagine something along the lines of a document explaining each test: why it is tested (purpose), how the test is performed (procedure), what result is needed for it to pass (expectation) and how this is accomplished in an example (explanation).

This document will tell developers how to program and document their module in order to pass the tests.

Now, when the tester has tested the module, using the spreadsheet to note the progress down, he or she takes the above document and adds a section to each test describing the results during the test (observation).

The report should end up with a conclusion on the tests - noting how the module performed and, if you create a couple of categories for modules (such as Group A = all tests passed, Group B = insufficient core usage, Group C = insufficient documentation but module is working, and Group D = unacceptable bugs in module), which category this module is placed in and why. Recommendations like better testing procedures (if the bugs are such that they really should have been found in a beta/RC testing phase) could also go in this conclusion.
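
To illustrate, a single test's section in such a report might look like this (only a sketch - the headings simply mirror the purpose/procedure/expectation/explanation/observation structure above, nothing is fixed yet):

Code:

Test A6 - Comments
Purpose:     Verify that the module uses the XOOPS comment system.
Procedure:   (the steps the tester follows, step by step)
Expectation: Comments can be posted and replied to via the core system.
Explanation: (example of how a module accomplishes this)
Observation: (filled in by the tester during the test)
Conclusion:  Pass / Fail, plus any recommendations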

Quote:
But perhaps we should not go too far, because of XOOPS V3. I'm waiting for the V3 roadmap!

Text sanitizing is quite simple in XOOPS 2.0.x, as described in the Dev Wiki (although it may not be spelled out sufficiently WHEN one should use one method over another), and since it will not be changed very soon (see Rowd's report from XOOPSDEM), my opinion is that it should be included from the start and adapted for future versions of XOOPS.
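
For example, typical 2.0.x usage looks roughly like this (a minimal sketch; see the Dev Wiki article for when to use which method):

Code:

<?php
// Get the XOOPS 2.0.x text sanitizer singleton.
$myts =& MyTextSanitizer::getInstance();

// Escape a value for safe redisplay inside a form field:
$safe = $myts->htmlSpecialChars($userInput);

// Render stored text for output, with smilies, XOOPS codes,
// images and line breaks enabled but raw HTML disabled:
echo $myts->displayTarea($storedText, 0, 1, 1, 1, 1);
?>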

I imagine that the code standards part will be expanded quite extensively with the development of this next major XOOPS version as we will put far more focus on people using the core correctly when we get that "clean sheet" foundation to work from.

16
Marco
Re: Action Item #1 : let's start !
  • 2005/3/2 22:16



Quote:

I could also suggest a new category - C (for Code Standards) - where you take the P2 test about object orientation as well as general coding standards and core usage. I suggest that this new category has the following tests:
P2, A5 (Note that there are TWO A5's, I mean the one about permissions), A6, A7 and A11. An added use of core Notification (where applicable) could perhaps also be a new test?


I'll keep it in mind for the next release. Do you agree with Mith's proposal?

Apart from that, here is the new version.
marc

17
Marco
Re: Action Item #1 : let's start !
  • 2005/3/2 22:26



Quote:

Mithrandir wrote:
I think the Excel document is excellent as a checklist for the tester to note the test results, but the test should result in a report, I think.

hervet suggested a QA module!


Quote:

I imagine something along the lines of a document explaining each test: why it is tested (purpose), how the test is performed (procedure), what result is needed for it to pass (expectation) and how this is accomplished in an example (explanation).

This document will tell developers how to program and document their module in order to pass the tests.

Now, when the tester has tested the module, using the spreadsheet to note the progress down, he or she takes the above document and adds a section to each test describing the results during the test (observation).

Yes, we need this.
Our work should first be useful for, and used by, the devs.
Then, after that, the QA testing team.

Quote:

The report should end up with a conclusion on the tests - noting how the module performed and, if you create a couple of categories for modules (such as Group A = all tests passed, Group B = insufficient core usage, Group C = insufficient documentation but module is working, and Group D = unacceptable bugs in module), which category this module is placed in and why. Recommendations like better testing procedures (if the bugs are such that they really should have been found in a beta/RC testing phase) could also go in this conclusion.

We have to decide which rating model we use. Stars? Or a numeric grade?

Quote:

Quote:
But perhaps we should not go too far, because of XOOPS V3. I'm waiting for the V3 roadmap!

Text sanitizing is quite simple in XOOPS 2.0.x, as described in the Dev Wiki (although it may not be spelled out sufficiently WHEN one should use one method over another), and since it will not be changed very soon (see Rowd's report from XOOPSDEM), my opinion is that it should be included from the start and adapted for future versions of XOOPS.

I imagine that the code standards part will be expanded quite extensively with the development of this next major XOOPS version as we will put far more focus on people using the core correctly when we get that "clean sheet" foundation to work from.

Yes, that's it.

18
rowdie
Re: Action Item #1 : let's start !
  • 2005/3/4 14:01



I think the work done so far is great, but it's starting to move away from designing smoketests and towards standards. I'm working on Action Item #2, writing a document describing the QA standards. Your work here seems to be crossing over into what I'm doing, which is probably not the best use of your valuable time.

I suggest leaving the Section 1 - Module Design section of the smoketests until the standards document is finished.

Then for Section 2, "Module Acceptance", you really should start writing detailed descriptions of how you will carry out your tests... e.g. for "A6 Comment - Is this module using XOOPS comment system? test posting" you need to explain exactly what to do, so you can be certain that the testing is consistent (all testers need to follow the exact same actions for the results to be valid).
For example:
1. Set permissions to allow anonymous users to post comments.
2. Post a comment as an anonymous user.
3. Reply to your comment as an anonymous user.
4. Set permissions to disallow anonymous users to post comments.
5. Attempt to post a comment as an anonymous user... etc.

It's boring work writing it all out, but I think it's the only way to guarantee that everyone performs the same test. It will also mean you can give a precise report of why a module didn't pass QA - "testing failed at A6-2: was unable to post as anonymous user"...

I also think there should be a more defined separation between Section 1, which ONLY looks at code, and Section 2, which should ONLY look at how the module functions. So I'd move Smarty/template compliance to Section 1.

I agree with Mith's comment about moving P2 Object Oriented into Section 1, as it's a coding standard and is included in the standards doc I'm working on. I also think the P1 SQL queries test belongs in Section 3 Debug, as you just need to turn on MySQL/Blocks Debug to check it; you don't need to install an SQL counter.

I'm not quite sure what you mean by P3 - Ergonomy, or how you intend to test it...?

As Mith said, as a checklist of QA tests it's looking really good. The next step is describing each test in more detail, so testers can provide detailed feedback about the test results... though that's just my opinion, of course

Rowd

19
Mithrandir
Re: Action Item #1 : let's start !

Quote:

MarcoFr wrote:
hervet suggested a QA module!

Sure, but is that really what you need?

Don't you just need to be able to put in various fields and specify whether a test is passed or failed? I'd suggest using a module like Formulize for this.

Quote:

We have to decide which rating model we use. Stars? Or a numeric grade?

I'd suggest a simple pass/fail method for each test

Quote:

Rowd wrote:
As Mith said, as a checklist of QA tests it's looking really good. The next step is describing each test in more detail, so testers can provide detailed feedback about the test results... though that's just my opinion, of course


It is also my opinion:
Quote:

I imagine something along the lines of a document explaining each test: why it is tested (purpose), how the test is performed (procedure), what result is needed for it to pass (expectation) and how this is accomplished in an example (explanation).

This document will tell developers how to program and document their module in order to pass the tests.

Now, when the tester has tested the module, using the spreadsheet to note the progress down, he or she takes the above document and adds a section to each test describing the results during the test (observation).


Getting these things up and running should get you started:
1. A checklist of tests
2. An explanatory document, explaining each test and how to perform it
3. A reporting tool - like Formulize - for documenting the test

20
Marco
Re: Action Item #1 : let's start !
  • 2005/3/20 19:39



Hello,

Just to say: I'll publish a new version of our testing process this week.
Thanks for your comments.
marco
