Before I comment on this (below), I just want to thank you, Joseph, for writing it up formally, even though you disagree with it. Your considered perspective and administrative style are, as always, very appreciated.
Now, some comments:
Joseph Lee wrote:
Thus I would like to propose that when reviewing add-ons for inclusion in community add-ons website, reviewers should perform manifest checks. If an add-on...

While I of course agree with this in the case of code reviews, I will point out that not all reviews are code reviews, nor are they intended to be, as I understand things.
That's why I tried to suggest that the manifest is authoritative in my previous message(s) about this. If the manifest says 2019.2 is last tested, and it works in 2019.2, there is no point in testing it in 2019.3--it does not declare compatibility with 2019.3.
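To illustrate what "the manifest is authoritative" means in practice, here is a minimal sketch of pulling the declared range out of an add-on's manifest.ini. It assumes the manifest carries the minimumNVDAVersion and lastTestedNVDAVersion keys that NVDA add-on manifests use; the parsing here is a deliberately simple line-based reader, not NVDA's actual manifest loader.

```python
def read_compat_range(manifest_text):
    """Extract the declared compatibility range from manifest.ini-style text.

    Returns a (minimum, last_tested) pair of version strings, or None for
    any key the manifest does not declare.
    """
    declared = {}
    for line in manifest_text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            # Manifest values may or may not be quoted; accept both.
            declared[key.strip()] = value.strip().strip('"')
    return (declared.get("minimumNVDAVersion"),
            declared.get("lastTestedNVDAVersion"))

# Hypothetical manifest fragment for an add-on declaring 2017.3 through 2019.2:
sample = (
    'name = "exampleAddon"\n'
    'minimumNVDAVersion = "2017.3"\n'
    'lastTestedNVDAVersion = "2019.2"\n'
)
print(read_compat_range(sample))  # ('2017.3', '2019.2')
```

A reviewer (or a website-side script) only needs these two values to decide which NVDA versions are even in scope for testing.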
I think it is asking more than is usually asked of reviewers (basic reviewers), to check the code for feature compatibility. Code review is not included in basic review, which as I understand it, includes only docs, UX, security, and license.
So I agree with checking the manifest as a 5th component of basic reviews, to see what versions of NVDA the add-on declares compatibility with. But I am less convinced that basic reviewers should be asked to check the add-on code for "features from a given release"--i.e., elements that do not fall within the compatibility range declared in the manifest.
I think the UX check should be all that is required here, to verify that the manifest's compatibility range is accurate.
In other words: if it declares 2019.2 as last tested and 2017.3 as minimum, test with the latest declared version (2019.2), and maybe a sampling of prior versions back to 2017.3.
If it uses features in a way it shouldn't, it will generally fail UX at that point anyway, right?
I think to do otherwise will elevate every basic review to a full code review.
Which okay, if you want to do that, I would probably support you, but at that point eliminate the distinction and just call it a review.
But I don't recall that code reviews are currently required for inclusion on the website.
As for when to enforce compatibility range check (minimum version <= current version <= last tested version), I propose January 1, 2020 as start date. This...

Agreed with all that.
If this proposal is adopted by the community, I propose sending out a notice about it several times: ...

Because of the likely timing of 2019.3, I suggest instead October 15th, 2019, thus giving them 78 days if you want to stick with January 1 as D-day.
That is closer, also, to the spirit of the "if an author can't be contacted after three months" provision.