Update devel/py-urwid to 2.0.1

Update devel/py-urwid to 2.0.1

clematis
Hi Team,
Here's a quick update for devel/py-urwid from 1.3.1 to 2.0.1
Build and run ok on amd64 (as dep of upcoming net/toot update).
Comments? OK?
Thanks,
--
clematis (0x7e96fd2400fe7b59)

[attachment: diff-devel_py-urwid (1K)]

Re: Update devel/py-urwid to 2.0.1

Klemens Nanni
On Thu, Sep 19, 2019 at 12:31:20PM +0200, clematis wrote:
> Here's a quick update for devel/py-urwid from 1.3.1 to 2.0.1
> Build and run ok on amd64 (as dep of upcoming net/toot update).
Thanks for looking into this.

Did you test the existing users of this port?
Did you run the regress suite (for both python flavours)?

Re: Update devel/py-urwid to 2.0.1

clematis
On Thu, Sep 19, 2019 at 01:25:37PM +0200, Klemens Nanni wrote:
> Did you test the existing users of this port?
Tested with one, productivity/khal, built, installed and ran OK.

Note for those who might have manually installed some of these deps via
pip (as root): I would recommend pip-uninstalling them first so they are
all properly handled via ports.

> Did you run the regress suite (for both python flavours)?
Yes. It's a little bit noisy, with a few deprecation warnings since Python 3.0.

platform openbsd6 -- Python 2.7.16, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
30 failed, 153 passed
platform openbsd6 -- Python 3.7.4, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
31 failed, 157 passed, 18 warnings

Thanks,
--
clematis (0x7e96fd2400fe7b59)

Re: Update devel/py-urwid to 2.0.1

Klemens Nanni
On Thu, Sep 19, 2019 at 06:00:40PM +0200, clematis wrote:
> Tested with one, productivity/khal, built, installed and ran OK.
Thanks.

> Note for those who might have manually installed some of these deps via
> pip (as root): I would recommend pip-uninstalling them first so they are
> all properly handled via ports.
That is neither supported nor encouraged, I suspect doing so will cause
nothing but pain.  Either use pip (as unprivileged user) or get ports
working together.
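
As a quick sanity check before touching anything, the unprivileged
per-user install target can be inspected from Python itself (a minimal
sketch; the exact path depends on the interpreter version):

```python
# Where "pip install --user" puts modules for this interpreter: an
# unprivileged per-user directory, safely out of the way of both the
# system site-packages and anything managed by ports/packages.
import site

user_site = site.getusersitepackages()
print(user_site)  # e.g. ~/.local/lib/pythonX.Y/site-packages
```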

> platform openbsd6 -- Python 2.7.16, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
> 30 failed, 153 passed
> platform openbsd6 -- Python 3.7.4, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
> 31 failed, 157 passed, 18 warnings
Compared to the results of the in-tree version (py3 flavor), this seems
a lot, although it should not necessarily block the update:

        Test failed: <unittest.runner.TextTestResult run=292 errors=4 failures=1>

Given this increase, did you look into what's failing in particular?
I'll take a closer look this weekend and commit both this and the toot
update unless I hear objections or stumble across red flags in the
regress suite or issues with other dependent ports.

Re: Update devel/py-urwid to 2.0.1

Kurt Mosiejczuk
On Thu, Sep 19, 2019 at 09:16:12PM +0200, Klemens Nanni wrote:
> On Thu, Sep 19, 2019 at 06:00:40PM +0200, clematis wrote:
> > Tested with one, productivity/khal, built, installed and ran OK.
> Thanks.

That does still leave devel/pudb and productivity/py-carddav that have
it as a TEST_DEPENDS. bpython also has it as a RUN_DEPENDS but it
has NO_TEST set.
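
For completeness, consumers can also be enumerated mechanically.
sqlports is the authoritative tool, but a crude scan of port Makefiles
gets close (a sketch assuming the standard category/port/Makefile
layout; `consumers` is just an illustrative helper, not a ports tool):

```python
# Crude reverse-dependency scan: list ports whose Makefile mentions a
# given pkgpath. sqlports gives exact answers; this is only a first
# pass and will also match comments or unrelated mentions.
from pathlib import Path

def consumers(portsdir, pkgpath):
    hits = []
    for mk in Path(portsdir).glob("*/*/Makefile"):
        if pkgpath in mk.read_text(errors="ignore"):
            hits.append(str(mk.parent))
    return sorted(hits)

# e.g. consumers("/usr/ports", "devel/py-urwid") should turn up the
# consumers named above.
```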

> > Note for those who might have manually installed some of these deps via
> > pip (as root): I would recommend pip-uninstalling them first so they are
> > all properly handled via ports.
> That is neither supported nor encouraged, I suspect doing so will cause
> nothing but pain.  Either use pip (as unprivileged user) or get ports
> working together.

Yes. The only time I've found pip useful in a ports context was when I was
untangling the web of interlocked dependencies while updating pytest
to 4.4, whose dependencies all depended on each other. Using pip
will mostly just lead to pain.

> > platform openbsd6 -- Python 2.7.16, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
> > 30 failed, 153 passed
> > platform openbsd6 -- Python 3.7.4, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
> > 31 failed, 157 passed, 18 warnings
> Compared to the results of the in-tree version (py3 flavor), this seems
> a lot, although it should not necessarily block the update:

> Test failed: <unittest.runner.TextTestResult run=292 errors=4 failures=1>

> Given this increase, did you look into what's failing in particular?
> I'll take a closer look this weekend and commit both this and the toot
> update unless I hear objections or stumble across red flags in the
> regress suite or issues with other dependent ports.

Yeah. Given the increase from 1 failure to 30/31, I'd want to know *why* it's
failing before having it committed. It might be acceptable if we knew that
all of the consumers were happy with it, but I'd still like to know the reason.

--Kurt

Re: Update devel/py-urwid to 2.0.1

Klemens Nanni
In reply to this post by clematis
On Thu, Sep 19, 2019 at 06:00:40PM +0200, clematis wrote:
> Yes. It's a little bit noisy, with a few deprecation warnings since Python 3.0.
>
> platform openbsd6 -- Python 2.7.16, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
> 30 failed, 153 passed
> platform openbsd6 -- Python 3.7.4, pytest-4.4.0, py-1.8.0, pluggy-0.11.0
> 31 failed, 157 passed, 18 warnings
How did you get these results?

        $ export FLAVOR=python3
        $ make test
        ...
        Ran 296 tests in 4.156s

        FAILED (failures=1, errors=4)
        Test failed: <unittest.runner.TextTestResult run=296 errors=4 failures=1>
        ...

See my full ${WRKDIR}/test.log attached.

[attachment: test.log (28K)]

Re: Update devel/py-urwid to 2.0.1

clematis
On Sat, Sep 21, 2019 at 08:50:26PM +0200, Klemens Nanni wrote:

> How did you get these results?
>
> $ export FLAVOR=python3
> $ make test
> ...
> Ran 296 tests in 4.156s
>
> FAILED (failures=1, errors=4)
> Test failed: <unittest.runner.TextTestResult run=296 errors=4 failures=1>
> ...
>
> See my full ${WRKDIR}/test.log attached.
> error: Test failed: <unittest.runner.TextTestResult run=296 errors=4 failures=1>
> /usr/local/lib/python3.7/asyncio/base_events.py:618: ResourceWarning: unclosed event loop <_UnixSelectorEventLoop running=False closed=False debug=False>
> *** Error 1 in . (/usr/ports/lang/python/python.port.mk:239 'do-test': @cd /usr/ports/pobj/py-urwid-2.0.1-python3/urwid-2.0.1 && /usr/bin/en...)

Hi Klemens, Kurt,
Thanks for the feedback. I had a quick irc chat with Klemens who clarified
a few things I was doing wrong in my testing. Thanks again for taking
the time. Really appreciated.

So here we go again: python3 => run=292 errors=0 failures=1

FAIL: test_run (urwid.tests.test_event_loops.AsyncioEventLoopTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/usr/ports/pobj/py-urwid-2.0.1-python3/urwid-2.0.1/urwid/tests/test_event_loops.py",
line 69, in test_run
    self.assertEqual(out, ["clean exit"])
AssertionError: Lists differ: ['clean exit', 'waiting'] != ['clean
exit']

First list contains 1 additional elements.
First extra element 1:
'waiting'

- ['clean exit', 'waiting']
+ ['clean exit']

----------------------------------------------------------------------
Ran 292 tests in 2.898s

FAILED (failures=1)
Test failed: <unittest.runner.TextTestResult run=292 errors=0
failures=1>
error: Test failed: <unittest.runner.TextTestResult run=292 errors=0
failures=1>
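
For reference, the failing assertion boils down to "callbacks scheduled
for after the loop has been stopped must not run". A standalone asyncio
sketch of the same idea (this is not urwid's actual test code, just an
illustration of the semantics it checks):

```python
# A stopped asyncio loop must not fire timers that were due after
# stop(): this mirrors the idea behind AsyncioEventLoopTest.test_run,
# where a stray "waiting" callback showed up after the clean exit.
import asyncio

out = []
loop = asyncio.new_event_loop()

def clean_exit():
    out.append("clean exit")
    loop.stop()              # run_forever() returns after this callback

def waiting():
    out.append("waiting")    # must never run: scheduled past the stop

loop.call_later(0.01, clean_exit)
loop.call_later(0.05, waiting)
loop.run_forever()
loop.close()
print(out)  # → ['clean exit']
```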



python2 tests ALL OK.


PS: I'll re-run tests on all recent ports I've submitted and update each
email thread accordingly. Sorry about that.

Cheers,
--
clematis (0x7e96fd2400fe7b59)