CI: remove failing line #24092
Conversation
Hello @TomAugspurger! Thanks for submitting the PR.
https://travis-ci.org/pandas-dev/pandas/jobs/463403386 maybe picking up a new moto?
I don't know about the error in Travis. We started checking the benchmarks yesterday, and they only run when there are changes to them, so I guess they were already broken, maybe for a while. Happy to ignore the benchmark errors here; I can take a look tonight and fix them in a separate PR if nobody does it before.
I'm cleaning them up now. A couple are just broken / weren't updated with the removal of deprecations. The HTML repr stuff may be related to how we're running things in Azure. I think I have a workaround.
@jreback @datapythonista The ASVs have only been properly tested for the last few CI runs (after the linebreak thing?), and I guess there are still errors hiding in the code, e.g.:
@TomAugspurger: didn't see you had already posted something on that while I was writing...
Codecov Report
@@ Coverage Diff @@
## master #24092 +/- ##
===========================================
+ Coverage 42.52% 92.2% +49.68%
===========================================
Files 161 161
Lines 51697 51684 -13
===========================================
+ Hits 21982 47657 +25675
+ Misses 29715 4027 -25688
Continue to review full report at Codecov.
Codecov Report
@@ Coverage Diff @@
## master #24092 +/- ##
=======================================
Coverage 42.52% 42.52%
=======================================
Files 161 161
Lines 51697 51697
=======================================
Hits 21982 21982
Misses 29715 29715
Continue to review full report at Codecov.
        return index.get_level_values(i)

-    ilabels = list(zip(*[robust_get_level_values(i) for i in subset]))
+    ilabels = list(zip(*[index._get_level_values(i) for i in subset]))
I think this is OK to do... `_sparse_series_to_coo` is the only caller, and it uses `_get_level_number(x)` to go from maybe-labels to numbers.
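The `zip(*...)` call being swapped in the diff above is just a transposition from per-level value columns to one label tuple per row. A minimal sketch with toy lists standing in for what `index._get_level_values(i)` would return (the data here is illustrative, not from the PR):

```python
# Toy stand-ins for the per-level values a MultiIndex would yield from
# index._get_level_values(i); names and values are illustrative only.
level_values = [
    ["a", "a", "b", "b"],  # values of level 0
    [1, 2, 1, 2],          # values of level 1
]

# zip(*...) transposes the per-level columns into one label tuple per row,
# which is what the ilabels line in the diff builds.
ilabels = list(zip(*level_values))
print(ilabels)  # [('a', 1), ('a', 2), ('b', 1), ('b', 2)]
```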
@@ -161,15 +161,7 @@ def write_result(self, buf):
     _classes.extend(self.classes)

     if self.notebook:
-        div_style = ''
-        try:
-            import IPython
The `import IPython` was causing a failure. I figure we're OK to drop the IPython 2.x compat code :)
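For context, the code being dropped followed the common optional-import guard pattern. A minimal sketch of that shape (assumed; not the exact deleted lines, which also checked the IPython version to pick a notebook div style):

```python
# Sketch of an optional-import guard like the one removed in this PR
# (assumed shape; the real code chose a style based on IPython's version).
div_style = ''
try:
    import IPython  # this import itself was what failed in CI
    # ...IPython-2.x-specific styling would have been selected here...
except ImportError:
    # No IPython installed: keep the default (empty) style.
    pass
```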
Not sure what's up with the SSL failures, but FWIW,
Well, I'm not sure what to do here... It's possible that the network tests are hitting some bug in pytest or something; I'm not sure. We really shouldn't be hitting the network in most unit tests. Do we want to mock these network requests? In a hacky version, we just mock out HTTPResponse's
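The mocking idea floated above could look roughly like this with `unittest.mock` (the `fetch` function and the patched target are illustrative, not pandas' actual test setup):

```python
import urllib.request
from unittest import mock


def fetch(url):
    # Stand-in for test code that would normally hit the network.
    with urllib.request.urlopen(url) as resp:
        return resp.read()


def test_fetch_without_network():
    # Build a fake response that supports the context-manager protocol
    # and returns canned bytes instead of touching the network.
    fake = mock.MagicMock()
    fake.read.return_value = b"payload"
    fake.__enter__.return_value = fake
    with mock.patch("urllib.request.urlopen", return_value=fake):
        assert fetch("https://example.com/data.csv") == b"payload"


test_fetch_without_network()
```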
Based on the comment, I think the one upstream change that might have caused this recently is pytest 4.0.1. Let's try pinning `pytest<=4.0.0`?
pandas/tests/io/conftest.py (outdated)
@@ -37,6 +37,8 @@ def s3_resource(tips_file, jsonl_file):
     """
     pytest.importorskip('s3fs')
     boto3 = pytest.importorskip('boto3')
+    # GH-24092. See if boto.plugin skips the test or fails.
+    pytest.importorskip("boto.plugin")
For some reason, this line fails instead of skipping. https://travis-ci.org/pandas-dev/pandas/jobs/463496968
That job is using the newest pytest 4.0.1, but maybe there's a bug in `importorskip` for submodules?
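A plausible mechanism: `pytest.importorskip` only converts `ImportError` into a skip, so if importing the module raises anything else (here, an `AttributeError` out of boto/moto), the exception propagates and the test errors out. A minimal stdlib-only sketch of that distinction (toy helper, not pytest's actual implementation):

```python
def import_or_skip(name):
    """Toy version of pytest.importorskip: only ImportError means 'skip'."""
    try:
        return __import__(name)
    except ImportError:
        return None  # pytest would raise Skipped here


# A module whose import raises ImportError is handled gracefully...
assert import_or_skip("definitely_not_a_real_module") is None

# ...but any other exception raised during import would escape
# import_or_skip entirely, which is why the fix below wraps the call
# in try/except AttributeError.
assert import_or_skip("json") is not None
```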
Going to try to avoid pinning pytest for now.
Merging on green (I think everything will pass this time).
    try:
        pytest.importorskip("boto.plugin")
    except AttributeError:
        raise pytest.skip("moto/moto error")
wow, that is indeed nasty, haha. @TomAugspurger
Such a mess :) I'm marking the network tests as
I was just gonna suggest that approach. I found another xdist issue that seems to point to exactly this:
All green. Merging.