ENH: raise limit for completion number of columns and warn beyond #35207
Hi,
Currently, attribute completion (IPython "<tab>") on a DataFrame with more than 100 columns silently ignores some columns, leaving an unaware user confused about whether data disappeared.
This is actually documented, and is due to an arbitrary limit set to work around a performance issue (see #18587).
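A quick way to see the effect without an interactive session, since `dir()` is what IPython consults for attribute completion (the column count and names below are arbitrary, just for illustration):

```python
import numpy as np
import pandas as pd

# DataFrame with 150 identifier-like column names
df = pd.DataFrame(np.zeros((1, 150)), columns=[f"col_{i}" for i in range(150)])

# dir() feeds IPython's <tab> completion; with the current cap, only the
# first 100 column names show up as attribute-completion candidates.
candidates = [name for name in dir(df) if name.startswith("col_")]
print(len(candidates))  # expected: 100 rather than 150 under the current limit
```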
DataFrames with more than 100 columns are quite common, so this can potentially affect and surprise many users. I therefore suggest increasing that limit to, say, 1000. In any case, it would probably be good to warn the user when the limit is hit.
The attached quickfix raises the limit to 1000 and adds a warning beyond it.
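For reference, a minimal standalone sketch of the idea, not the actual patch (in pandas the cap is applied inside `NDFrame._dir_additions`; the helper name, constant, and warning text here are hypothetical):

```python
import warnings

ATTR_COMPLETION_LIMIT = 1000  # hypothetical constant; previously a hard-coded 100


def completion_additions(labels, limit=ATTR_COMPLETION_LIMIT):
    """Return the identifier-like labels exposed to tab completion,
    warning when the limit truncates the list (illustrative helper only)."""
    if len(labels) > limit:
        warnings.warn(
            f"DataFrame has {len(labels)} columns; only the first {limit} "
            "are exposed to attribute completion.",
            UserWarning,
            stacklevel=2,
        )
    return {c for c in labels[:limit] if isinstance(c, str) and c.isidentifier()}


# Example: 1500 columns trigger the warning and a truncated candidate set
names = [f"col_{i}" for i in range(1500)]
print(len(completion_additions(names)))  # 1000
```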
(Note that I didn't experience any increase in completion latency with axis size, so I'm not sure whether this limit is still relevant in the first place.)