
Commit ff8d087

Issue #25977: Fix typos in Lib/tokenize.py
Patch by John Walker.
1 parent 3cc8f4b commit ff8d087

File tree: Lib/tokenize.py

1 file changed: +4 -4 lines changed

Lib/tokenize.py

Lines changed: 4 additions & 4 deletions
@@ -328,8 +328,8 @@ def untokenize(iterable):
     Round-trip invariant for full input:
         Untokenized source will match input source exactly
 
-    Round-trip invariant for limited intput:
-        # Output bytes will tokenize the back to the input
+    Round-trip invariant for limited input:
+        # Output bytes will tokenize back to the input
         t1 = [tok[:2] for tok in tokenize(f.readline)]
         newcode = untokenize(t1)
         readline = BytesIO(newcode).readline
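The limited round-trip invariant named in the corrected docstring can be exercised directly. A minimal sketch using the stdlib tokenize module, with an in-memory BytesIO buffer standing in for the file `f` from the docstring:

```python
from io import BytesIO
from tokenize import tokenize, untokenize

source = b"x = 1\nprint(x)\n"

# Keep only (type, string) pairs -- the "limited" input accepted by untokenize().
t1 = [tok[:2] for tok in tokenize(BytesIO(source).readline)]
newcode = untokenize(t1)

# The output bytes tokenize back to the input: the regenerated source yields
# the same (type, string) sequence, even if its whitespace differs.
t2 = [tok[:2] for tok in tokenize(BytesIO(newcode).readline)]
assert t1 == t2
```

Note that only the token sequence is guaranteed to round-trip here; exact spacing of `newcode` may differ from `source`, which is why this is the "limited" invariant rather than the full one.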
@@ -465,10 +465,10 @@ def open(filename):
 
 def tokenize(readline):
     """
-    The tokenize() generator requires one argment, readline, which
+    The tokenize() generator requires one argument, readline, which
     must be a callable object which provides the same interface as the
     readline() method of built-in file objects. Each call to the function
-    should return one line of input as bytes. Alternately, readline
+    should return one line of input as bytes. Alternatively, readline
     can be a callable function terminating with StopIteration:
         readline = open(myfile, 'rb').__next__ # Example of alternate readline
 
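The readline interface described in the second hunk's docstring accepts any callable that returns one line of input as bytes per call. A sketch (not part of the patch) using the readline method of an in-memory binary buffer, which provides the same interface as a built-in file object opened in binary mode:

```python
from io import BytesIO
from tokenize import tokenize

source = b"def f():\n    return 42\n"

# readline must return one line of bytes per call; BytesIO.readline qualifies.
tokens = list(tokenize(BytesIO(source).readline))

# The first token reports the detected source encoding ("utf-8" when the
# source carries no BOM or coding cookie).
for tok in tokens:
    print(tok.type, tok.string)
```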