git.madduck.net Git - etc/vim.git/blob - blib2to3/pgen2/tokenize.pyi

madduck's git repository

All projects in this repository are available at the canonical URL git://git.madduck.net/madduck/pub/<projectpath>; see each project's metadata for the exact URL.

All patches and comments are welcome. Please squash your changes into logical commits before using git-format-patch and git-send-email to send them to patches@git.madduck.net. I would be especially grateful if you also read the Git project's submission guidelines and adhere to them.

SSH access, as well as push access, can be arranged individually.

If you use my repositories frequently, consider adding the following snippet to ~/.gitconfig and using the third clone URL listed for each project, so that URLs of the form madduck:pub/<projectpath> expand to the canonical git:// URL above:

[url "git://git.madduck.net/madduck/"]
  insteadOf = madduck:

Remove unnecessary if-statement in maybe_make_parens_invisible_in_atom (#964)
[etc/vim.git] / blib2to3 / pgen2 / tokenize.pyi
# Stubs for lib2to3.pgen2.tokenize (Python 3.6)
# NOTE: Only elements from __all__ are present.

from typing import Callable, Iterable, Iterator, List, Optional, Text, Tuple
from blib2to3.pgen2.token import *  # noqa
from blib2to3.pygram import Grammar


_Coord = Tuple[int, int]
_TokenEater = Callable[[int, Text, _Coord, _Coord, Text], None]
_TokenInfo = Tuple[int, Text, _Coord, _Coord, Text]


class TokenError(Exception): ...
class StopTokenizing(Exception): ...

def tokenize(readline: Callable[[], Text], tokeneater: _TokenEater = ...) -> None: ...

class Untokenizer:
    tokens: List[Text]
    prev_row: int
    prev_col: int
    def __init__(self) -> None: ...
    def add_whitespace(self, start: _Coord) -> None: ...
    def untokenize(self, iterable: Iterable[_TokenInfo]) -> Text: ...
    def compat(self, token: Tuple[int, Text], iterable: Iterable[_TokenInfo]) -> None: ...

def untokenize(iterable: Iterable[_TokenInfo]) -> Text: ...
def generate_tokens(
    readline: Callable[[], Text],
    grammar: Optional[Grammar] = ...
) -> Iterator[_TokenInfo]: ...
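
For reference, here is a minimal usage sketch of the generate_tokens() and untokenize() signatures declared in the stub above. It assumes the blib2to3 package that ships with Black is importable and that it behaves like the standard lib2to3 tokenizer it is forked from; the sketch is not part of the stub itself.

import io

from blib2to3.pgen2.tokenize import generate_tokens, untokenize

source = "x = 1\n"

# generate_tokens() pulls source lines lazily from the readline callable and
# yields 5-tuples matching _TokenInfo: (type, string, start (row, col), end (row, col), line).
tokens = list(generate_tokens(io.StringIO(source).readline))
for tok_type, tok_str, start, end, line in tokens:
    print(tok_type, repr(tok_str), start, end)

# untokenize() reassembles the token stream back into source text.
print(untokenize(tokens))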