Added API reference #96

Open · wants to merge 1 commit into base: master
117 changes: 117 additions & 0 deletions docs/api-ref.rst
@@ -0,0 +1,117 @@
SLY (Sly Lex Yacc) API reference
================================

``sly.yacc``
------------

.. automodule:: sly.yacc
:members:
:show-inheritance:
:inherited-members:

Note: all of the above members are also accessible from the root ``sly`` package.


.. autoexception:: sly.yacc.YaccError
:show-inheritance:

.. autoclass:: sly.yacc.SlyLogger
:members:

.. method:: warning(msg, *args, **kwargs)
.. method:: info(msg, *args, **kwargs)
.. method:: debug(msg, *args, **kwargs)
.. method:: error(msg, *args, **kwargs)
.. method:: critical(msg, *args, **kwargs)

Log a message at the corresponding level.

:param msg: the message to log
:param args: formatting arguments
:param kwargs: by default, these are unused

Note: old-style (``%``) string formatting is used
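To make the documented interface concrete, here is a small stand-in class that behaves the way the methods above are described (write a ``%``-formatted message to a file-like object, ignore ``kwargs``). It is an illustration consistent with the docs, not necessarily SLY's exact implementation, and ``DemoLogger`` is a hypothetical name:

```python
import sys

class DemoLogger:
    """Stand-in for the documented sly.yacc.SlyLogger interface."""
    def __init__(self, f=sys.stderr):
        self.f = f

    def debug(self, msg, *args, **kwargs):
        # Old-style (%) formatting, as noted above; kwargs are unused
        self.f.write((msg % args) + '\n')

    info = debug

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    critical = debug
```

For example, ``DemoLogger(io.StringIO())`` lets you capture the output in tests.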

.. autoclass:: sly.yacc.YaccSymbol
:members:
:show-inheritance:
:inherited-members:

.. autoclass:: sly.yacc.YaccProduction
:members:
:show-inheritance:
:inherited-members:

.. autoclass:: sly.yacc.Production
:members:
:show-inheritance:
:inherited-members:

.. autoclass:: sly.yacc.LRItem
:members:
:show-inheritance:
:inherited-members:

.. autofunction:: sly.yacc.rightmost_terminal

.. autoexception:: sly.yacc.GrammarError
:show-inheritance:

.. autoclass:: sly.yacc.Grammar
:members:
:show-inheritance:
:inherited-members:

.. autofunction:: sly.yacc.digraph

.. autoexception:: sly.yacc.LALRError
:show-inheritance:

.. autoclass:: sly.yacc.LRTable
:members: write
:show-inheritance:
:inherited-members:

``sly.lex``
-----------

.. automodule:: sly.lex
:members:
:show-inheritance:
:inherited-members:

Note: all of the above members are also accessible from the root ``sly`` package.

.. autoexception:: sly.lex.LexError
:show-inheritance:

.. autoexception:: sly.lex.PatternError
:show-inheritance:

.. autoexception:: sly.lex.LexerBuildError

.. autoclass:: sly.lex.Token
:members:
:show-inheritance:
:inherited-members:

.. autoclass:: sly.lex.TokenStr
:members:
:show-inheritance:
:inherited-members:


``sly.ast``
-----------

.. autoclass:: sly.ast.AST
:members:
:show-inheritance:
:inherited-members:

``sly.docparse``
----------------

.. automodule:: sly.docparse
:members:
:show-inheritance:
4 changes: 2 additions & 2 deletions docs/conf.py
@@ -19,7 +19,7 @@
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
sys.path.insert(0, os.path.abspath('..'))

# -- General configuration ------------------------------------------------

@@ -29,7 +29,7 @@
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = []
extensions = ['sphinx.ext.autodoc']

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
11 changes: 8 additions & 3 deletions docs/sly.rst
@@ -1,3 +1,8 @@
.. toctree::
:maxdepth: 1

api-ref

SLY (Sly Lex Yacc)
==================

@@ -892,7 +897,7 @@ it is to enclose one or more symbols in [ ] like this::
In this case, the value of ``p.item`` is set to ``None`` if the value wasn't supplied.
Otherwise, it will have the value returned by the ``item`` rule below.

You can also encode repetitions. For example, a common construction is a
You can also encode repetitions. For example, a common construction is a
list of comma-separated expressions. To parse that, you could write::

@_('expr { COMMA expr }')
@@ -901,8 +906,8 @@ list of comma separated expressions. To parse that, you could write::

In this example, the ``{ COMMA expr }`` represents zero or more repetitions
of a rule. The value of all symbols inside is now a list. So, ``p.expr1``
is a list of all expressions matched. Note, when duplicate symbol names
appear in a rule, they are distinguished by appending a numeric index as shown.
is a list of all expressions matched. Note, when duplicate symbol names
appear in a rule, they are distinguished by appending a numeric index as shown.
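Independent of SLY, the ``{ COMMA expr }`` repetition corresponds to a plain loop in a hand-written parser. The sketch below walks a list of ``(type, value)`` token pairs and collects the expression values into a list, mirroring how ``p.expr1`` becomes a list; the function name and token shapes are illustrative, not part of SLY:

```python
def parse_expr_list(tokens):
    """Hand-written equivalent of the rule 'expr { COMMA expr }'.

    tokens: list of (type, value) pairs; the grammar requires at least
    one leading expr, so the list must be non-empty.
    """
    pos = 0
    exprs = [tokens[pos][1]]   # the mandatory leading expr
    pos += 1
    # '{ COMMA expr }': zero or more repetitions of COMMA followed by expr
    while pos < len(tokens) and tokens[pos][0] == 'COMMA':
        exprs.append(tokens[pos + 1][1])
        pos += 2
    return exprs
```

For example, tokens for ``1, 2, 3`` yield the list ``[1, 2, 3]``, just as the repetition form does in a SLY rule.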

Dealing With Ambiguous Grammars
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
10 changes: 5 additions & 5 deletions sly/docparse.py
@@ -8,8 +8,8 @@ class DocParseMeta(type):
'''
Metaclass that processes the class docstring through a parser and
incorporates the result into the resulting class definition. This
allows Python classes to be defined with alternative syntax.
To use this class, you first need to define a lexer and parser:
allows Python classes to be defined with alternative syntax.
To use this class, you first need to define a lexer and parser::

from sly import Lexer, Parser
class MyLexer(Lexer):
@@ -20,14 +20,14 @@ class MyParser(Parser):

You then need to define a metaclass that inherits from DocParseMeta.
This class must specify the associated lexer and parser classes.
For example:
For example::

class MyDocParseMeta(DocParseMeta):
lexer = MyLexer
parser = MyParser

This metaclass is then used as a base for processing user-defined
classes:
classes::

class Base(metaclass=MyDocParseMeta):
pass
@@ -38,7 +38,7 @@ class Spam(Base):
...
"""

It is expected that the MyParser() class would return a dictionary.
It is expected that the MyParser() class would return a dictionary.
This dictionary is used to create the final class Spam in this example.
'''

60 changes: 49 additions & 11 deletions sly/lex.py
@@ -35,13 +35,18 @@

import re
import copy
# type hints
from typing import Iterator

class LexError(Exception):
'''
Exception raised if an invalid character is encountered and no default
error handler function is defined. The .text attribute of the exception
contains all remaining untokenized text. The .error_index is the index
location of the error.
error handler function is defined.

.. attribute:: text
all remaining untokenized text
.. attribute:: error_index
the index location of the error.
'''
def __init__(self, message, text, error_index):
self.args = (message,)
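As a self-contained illustration of the documented attributes, here is a re-creation of the exception along with a hypothetical helper that raises it (the real class lives in ``sly/lex.py``; ``fail_at`` is invented for the example):

```python
class LexError(Exception):
    """Per the docstring above: .text holds the remaining untokenized
    input and .error_index the offset of the offending character."""
    def __init__(self, message, text, error_index):
        self.args = (message,)
        self.text = text
        self.error_index = error_index

def fail_at(text, index):
    # Raise with everything from the bad character onward as .text
    raise LexError(f'Illegal character {text[index]!r}', text[index:], index)
```

An error handler can then report ``e.error_index`` and resynchronize by scanning forward in ``e.text``.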
@@ -72,12 +77,23 @@ def __init__(self, newstate, tok=None):
class Token(object):
'''
Representation of a single token.

.. attribute:: type
.. attribute:: value
.. attribute:: lineno
.. attribute:: index
'''
__slots__ = ('type', 'value', 'lineno', 'index')
def __repr__(self):
return f'Token(type={self.type!r}, value={self.value!r}, lineno={self.lineno}, index={self.index})'

class TokenStr(str):
"""
Adds the following meta-syntaxes:

* ``TOKEN['value'] = NEWTOKEN``
* ``del TOKEN['value']``
"""
@staticmethod
def __new__(cls, value, key=None, remap=None):
self = super().__new__(cls, value)
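A simplified, stdlib-only sketch of the meta-syntax described in the new docstring: a pattern string that records per-value token remappings in a shared dict. The real ``TokenStr`` also coordinates with the lexer's remapping machinery, so treat this as an assumption-laden illustration; ``RemappableStr`` is an invented name:

```python
class RemappableStr(str):
    """Pattern string supporting TOKEN['value'] = NEWTOKEN and
    del TOKEN['value'], recorded in a shared remap dict."""
    def __new__(cls, value, key=None, remap=None):
        self = super().__new__(cls, value)
        self.key = key
        self.remap = remap if remap is not None else {}
        return self

    def __setitem__(self, value, newtok):
        # Remap this token's match of `value` to token type `newtok`
        self.remap[self.key, value] = newtok

    def __delitem__(self, value):
        # Drop the remapping again
        del self.remap[self.key, value]
```

For example, ``NAME['if'] = 'IF'`` is how a lexer can turn a keyword matched by an identifier pattern into its own token type.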
@@ -112,11 +128,11 @@ def __init__(self):
def __setitem__(self, key, value):
if isinstance(value, str):
value = TokenStr(value, key, self.remap)

if isinstance(value, _Before):
self.before[key] = value.tok
value = TokenStr(value.pattern, key, self.remap)

if key in self and not isinstance(value, property):
prior = self[key]
if isinstance(prior, str):
@@ -181,12 +197,34 @@ def __new__(meta, clsname, bases, attributes):
return cls

class Lexer(metaclass=LexerMeta):
"""
These attributes may be defined in subclasses

.. attribute:: tokens
:type: set[str]

.. attribute:: literals
:type: set[str]

.. attribute:: ignore
:type: str

.. attribute:: reflags
:type: int

.. attribute:: regex_module
:type: module

The regex module to use. Defaults to the standard library's
``re`` module.

"""
# These attributes may be defined in subclasses
tokens = set()
literals = set()
ignore = ''
reflags = 0
regex_module = re
regex_module = re #: :meta hide-value:

_token_names = set()
_token_funcs = {}
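To make the role of the attributes documented above concrete, here is a rough, SLY-independent sketch of the underlying technique: compile the token patterns into one master regex with a named group per token, skip characters in ``ignore``, and scan. The token names and helper are assumptions for the example, not SLY's actual internals:

```python
import re

# Analogous to a Lexer subclass's tokens/ignore attributes
TOKEN_SPECS = [
    ('NUMBER', r'\d+'),
    ('ID', r'[A-Za-z_][A-Za-z0-9_]*'),
    ('PLUS', r'\+'),
]
IGNORE = ' \t'

# One master regex; m.lastgroup tells us which token matched
MASTER_RE = re.compile('|'.join(f'(?P<{name}>{pat})' for name, pat in TOKEN_SPECS))

def tokenize(text, index=0):
    while index < len(text):
        if text[index] in IGNORE:      # the 'ignore' attribute: skipped outright
            index += 1
            continue
        m = MASTER_RE.match(text, index)
        if not m:
            raise ValueError(f'bad character {text[index]!r} at {index}')
        yield m.lastgroup, m.group()
        index = m.end()
```

This is also why ``tokenize`` is naturally a generator, matching the ``Iterator[Token]`` hint added in this PR.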
@@ -214,23 +252,23 @@ def _collect_rules(cls):
# Such functions can be created with the @_ decorator or by defining
# function with the same name as a previously defined string.
#
# This function is responsible for keeping rules in order.
# This function is responsible for keeping rules in order.

# Collect all previous rules from base classes
rules = []

for base in cls.__bases__:
if isinstance(base, LexerMeta):
rules.extend(base._rules)

# Dictionary of previous rules
existing = dict(rules)

for key, value in cls._attributes.items():
if (key in cls._token_names) or key.startswith('ignore_') or hasattr(value, 'pattern'):
if callable(value) and not hasattr(value, 'pattern'):
raise LexerBuildError(f"function {value} doesn't have a regex pattern")

if key in existing:
# The definition matches something that already existed in the base class.
# We replace it, but keep the original ordering
@@ -282,7 +320,7 @@ def _build(cls):
remapped_toks = set()
for d in cls._remapping.values():
remapped_toks.update(d.values())

undefined = remapped_toks - set(cls._token_names)
if undefined:
missing = ', '.join(undefined)
@@ -357,7 +395,7 @@ def pop_state(self):
'''
self.begin(self.__state_stack.pop())

def tokenize(self, text, lineno=1, index=0):
def tokenize(self, text, lineno=1, index=0) -> "Iterator[Token]":
_ignored_tokens = _master_re = _ignore = _token_funcs = _literals = _remapping = None

# --- Support for state changes