diff --git a/.gitignore b/.gitignore index c9b568f7..781ee71e 100644 --- a/.gitignore +++ b/.gitignore @@ -1,2 +1,6 @@ *.pyc *.swp +*.egg-info +build/* +dist/* +tests/pyvirtualenv diff --git a/CHANGELOG.md b/CHANGELOG.md index 415763d5..75e69eaa 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,15 +1,28 @@ ## Changelog -25-05-2015 - rel.00: - +25-05-2015 - tag: rel.00: + * Initial release of PyangBind outside of BT. -09-06-2015 - rel.01-alpha.01: +09-06-2015 - tag: rel.01-alpha.01: * Merge of xpath-helper-04 into master. * Support for leafref with XPath lookups. -04-09-2015 - rel.02: +04-09-2015 - tag: rel.02: * Merge of serialiser-11 into master. - * Support for serialising to JSON, extensions, refactored xpathhelper, and a number of new types. \ No newline at end of file + * Support for serialising to JSON, extensions, refactored xpathhelper, and a number of new types. + +31-12-2015 - 0.1.0 + + * Adopt semantic versioning. + * First release packaged for PyPI. + +11-01-2016 - 0.1.3 + + * Final test release to PyPI's test repo. + * To be released as 0.2.0. + +11-01-2016 - 0.2.0 + * Released to PyPI. \ No newline at end of file diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 00000000..33ccaba5 --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,12 @@ +## Contributing to PyangBind + +Contributions to PyangBind are very welcome, either directly via pull requests, or as feature suggestions or bug reports. + +A couple of requests: + + * Code style is currently intended to be PEP-8 compliant; however, rules E111, E114, E127 and E128 are ignored. The standard indentation in PyangBind code is 2 spaces (**not** 4), and continued lines are made to be subjectively aesthetically pleasing/readable. + * Please run `tests/run.sh` and check that all the tests pass if you're changing code. New tests are much appreciated. If you'd like to use a different test framework, that's fine -- just please ensure that `TESTNAME/run.py` runs the tests.
+ * If you have an issue with generated code/odd errors during build -- please do just e-mail this over or open an issue. If you can't share the YANG itself, then anonymised YANG is very welcome. + * If you'd like to discuss the best design for a feature, or don't get how a feature fits in, please open an issue, or send e-mail. + +And most of all, thanks for contributions :-) \ No newline at end of file diff --git a/MANIFEST.in b/MANIFEST.in new file mode 100644 index 00000000..ff6c680a --- /dev/null +++ b/MANIFEST.in @@ -0,0 +1,3 @@ +include pyangbind/plugin/*.py +include *.md +include LICENSE diff --git a/README.md b/README.md index 11cabd7d..06a93403 100644 --- a/README.md +++ b/README.md @@ -11,10 +11,16 @@ Development of **PyangBind** has been motivated by consumption of the [**OpenCon **PyangBind** can be called by using the Pyang ```--plugindir``` argument: ``` -pyang --plugindir /path/to/repo -f pybind +pyang --plugindir <pyangbind-plugin-directory> -f pybind ``` -Output can be written in two different ways: +By default, the pyangbind plugin will be installed in the `site-packages/pyangbind/plugin` directory; this may be within the Python system or user site-packages. It is possible to determine the plugin's path using: + +``` +/usr/bin/env python -c 'import pyangbind; import os; print "%s/plugin" % os.path.dirname(pyangbind.__file__)' +``` + +Output is written based on the following options: * if no options given to pyang, it will write to ```stdout``` as per pyang's default behaviour. * if ```-o``` is given to pyang, all output will be written to the file specified as the target fd. @@ -22,9 +28,10 @@ Other options can be used on the command-line: -* ```--use-xpath-helper``` - this currently uses the ```YANGPathHelper``` class in ```lib/xpathhelper.py``` to allow registration of objects into an XML tree, and hence allow XPATH references to be resolved ([see the relevant documentation](#leafref-helper)).
In the future, arguments to this function will allow the user to load in their own helper function. See the documentation for ```PyangBindXpathHelper``` in ```lib/xpathhelper.py``` for the functions that this class should provide. -* ```--pybind-class-dir=DIR``` - by default PyangBind will assume that in the module directory of any bindings file, there exists a ```lib/``` directory. This provides PyangBind access the relevant YANG type classes. Using this argument specifies an alternate directory which is then appended to the Python path variable, such that the user does not need to create ```lib``` symlinks. If you use ```--split-class-dir``` you *will* want to use this option! -* ```--interesting-extension=EXTENSION-MODULE``` - PyangBind by default will do nothing with extension statements. If a module name (e.g., ```foo-extensions```) is specified on the command line, PyangBind will add entries to an extension dictionary where they are from an interesting module. Multiple modules may be specified. Such entries can then be accessed via ```yang_object.extensions()```. +* ```--use-xpathhelper``` - Use the `path_helper` argument to each PyangBind class. The `path_helper` is used to register objects into an XML tree, and hence allow XPATH references to be resolved ([see the relevant documentation](#leafref-helper)). Whilst a `YANGPathHelper` class (see `pyangbind/lib/xpathhelper.py`) is currently expected, an alternative implementation may be supplied. See the documentation for ```PyangBindXpathHelper``` in ```pyangbind/lib/xpathhelper.py``` for the methods that an alternative class should provide. +* ```--interesting-extension=EXTENSION-MODULE``` - PyangBind by default will do nothing with extension statements. If a module name (e.g., ```foo-extensions```) is specified on the command line, PyangBind will add entries to an extension dictionary where the extensions are defined in an interesting module. Multiple modules may be specified.
Such entries can then be accessed via ```yang_object._extensions()```. +* ```--build-rpcs``` - this switch causes pyangbind to read the `rpc` statements that are defined in each YANG module and compile these into a set of classes that are stored in a Python module called ```_rpc```. Each RPC is created with the relevant `input` and `output` statements. Where instances of these classes are created, they do not register against the supplied XPath helper module (since RPC instances are not part of the 'standard' data tree). A supplied `path_helper` argument will still be used by the modules to resolve leaf references where required. +* ```--use-extmethods``` - this switch causes a dictionary of the form `{ "path": class }` to be referenced by each PyangBind object. It is then possible to call a set of extended method names (specified [here](https://github.com/robshakir/pyangbind/blob/master/lib/yangtypes.py#L917-L930)) against each object. When these methods are called, the object will look its path up in the supplied dictionary, and if an entry exists, call the corresponding method of the supplied class. This can be used to implement actions against certain classes without needing to change the generated code. Currently, the method names are static - a possible future extension is to make the set of extmethod names dynamic. Once bindings have been generated, each YANG module is included as a top-level class within the output bindings file. Using the following module as an example: @@ -69,7 +76,7 @@ bindings = pyangbind_example() bindings.parent.integer = 12 ``` -Note, that as of release.01, pyangbind expects a ```lib``` directory locally to the ```bindings.py``` file referenced above. This contains the ```xpathhelper``` and ```yangtypes``` modules - such that this code does not need to be duplicated across each ```bindings.py``` file. In the future, it is intended that these functions be provided as an installable module, but this is is still *TODO*.
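The `--use-extmethods` lookup can be sketched in isolation. The class and method names below are illustrative only, not PyangBind's actual generated code; the sketch simply shows the pattern of looking an object's path up in a `{path: class}` dictionary and delegating the call:

```python
# Illustrative sketch of the extmethods dispatch pattern: look the object's
# path up in a user-supplied {path: handler-instance} dictionary and, where
# an entry exists, call the corresponding method on the handler.
class ExampleHandler(object):
  def commit(self, obj, **kwargs):
    return "committed %s" % obj._path


class ExampleNode(object):
  def __init__(self, path, extmethods=None):
    self._path = path
    self._extmethods = extmethods or {}

  def _call_extmethod(self, name, **kwargs):
    handler = self._extmethods.get(self._path)
    if handler is None or not hasattr(handler, name):
      raise AttributeError("no extmethod %s for %s" % (name, self._path))
    return getattr(handler, name)(self, **kwargs)


node = ExampleNode("/parent", extmethods={"/parent": ExampleHandler()})
print(node._call_extmethod("commit"))  # -> committed /parent
```

Because the handler receives the object itself, an action can be implemented per-path without any change to the generated classes.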
See the commentary relating to ```--pybind-class-dir``` if this sounds irritating :-). +As of version 0.2.0, pyangbind is now packaged such that the helper modules can be installed through ```pip```. This means that all modules expect to be able to import ```pyangbind.lib.```. This version therefore **removes** those switches that were used to hack around the fact that pyangbind was not installable. Each data leaf can be referred to via the path described in the YANG module. Each leaf is represented by the base class that is described in the [type support](#type-support) section of this document. @@ -92,7 +99,7 @@ Each native type is wrapped in a YANGDynClass dynamic type - which provides help * ```default()``` - returns the default value of the YANG leaf. * ```changed()``` - returns ```True``` if the value has been changed from the default, ```False``` otherwise. * ```yang_name()``` - returns the YANG data element's name (since not all data element names are safe to be used in Python). -* ```extensions()``` - returns the dictionary of interesting extensions. +* ```_extensions()``` - returns the dictionary of interesting extensions. ### YANG Container Methods @@ -103,7 +110,7 @@ In addition, a YANG container provides a set of methods to determine properties * ```elements()``` - which provides a list of the elements of the YANG-described tree which branch from this container. * ```get(filter=False)``` - providing a means to return a Python dictionary hierarchy showing the tree branching from the container node. Where the ```filter``` argument is set to ```True``` only elements that have changed from their default values due to manipulation of the model are returned - when filter is not specified the default values of all leaves are returned. -As of a recent release, iteration through container objects will return a key,value list which can be used to walk through leaf or container objects within a particular container.
+Iteration through container objects will return a (key, value) list which can be used to walk through leaf or container objects within a particular container. ### YANG List Methods @@ -146,10 +153,10 @@ AttributeError: can't set attribute The ```YANGPathHelper``` class in the xpathhelper module provides a lightweight means to establish a tree structure against which pyangbind modules register themselves. In order to enable this behaviour use the ```--use-xpathhelper``` flag during code generation. -When ```--with-xpathhelper``` modules will automatically register themselves against the tree structure that is provided using the ```path_helper``` keyword-argument. ```path_helper``` is expected to be set to an instance of ```lib.xpathhelper.YANGPathHelper``` - which is created independently to the root YANG module, as shown below: +When ```--use-xpathhelper``` is used, modules will automatically register themselves against the tree structure that is provided using the ```path_helper``` keyword-argument.
```path_helper``` is expected to be set to an instance of ```pyangbind.lib.xpathhelper.YANGPathHelper``` - which is created independently of the root YANG module, as shown below: ```python -from lib.xpathhelper import YANGPathHelper +from pyangbind.lib.xpathhelper import YANGPathHelper from oc_bgp import bgp from oc_routing_policy import routing_policy as rpol @@ -247,7 +254,7 @@ As of September 2015 (no current release-tag), PyangBind also provides means to In order to use the JSON serialisation, the custom encoder needs to be imported: ```python -from lib.serialise import pybindJSONEncoder +from pyangbind.lib.serialise import pybindJSONEncoder print json.dumps(pyangbind_obj.get(filter=True), \ cls=pybindJSONEncoder, indent=4) ``` diff --git a/README.rst b/README.rst new file mode 100644 index 00000000..7f498eff --- /dev/null +++ b/README.rst @@ -0,0 +1,38 @@ +PyangBind +========= + +PyangBind is a plugin for pyang which converts YANG data models into a Python class hierarchy, such that Python can be used to manipulate data that conforms with a YANG model. + +This module provides the supporting classes and functions that PyangBind modules utilise, particularly: + +* pyangbind.base.PybindBase - which is the parent class inherited by all container or module YANG objects. + +* pyangbind.pybindJSON - which contains wrapper functions that can be used to help with serialisation of YANG to JSON. + +* pyangbind.serialise.pybindJSONEncoder - a class that can be used as a custom encoder for the JSON module to serialise PyangBind class hierarchies to JSON. + +* pyangbind.serialise.pybindJSONDecoder - a class that can be used as a custom decoder to load JSON-encoded instances of YANG models into a PyangBind class hierarchy. + +* pyangbind.xpathhelper.YANGPathHelper - a class which can have objects registered against it, and subsequently retrieved from it using XPATH expressions.
This module also includes parent classes that can be used to implement other helper modules of this nature. + +* pyangbind.yangtypes: The various functions which generate Python types that are used to represent YANG types, and some helper methods. + + - pyangbind.yangtypes.is_yang_list and is_yang_leaflist are self-explanatory, but may be useful. + + - pyangbind.yangtypes.safe_name is used throughout PyangBind to determine how to map YANG element names into Python attribute names safely. + + - pyangbind.yangtypes.RestrictedPrecisionDecimalType - generates wrapped Decimal types that have a restricted set of decimal digits - i.e., can deal with fraction-digits arguments in YANG. + + - pyangbind.yangtypes.RestrictedClassType - generates types which wrap a 'base' type (e.g., integer) with particular restrictions. The restrictions are supplied as a dictionary, or with specific arguments if single restrictions are required. Currently, the restrictions supported are regexp matches, ranges, lengths, and restrictions to a set of values (provided as keys to a dict). + + - pyangbind.yangtypes.TypedListType - generates types which wrap a list to restrict the objects that it may contain. + + - pyangbind.yangtypes.YANGListType - generates types which wrap a class representing a container, such that it acts as a YANG list. + + - pyangbind.yangtypes.YANGBool - a boolean class. + + - pyangbind.yangtypes.YANGDynClass - generates types which consist of a wrapper (YANGDynClass) and a wrapped object which may be any other class. YANGDynClass is a meta-class that provides additional data on top of the attributes and functions of the wrapped class. + + - pyangbind.yangtypes.ReferenceType - generates types which can use a pyangbind.xpathhelper.PybindXpathHelper instance to look up values - particularly to support leafrefs in YANG.
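As a rough standalone illustration of what a generator such as RestrictedClassType produces (a simplified sketch under stated assumptions, not PyangBind's actual implementation), a range-restricted integer can be built by wrapping a base type and validating in `__new__`:

```python
# Simplified sketch of a restricted-type generator: return a subclass of the
# base type whose constructor rejects out-of-range values, analogous to the
# handling of YANG 'range' restrictions. restricted_int is an illustrative
# name, not a PyangBind function.
def restricted_int(low, high):
  class RestrictedInt(int):
    def __new__(cls, value=0):
      value = int(value)
      if not low <= value <= high:
        raise ValueError("%d is out of range [%d, %d]" % (value, low, high))
      return super(RestrictedInt, cls).__new__(cls, value)
  return RestrictedInt


Uint8 = restricted_int(0, 255)
print(Uint8(200) + 1)  # -> 201; behaves as a normal int once constructed
```

Validation happens only at construction time; once built, the instance participates in arithmetic like its base type, which is why such wrapped types can be used transparently by generated bindings.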
Usage documentation for PyangBind itself can be found on GitHub: https://github.com/robshakir/pyangbind \ No newline at end of file diff --git a/lib b/lib new file mode 120000 index 00000000..e2293b3a --- /dev/null +++ b/lib @@ -0,0 +1 @@ +pyangbind/lib \ No newline at end of file diff --git a/lib/__init__.py b/pyangbind/__init__.py similarity index 100% rename from lib/__init__.py rename to pyangbind/__init__.py diff --git a/pyangbind/lib/__init__.py b/pyangbind/lib/__init__.py new file mode 100644 index 00000000..485f44ac --- /dev/null +++ b/pyangbind/lib/__init__.py @@ -0,0 +1 @@ +__version__ = "0.1.1" diff --git a/lib/base.py b/pyangbind/lib/base.py similarity index 100% rename from lib/base.py rename to pyangbind/lib/base.py diff --git a/lib/pybindJSON.py b/pyangbind/lib/pybindJSON.py similarity index 100% rename from lib/pybindJSON.py rename to pyangbind/lib/pybindJSON.py diff --git a/lib/serialise.py b/pyangbind/lib/serialise.py similarity index 95% rename from lib/serialise.py rename to pyangbind/lib/serialise.py index 88365f8d..50730a7d 100644 --- a/lib/serialise.py +++ b/pyangbind/lib/serialise.py @@ -26,7 +26,7 @@ import numpy from collections import OrderedDict from decimal import Decimal -from yangtypes import safe_name +from pyangbind.lib.yangtypes import safe_name from types import ModuleType import copy @@ -69,20 +69,20 @@ def encode(self, obj): def default(self, obj): def jsonmap(obj, map_val): - if map_val in ["lib.yangtypes.RestrictedClass"]: + if map_val in ["pyangbind.lib.yangtypes.RestrictedClass"]: map_val = getattr(obj, "_restricted_class_base")[0] if map_val in ["numpy.uint8", "numpy.uint16", "numpy.uint32", "numpy.uint64", "numpy.int8", "numpy.int16", "numpy.int32", "numpy.int64"]: return int(obj) - elif map_val in ["lib.yangtypes.ReferencePathType"]: + elif map_val in ["pyangbind.lib.yangtypes.ReferencePathType"]: return self.default(obj._get()) - elif map_val in ["lib.yangtypes.RestrictedPrecisionDecimal"]: + elif map_val in
["pyangbind.lib.yangtypes.RestrictedPrecisionDecimal"]: return float(obj) elif map_val in ["bitarray.bitarray"]: return obj.to01() - elif map_val in ["lib.yangtypes.YANGBool"]: + elif map_val in ["pyangbind.lib.yangtypes.YANGBool"]: if obj: return True else: diff --git a/lib/xpathhelper.py b/pyangbind/lib/xpathhelper.py similarity index 100% rename from lib/xpathhelper.py rename to pyangbind/lib/xpathhelper.py diff --git a/lib/yangtypes.py b/pyangbind/lib/yangtypes.py similarity index 100% rename from lib/yangtypes.py rename to pyangbind/lib/yangtypes.py diff --git a/pyangbind/plugin/pybind.py b/pyangbind/plugin/pybind.py new file mode 100644 index 00000000..c136ee9e --- /dev/null +++ b/pyangbind/plugin/pybind.py @@ -0,0 +1,1522 @@ +""" +Copyright 2015, Rob Shakir (rjs@jive.com, rjs@rob.sh) + +This project has been supported by: + * Jive Communications, Inc. + * BT plc. + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. +You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. +""" + +import optparse +import sys +import re +import string +import numpy as np +import decimal +import copy +import os +from bitarray import bitarray +from pyangbind.lib.yangtypes import safe_name, YANGBool + +from pyang import plugin +from pyang import statements + +DEBUG = True +if DEBUG: + import pprint + pp = pprint.PrettyPrinter(indent=2) + +# YANG is quite flexible in terms of what it allows as input to a boolean +# value, this map is used to provide a mapping of these values to the python +# True and False boolean instances. 
+class_bool_map = { + 'false': False, + 'False': False, + 'true': True, + 'True': True, +} + +class_map = { + # this map is dynamically built upon but defines how we take + # a YANG type and translate it into a native Python class + # along with other attributes that are required for this mapping. + # + # key: the name of the YANG type + # native_type: the Python class that is used to support this + # YANG type natively. + # map (optional): a map to take input values and translate them + # into valid values of the type. + # base_type: whether the class can be used as + # class(*args, **kwargs) in Python, or whether it is a + # derived class (such as is created based on a typedef, + # or for types that cannot be supported natively, such + # as enumeration, or a string + # with a restriction placed on it) + # quote_arg (opt): whether the argument to this class' __init__ needs to + # be quoted (e.g., str("hello")) in the code that is + # output. + # pytype (opt): A reference to the actual type that is used, this is + # used where we infer types, such as for an input + # value to a union since we need to actually compare + # the value against the __init__ method and see whether + # it works. + # parent_type (opt): for "derived" types, then we store what the enclosed + # type is such that we can create instances where + # required e.g., a restricted string will have a + # parent_type of a string. this can be a list if the + # type is a union. + # restriction ...: where the type is a restricted type, then the + # (optional) class_map dict entry can store more information about + # the type of restriction. this is generally used when + # we need to re-initialise an instance of the class, + # such as in the setter methods of containers. + # Other types may add their own types to this dictionary that have + # meaning only for themselves. For example, a ReferenceType can add the + # path it references, and whether the require-instance keyword was set + # or not. 
+ # + 'boolean': { + "native_type": "YANGBool", + "map": class_bool_map, + "base_type": True, + "quote_arg": True, + "pytype": YANGBool + }, + 'binary': { + "native_type": "bitarray", + "base_type": True, + "quote_arg": True, + "pytype": bitarray + }, + 'uint8': { + "native_type": "np.uint8", + "base_type": True, + "pytype": np.uint8 + }, + 'uint16': { + "native_type": "np.uint16", + "base_type": True, + "pytype": np.uint16 + }, + 'uint32': { + "native_type": "np.uint32", + "base_type": True, + "pytype": np.uint32 + }, + 'uint64': { + "native_type": "np.uint64", + "base_type": True, + "pytype": np.uint64 + }, + 'string': { + "native_type": "unicode", + "base_type": True, + "quote_arg": True, + "pytype": unicode + }, + 'decimal64': { + "native_type": "Decimal", + "base_type": True, + "pytype": decimal.Decimal + }, + 'empty': { + "native_type": "YANGBool", + "map": class_bool_map, + "base_type": True, + "quote_arg": True, + "pytype": YANGBool + }, + 'int8': { + "native_type": "np.int8", + "base_type": True, + "pytype": np.int8 + }, + 'int16': { + "native_type": "np.int16", + "base_type": True, + "pytype": np.int16 + }, + 'int32': { + "native_type": "np.int32", + "base_type": True, + "pytype": np.int32 + }, + 'int64': { + "native_type": "np.int64", + "base_type": True, + "pytype": np.int64 + }, +} + +# We have a set of types which support "range" statements in RFC6020. This +# list determines types that should be allowed to have a "range" argument. +INT_RANGE_TYPES = ["uint8", "uint16", "uint32", "uint64", + "int8", "int16", "int32", "int64"] + + +# Base machinery to support operation as a plugin to pyang. +def pyang_plugin_init(): + plugin.register_plugin(PyangBindClass()) + + +class PyangBindClass(plugin.PyangPlugin): + def add_output_format(self, fmts): + # Add the 'pybind' output format to pyang. + self.multiple_modules = True + fmts['pybind'] = self + + def emit(self, ctx, modules, fd): + # When called, call the build_pybind function.
+ build_pybind(ctx, modules, fd) + + def add_opts(self, optparser): + # Add pyangbind specific operations to pyang. These are documented in the + # options, but are essentially divided into three sets. + # * xpathhelper - How pyangbind should deal with xpath expressions. + # This module is documented in lib/xpathhelper and describes how + # to support registration, updates, and retrieval of xpaths. + # * class output - whether a single file should be created, or whether + # a hierarchy of python modules should be created. The latter is + # preferable when one has large trees being compiled. + # * extensions - support for YANG extensions that pyangbind should look + # for, and add as a dictionary with each element. + optlist = [ + optparse.make_option("--use-xpathhelper", + dest="use_xpathhelper", + action="store_true", + help="""Use the xpathhelper module to + resolve leafrefs"""), + optparse.make_option("--split-class-dir", + metavar="DIR", + dest="split_class_dir", + help="""Split the code output into + multiple directories"""), + optparse.make_option("--interesting-extension", + metavar="EXTENSION-MODULE", + default=[], + action="append", + type=str, + dest="pybind_interested_exts", + help="""A set of extensions that + are interesting and should be + stored with the class. They + can be accessed through the + "extension_dict()" argument. + Multiple arguments can be + specified."""), + optparse.make_option("--use-extmethods", + dest="use_extmethods", + action="store_true", + help="""Allow a path-keyed dictionary + to be used to specify methods + related to a particular class"""), + optparse.make_option("--build-rpcs", + dest="build_rpcs", + action="store_true", + help="""Generate class bindings for + the input and output of RPCs + defined in each module. 
These + are placed at the root of + each module"""), + ] + g = optparser.add_option_group("pyangbind output specific options") + g.add_options(optlist) + + +# Core function to build the pyangbind output - starting with building the +# dependencies - and then working through the instantiated tree that pyang has +# already parsed. +def build_pybind(ctx, modules, fd): + # Restrict the output of the plugin to only the modules that are supplied + # to pyang. More modules are parsed by pyangbind to resolve typedefs and + # identities. + module_d = {} + for mod in modules: + module_d[mod.arg] = mod + pyang_called_modules = module_d.keys() + + # Bail if there are pyang errors, since this certainly means that the + # pyangbind output will fail - unless these are solely due to imports that + # we provided but then unused. + if len(ctx.errors): + for e in ctx.errors: + if not e[1] in ["UNUSED_IMPORT", "PATTERN_ERROR"]: + sys.stderr.write("FATAL: pyangbind cannot build module that pyang" + + " has found errors with.\n") + sys.exit(127) + + # Build the common set of imports that all pyangbind files needs + ctx.pybind_common_hdr = "" + ctx.pybind_common_hdr += "\n" + + ctx.pybind_common_hdr += "from operator import attrgetter\n" + if ctx.opts.use_xpathhelper: + ctx.pybind_common_hdr += "import pyangbind.lib.xpathhelper as xpathhelper\n" + ctx.pybind_common_hdr += """from pyangbind.lib.yangtypes import """ + ctx.pybind_common_hdr += """RestrictedPrecisionDecimalType, """ + ctx.pybind_common_hdr += """RestrictedClassType, TypedListType\n""" + ctx.pybind_common_hdr += """from pyangbind.lib.yangtypes import YANGBool, """ + ctx.pybind_common_hdr += """YANGListType, YANGDynClass, ReferenceType\n""" + ctx.pybind_common_hdr += """from pyangbind.lib.base import PybindBase\n""" + ctx.pybind_common_hdr += """from decimal import Decimal\n""" + ctx.pybind_common_hdr += """import numpy as np\n""" + ctx.pybind_common_hdr += """from bitarray import bitarray\n""" + + if not 
ctx.opts.split_class_dir: + fd.write(ctx.pybind_common_hdr) + else: + ctx.pybind_split_basepath = os.path.abspath(ctx.opts.split_class_dir) + if not os.path.exists(ctx.pybind_split_basepath): + os.makedirs(ctx.pybind_split_basepath) + + # Determine all modules, and submodules that are needed, along with the + # prefix that is used for it. We need to ensure that we understand all of the + # prefixes that might be used to reference an identity or a typedef. + all_mods = [] + for module in modules: + local_module_prefix = module.search_one('prefix') + if local_module_prefix is None: + local_module_prefix = \ + module.search_one('belongs-to').search_one('prefix') + if local_module_prefix is None: + raise AttributeError(("A module (%s) must have a prefix or parent " + + "module") % module.arg) + local_module_prefix = local_module_prefix.arg + else: + local_module_prefix = local_module_prefix.arg + mods = [(local_module_prefix, module)] + # 'include' statements specify the submodules of the existing module - + # that also need to be parsed. + for i in module.search('include'): + subm = ctx.get_module(i.arg) + if subm is not None: + mods.append((local_module_prefix, subm)) + # 'import' statements specify the other modules that this module will + # reference. + for j in module.search('import'): + mod = ctx.get_module(j.arg) + if mod is not None: + imported_module_prefix = j.search_one('prefix').arg + mods.append((imported_module_prefix, mod)) + modules.append(mod) + all_mods.extend(mods) + + # remove duplicates from the list (same module and prefix) + new_all_mods = [] + for mod in all_mods: + if mod not in new_all_mods: + new_all_mods.append(mod) + all_mods = new_all_mods + + # Build a list of the 'typedef' and 'identity' statements that are included + # in the modules supplied.
+ defn = {} + for defnt in ['typedef', 'identity']: + defn[defnt] = {} + for m in all_mods: + t = find_definitions(defnt, ctx, m[1], m[0]) + for k in t: + if k not in defn[defnt]: + defn[defnt][k] = t[k] + + # Build the identities and typedefs (these are added to the class_map which + # is globally referenced). + build_identities(ctx, defn['identity']) + build_typedefs(ctx, defn['typedef']) + + # Iterate through the tree which pyang has built, solely for the modules + # that pyang was asked to build. + for modname in pyang_called_modules: + module = module_d[modname] + mods = [module] + for i in module.search('include'): + subm = ctx.get_module(i.arg) + if subm is not None: + mods.append(subm) + for m in mods: + children = [ch for ch in module.i_children + if ch.keyword in statements.data_definition_keywords] + get_children(ctx, fd, children, m, m) + + if ctx.opts.build_rpcs: + rpcs = [ch for ch in module.i_children + if ch.keyword == 'rpc'] + # Build RPCs specifically under the module name, since this + # can be used as a proxy for the namespace. + if len(rpcs): + get_children(ctx, fd, rpcs, module, module, register_paths=False, + path="/%s_rpc" % (safe_name(module.arg))) + + +def build_identities(ctx, defnd): + # Build dictionaries which determine how identities work. Essentially, an + # identity is modelled such that it is a dictionary where the keys of that + # dictionary are the valid values for an identityref. + unresolved_idc = {} + for i in defnd: + unresolved_idc[i] = 0 + unresolved_ids = defnd.keys() + error_ids = [] + identity_d = {} + + # The order of an identity being built is important. Find those identities + # that either have no "base" statement, or have a known base statement, and + # queue these to be processed first.
+ while len(unresolved_ids): + ident = unresolved_ids.pop(0) + base = defnd[ident].search_one('base') + reprocess = False + if base is None and not unicode(ident) in identity_d: + identity_d[unicode(ident)] = {} + else: + # the identity has a base, so we need to check whether it + # exists already + if unicode(base.arg) in identity_d: + base_id = unicode(base.arg) + # if it did, then we can now define the value - we want to + # define it as both the resolved value (i.e., with the prefix) + # and the unresolved value. + if ":" in ident: + prefix, value = ident.split(":") + prefix, value = unicode(prefix), unicode(value) + if value not in identity_d[base_id]: + identity_d[base_id][value] = {} + if value not in identity_d: + identity_d[value] = {} + # check whether the base existed with the prefix that was + # used for this value too, as long as the base_id is not + # already resolved + if ":" not in base_id: + resolved_base = unicode("%s:%s" % (prefix, base_id)) + if resolved_base not in identity_d: + reprocess = True + else: + identity_d[resolved_base][ident] = {} + identity_d[resolved_base][value] = {} + if ident not in identity_d[base_id]: + identity_d[base_id][ident] = {} + if ident not in identity_d: + identity_d[ident] = {} + else: + reprocess = True + + if reprocess: + # Fall-out from the loop of resolving the identity. If we've looped + # around many times, we can't find a base for the identity, which means + # it is invalid. + if unresolved_idc[ident] > 1000: + sys.stderr.write("could not find a match for %s base: %s\n" % + (ident, base.arg)) + error_ids.append(ident) + else: + unresolved_ids.append(ident) + unresolved_idc[ident] += 1 + + # Remove those identities that do not have any members. This would remove + # identities that are solely bases, but have no other members. However, this + # is a problem if particular modules are compiled. 
+ # for potential_identity in identity_d.keys(): + # if len(identity_d[potential_identity]) == 0: + # del identity_d[potential_identity] + + if error_ids: + raise TypeError("could not resolve identities %s" % error_ids) + + # Add entries to the class_map such that this identity can be referenced by + # elements that use this identity ref. + for i in identity_d: + id_type = {"native_type": """RestrictedClassType(base_type=unicode, """ + + """restriction_type="dict_key", """ + + """restriction_arg=%s,)""" % identity_d[i], + "restriction_argument": identity_d[i], + "restriction_type": "dict_key", + "parent_type": "string", + "base_type": False} + class_map[i] = id_type + + +def build_typedefs(ctx, defnd): + # Build the type definitions that are specified within a model. Since + # typedefs are essentially derived from existing types, order of processing + # is important - we need to go through and build the types in order where + # they have a known 'type'. + unresolved_tc = {} + for i in defnd: + unresolved_tc[i] = 0 + unresolved_t = defnd.keys() + error_ids = [] + known_types = class_map.keys() + known_types.append('enumeration') + known_types.append('leafref') + process_typedefs_ordered = [] + while len(unresolved_t): + + t = unresolved_t.pop(0) + base_t = defnd[t].search_one('type') + if base_t.arg == "union": + subtypes = [i for i in base_t.search('type')] + elif base_t.arg == "identityref": + subtypes = [base_t.search_one('base')] + else: + subtypes = [base_t] + + any_unknown = False + for i in subtypes: + if i.arg not in known_types: + any_unknown = True + if not any_unknown: + process_typedefs_ordered.append((t, defnd[t])) + known_types.append(t) + else: + unresolved_tc[t] += 1 + if unresolved_tc[t] > 1000: + # Take a similar approach to the resolution of identities. If we have a + # typedef that has a type in it that is not found after many iterations + # then we should bail. 
+ error_ids.append(t) + sys.stderr.write("could not find a match for %s type -> %s\n" % + (t, [i.arg for i in subtypes])) + else: + unresolved_t.append(t) + + if error_ids: + raise TypeError("could not resolve typedefs %s" % error_ids) + + # Process the types that we built above. + for i_tuple in process_typedefs_ordered: + item = i_tuple[1] + type_name = i_tuple[0] + mapped_type = False + restricted_arg = False + # Copy the class_map entry - this is done so that we do not alter the + # existing instance in memory as we add to it. + cls, elemtype = copy.deepcopy(build_elemtype(ctx, item.search_one('type'))) + known_types = class_map.keys() + # Enumeration is a native type, but is not natively supported + # in the class_map, and hence we append it here. + known_types.append("enumeration") + known_types.append("leafref") + + # Don't allow duplicate definitions of types + if type_name in known_types: + raise TypeError("Duplicate definition of %s" % type_name) + default_stmt = item.search_one('default') + + # 'elemtype' is a list when the type includes a union, so we need to go + # through and build a type definition that supports multiple types. + if not isinstance(elemtype, list): + restricted = False + # Map the original type to the new type, parsing the additional arguments + # that may be specified, for example, a new default, a pattern that must + # be matched, or a length (stored in the restriction_argument, and + # restriction_type class_map variables). 
+ class_map[type_name] = {"base_type": False} + class_map[type_name]["native_type"] = elemtype["native_type"] + if "parent_type" in elemtype: + class_map[type_name]["parent_type"] = elemtype["parent_type"] + else: + yang_type = item.search_one('type').arg + if yang_type not in known_types: + raise TypeError("typedef specified a native type that was not " + + "supported") + class_map[type_name]["parent_type"] = yang_type + if default_stmt is not None: + class_map[type_name]["default"] = default_stmt.arg + if "referenced_path" in elemtype: + class_map[type_name]["referenced_path"] = elemtype["referenced_path"] + class_map[type_name]["class_override"] = "leafref" + if "require_instance" in elemtype: + class_map[type_name]["require_instance"] = elemtype["require_instance"] + if "restriction_type" in elemtype: + class_map[type_name]["restriction_type"] = \ + elemtype["restriction_type"] + class_map[type_name]["restriction_argument"] = \ + elemtype["restriction_argument"] + if "quote_arg" in elemtype: + class_map[type_name]["quote_arg"] = elemtype["quote_arg"] + else: + # Handle a typedef that is a union - extend the class_map arguments + # to be a list that is parsed by the relevant dynamic type generation + # function. + native_type = [] + parent_type = [] + default = False if default_stmt is None else default_stmt.arg + for i in elemtype: + if isinstance(i[1]["native_type"], list): + native_type.extend(i[1]["native_type"]) + else: + native_type.append(i[1]["native_type"]) + if i[1]["yang_type"] in known_types: + parent_type.append(i[1]["yang_type"]) + else: + msg = "typedef in a union specified a native type that was not" + msg += " supported (%s in %s)" % (i[1]["yang_type"], item.arg) + raise TypeError(msg) + if "default" in i[1] and not default: + # When multiple 'default' values are specified within a union that + # is within a typedef, then pyangbind will choose the first one.
+ q = True if "quote_arg" in i[1] else False + default = (i[1]["default"], q) + class_map[type_name] = {"native_type": native_type, "base_type": False, + "parent_type": parent_type} + if default: + class_map[type_name]["default"] = default[0] + class_map[type_name]["quote_default"] = default[1] + + +def find_child_definitions(obj, defn, prefix, definitions): + for i in obj.search(defn): + if i.arg in definitions: + sys.stderr.write("WARNING: duplicate definition of %s\n" % i.arg) + else: + definitions["%s:%s" % (prefix, i.arg)] = i + definitions[i.arg] = i + + for ch in obj.search('grouping'): + if ch.i_children: + find_child_definitions(ch, defn, prefix, definitions) + + return definitions + + +def find_definitions(defn, ctx, module, prefix): + # Find the statements within a module that map to a particular type of + # statement, for instance - find typedefs, or identities, and return them + # as a dictionary to the calling function. + mod = ctx.get_module(module.arg) + if mod is None: + raise AttributeError("expected to be able to find module %s, " % + (module.arg) + "but could not") + definitions = {} + defin = find_child_definitions(mod, defn, prefix, definitions) + return defin + + +def get_children(ctx, fd, i_children, module, parent, path=str(), + parent_cfg=True, choice=False, register_paths=True): + # Iterative function that is called for all elements that have children + # data nodes in the tree. This function resolves those nodes into the + # relevant leaf, or container/list configuration and outputs the python + # code that corresponds to it to the relevant file. parent_cfg is used to + # ensure that where a parent container was set to config false, this is + # inherited by all elements below it; and choice is used to store whether + # these leaves are within a choice or not.
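Both `build_identities` and `build_typedefs` above use the same worklist pattern: pop a definition, emit it if every type it depends on is already known, otherwise re-queue it, and bail out once the retry counter for an item exceeds a threshold. The sketch below is an illustrative standalone reimplementation of that pattern, not PyangBind's actual API; `order_by_dependency` and its simplified `{name: [dependencies]}` input are hypothetical.

```python
def order_by_dependency(defs, known=(), max_retries=1000):
    """Order definitions so each appears after all of its dependencies.

    defs: hypothetical {name: [dependency names]} mapping.
    known: names that are pre-resolved (e.g. builtin types).
    """
    known = set(known)
    pending = list(defs)
    retries = dict.fromkeys(defs, 0)
    ordered, errors = [], []
    while pending:
        name = pending.pop(0)
        if all(dep in known for dep in defs[name]):
            # every dependency resolved - this definition can be emitted
            ordered.append(name)
            known.add(name)
        else:
            # re-queue; give up once we have looped too many times, which
            # means some dependency can never be found (an invalid model)
            retries[name] += 1
            if retries[name] > max_retries:
                errors.append(name)
            else:
                pending.append(name)
    return ordered, errors
```

A worklist with a retry cap is a simple alternative to building an explicit dependency graph and topologically sorting it; the cap turns an otherwise infinite loop over unresolvable items into an error report, which is how the plugin surfaces invalid identities and typedefs.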
+ used_types, elements = [], [] + choices = False + + # When pyangbind was asked to split classes, then we need to create the + # relevant directories for the modules to be created into. In this case + # even though fd might be a valid file handle, we ignore it. + if ctx.opts.split_class_dir: + if path == "": + fpath = ctx.pybind_split_basepath + "/__init__.py" + else: + pparts = path.split("/") + npath = "/" + + # Check that we don't have the problem of containers that are nested + # with the same name + for i in range(1, len(pparts)): + if i > 0 and pparts[i] == pparts[i - 1]: + pname = safe_name(pparts[i]) + "_" + elif i == 1 and pparts[i] == module.arg: + pname = safe_name(pparts[i]) + "_" + else: + pname = safe_name(pparts[i]) + npath += pname + "/" + + bpath = ctx.pybind_split_basepath + npath + if not os.path.exists(bpath): + os.makedirs(bpath) + fpath = bpath + "/__init__.py" + if not os.path.exists(fpath): + try: + nfd = open(fpath, 'w') + except IOError, m: + raise IOError("could not open pyangbind output file (%s)" % m) + nfd.write(ctx.pybind_common_hdr) + else: + try: + nfd = open(fpath, 'a') + except IOError, m: + raise IOError("could not open pyangbind output file (%s)" % m) + else: + # If we weren't asked to split the files, then just use the file handle + # provided. + nfd = fd + + if parent_cfg: + # The first time we find a container that has config false set on it + # then we need to hand this down the tree - we don't need to look if + # parent_cfg has already been set to False as we need to inherit. + parent_config = parent.search_one('config') + if parent_config is not None: + parent_config = parent_config.arg + if parent_config.upper() == "FALSE": + # this container is config false + parent_cfg = False + + # When we are asked to split the classes into modules, then we need to find + # all elements that have their own class within this container, and make sure + # that they are imported.
Additionally, we need to find the elements that are + # within a case, and ensure that these are built with the corresponding + # choice specified. + if ctx.opts.split_class_dir: + import_req = [] + + for ch in i_children: + if ch.keyword == "choice": + for choice_ch in ch.i_children: + # these are case statements + for case_ch in choice_ch.i_children: + elements += get_element(ctx, fd, case_ch, module, parent, + path + "/" + ch.arg, parent_cfg=parent_cfg, + choice=(ch.arg, choice_ch.arg), register_paths=register_paths) + else: + elements += get_element(ctx, fd, ch, module, parent, path + "/" + ch.arg, + parent_cfg=parent_cfg, choice=choice, register_paths=register_paths) + + if ctx.opts.split_class_dir: + if hasattr(ch, "i_children") and len(ch.i_children): + import_req.append(ch.arg) + + # Write out the import statements if needed. + if ctx.opts.split_class_dir: + if len(import_req): + for im in import_req: + if im == parent.arg: + im += "_" + nfd.write("""import %s\n""" % safe_name(im)) + + # 'container', 'module', 'list' and 'submodule' all have their own classes + # generated. + if parent.keyword in ["container", "module", "list", "submodule", "input", + "output", "rpc"]: + if ctx.opts.split_class_dir: + nfd.write("class %s(PybindBase):\n" % safe_name(parent.arg)) + else: + if not path == "": + nfd.write("class yc_%s_%s_%s(PybindBase):\n" % (safe_name(parent.arg), + safe_name(module.arg), safe_name(path.replace("/", "_")))) + else: + nfd.write("class %s(PybindBase):\n" % safe_name(parent.arg)) + + # If the container is actually a list, then determine what the key value + # is and store this such that we can give a hint. + keyval = False + if parent.keyword == "list": + keyval = parent.search_one('key').arg if parent.search_one('key') \ + is not None else False + if keyval and " " in keyval: + keyval = keyval.split(" ") + else: + keyval = [keyval] + + # Auto-generate a docstring based on the description that is provided in + # the YANG module. 
This aims to provide readability to someone perusing the + # code that is generated. + parent_descr = parent.search_one('description') + if parent_descr is not None: + parent_descr = "\n\n YANG Description: %s" % \ + parent_descr.arg.decode('utf8').encode('ascii', 'ignore') + else: + parent_descr = "" + + # Add more helper text. + nfd.write(''' """ + This class was auto-generated by the PythonClass plugin for PYANG + from YANG module %s - based on the path %s. Each member element of + the container is represented as a class variable - with a specific + YANG type.%s + """\n''' % (module.arg, (path if not path == "" else "/%s" % parent.arg), + parent_descr)) + else: + raise TypeError("unhandled keyword with children %s" % parent.keyword) + + elements_str = "" + if len(elements) == 0: + nfd.write(" pass\n") + else: + # We want to prevent a user from creating new attributes on a class that + # are not allowed within the data model - this uses the __slots__ magic + # variable of the class to restrict anyone from adding to these classes. + # Doing so gives an AttributeError when a user tries to specify something + # that was not in the model. + elements_str = "_pyangbind_elements = {" + slots_str = " __slots__ = ('_pybind_generated_by', '_path_helper'," + slots_str += " '_yang_name', '_extmethods', " + for i in elements: + slots_str += "'__%s'," % i["name"] + elements_str += "'%s': %s, " % (i["name"], i["name"]) + slots_str += ")\n" + elements_str += "}\n" + nfd.write(slots_str + "\n") + # Store the real name of the element - since we often get values that are + # not allowed in python as identifiers, but we need the real-name when + # creating instance documents (e.g., peer-group is not valid due to '-'). + nfd.write(" _yang_name = '%s'\n" % (parent.arg)) + + choices = {} + choice_attrs = [] + classes = {} + for i in elements: + # Loop through the elements and build a string that corresponds to the + # class that is going to be created. 
In all cases (thus far) this uses + # the YANGDynClass helper function to generate a dynamic type. This + # can extend the base type that is provided, and does this to give us + # some attributes that base classes such as int(), or str() don't have - + # but YANG needs (such as a default value, the original YANG name, any + # extensions that were provided with the leaf, etc.). + class_str = {} + if "default" in i and not i["default"] is None: + default_arg = "\"%s\"" % (i["default"]) if i["quote_arg"] else "%s" \ + % i["default"] + + if i["class"] == "leaf-list": + # Map a leaf-list to the type specified in the class map. This is a + # TypedList (see lib.yangtypes) with a particular set of types allowed. + class_str["name"] = "__%s" % (i["name"]) + class_str["type"] = "YANGDynClass" + class_str["arg"] = "base=" + if isinstance(i["type"]["native_type"][1], list): + allowed_type = "[" + for subtype in i["type"]["native_type"][1]: + allowed_type += "%s," % subtype + allowed_type += "]" + else: + allowed_type = "%s" % (i["type"]["native_type"][1]) + class_str["arg"] += "%s(allowed_type=%s)" % \ + (i["type"]["native_type"][0], allowed_type) + if "default" in i and not i["default"] is None: + class_str["arg"] += ", default=%s(%s)" % (i["defaulttype"], + default_arg) + elif i["class"] == "list": + # Map a list to YANGList class - this is dynamically derived by the + # YANGListType function to have the relevant characteristics, such as + # whether it is ordered by the user.
+ class_str["name"] = "__%s" % (i["name"]) + class_str["type"] = "YANGDynClass" + class_str["arg"] = "base=YANGListType(" + class_str["arg"] += "%s,%s" % ("\"%s\"" % i["key"] if i["key"] + else False, i["type"]) + class_str["arg"] += ", yang_name=\"%s\", parent=self" \ + % (i["yang_name"]) + class_str["arg"] += ", is_container='list', user_ordered=%s" \ + % i["user_ordered"] + class_str["arg"] += ", path_helper=self._path_helper" + if i["choice"]: + class_str["arg"] += ", choice=%s" % repr(choice) + class_str["arg"] += ")" + elif i["class"] == "union" or i["class"] == "leaf-union": + # A special mapped type where there is a union that just includes + # leaves - this is mapped to a particular Union type, and the valid + # types within it. The dynamically generated class will determine + # whether the input can be mapped to the types included in the union. + class_str["name"] = "__%s" % (i["name"]) + class_str["type"] = "YANGDynClass" + class_str["arg"] = "base=[" + for u in i["type"][1]: + if isinstance(u[1]["native_type"], list): + for su_native_type in u[1]["native_type"]: + class_str["arg"] += "%s," % su_native_type + else: + class_str["arg"] += "%s," % u[1]["native_type"] + class_str["arg"] += "]" + if "default" in i and not i["default"] is None: + class_str["arg"] += ", default=%s(%s)" % (i["defaulttype"], + default_arg) + elif i["class"] == "leafref": + # For a leafref, pyangbind uses the special ReferenceType, which + # performs a lookup against the path_helper class provided.
+ class_str["name"] = "__%s" % (i["name"]) + class_str["type"] = "YANGDynClass" + class_str["arg"] = "base=%s" % i["type"] + class_str["arg"] += "(referenced_path='%s'" % i["referenced_path"] + class_str["arg"] += ", caller=self._path() + ['%s'], " \ + % (i["yang_name"]) + class_str["arg"] += "path_helper=self._path_helper, " + class_str["arg"] += "require_instance=%s)" % (i["require_instance"]) + elif i["class"] == "leafref-list": + # Deal with the special case of a list of leafrefs, since the + # ReferenceType has different arguments that need to be provided to the + # class to properly initialise. + class_str["name"] = "__%s" % (i["name"]) + class_str["type"] = "YANGDynClass" + class_str["arg"] = "base=%s" % i["type"]["native_type"][0] + class_str["arg"] += "(allowed_type=%s(referenced_path='%s'," \ + % (i["type"]["native_type"][1]["native_type"], + i["type"]["native_type"][1]["referenced_path"]) + class_str["arg"] += "caller=self._path() + ['%s'], " % i["yang_name"] + class_str["arg"] += "path_helper=self._path_helper, " + class_str["arg"] += "require_instance=%s))" % \ + (i["type"]["native_type"][1]["require_instance"]) + else: + # Generically handle all other classes with the 'standard' mappings. 
+ class_str["name"] = "__%s" % (i["name"]) + class_str["type"] = "YANGDynClass" + if isinstance(i["type"], list): + class_str["arg"] = "base=[" + for u in i["type"]: + class_str["arg"] += "%s," % u + class_str["arg"] += "]" + else: + class_str["arg"] = "base=%s" % i["type"] + if "default" in i and not i["default"] is None: + class_str["arg"] += ", default=%s(%s)" % (i["defaulttype"], + default_arg) + if i["class"] == "container": + class_str["arg"] += ", is_container='container'" + elif i["class"] == "list": + class_str["arg"] += ", is_container='list'" + elif i["class"] == "leaf-list": + class_str["arg"] += ", is_leaf=False" + else: + class_str["arg"] += ", is_leaf=True" + if class_str["arg"]: + class_str["arg"] += ", yang_name=\"%s\"" % i["yang_name"] + class_str["arg"] += ", parent=self" + if i["choice"]: + class_str["arg"] += ", choice=%s" % repr(i["choice"]) + choice_attrs.append(i["name"]) + if not i["choice"][0] in choices: + choices[i["choice"][0]] = {} + if not i["choice"][1] in choices[i["choice"][0]]: + choices[i["choice"][0]][i["choice"][1]] = [] + choices[i["choice"][0]][i["choice"][1]].append(i["name"]) + class_str["arg"] += ", path_helper=self._path_helper" + class_str["arg"] += ", extmethods=self._extmethods" + class_str["arg"] += ", register_paths=%s" % i["register_paths"] + if "extensions" in i: + class_str["arg"] += ", extensions=%s" % i["extensions"] + if keyval and i["yang_name"] in keyval: + class_str["arg"] += ", is_keyval=True" + classes[i["name"]] = class_str + + # TODO: get and set methods currently have errors that are reported that + # are a bit ugly. The intention here is to act like an immutable type - + # such that new class instances are created each time that the value is + # set. + + # Generic class __init__, set up the path_helper if asked to. 
+ nfd.write(""" + _pybind_generated_by = 'container' + + def __init__(self, *args, **kwargs):\n""") + if ctx.opts.use_xpathhelper: + nfd.write(""" + helper = kwargs.pop("path_helper", None) + if helper is False: + self._path_helper = False + elif helper is not None and isinstance(helper, xpathhelper.YANGPathHelper): + self._path_helper = helper + elif hasattr(self, "_parent"): + helper = getattr(self._parent, "_path_helper", False) + self._path_helper = helper + else: + self._path_helper = False\n""") + else: + nfd.write(""" + self._path_helper = False\n""") + + if ctx.opts.use_extmethods: + nfd.write(""" + extmethods = kwargs.pop("extmethods", None) + if extmethods is False: + self._extmethods = False + elif extmethods is not None and isinstance(extmethods, dict): + self._extmethods = extmethods + elif hasattr(self, "_parent"): + extmethods = getattr(self._parent, "_extmethods", None) + self._extmethods = extmethods + else: + self._extmethods = False\n""") + else: + nfd.write(""" + self._extmethods = False\n""") + + # Write out the classes that are stored locally as self.__foo where + # foo is the safe YANG name. + for c in classes: + nfd.write(" self.%s = %s(%s)\n" % (classes[c]["name"], + classes[c]["type"], classes[c]["arg"])) + # Don't accept arguments to a container/list/submodule class + nfd.write(""" + if args: + if len(args) > 1: + raise TypeError("cannot create a YANG container with >1 argument") + all_attr = True + for e in self._pyangbind_elements: + if not hasattr(args[0], e): + all_attr = False + break + if not all_attr: + raise ValueError("Supplied object did not have the correct attributes") + for e in self._pyangbind_elements: + setmethod = getattr(self, "_set_%s" % e) + setmethod(getattr(args[0], e))\n""") + + # A generic method to provide a path() method on each container, that gives + # a path in the form of a list that describes the nodes in the hierarchy. 
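The `path_helper` and `extmethods` plumbing written into each generated `__init__` above follows one pattern: pop the value from `kwargs` if it was supplied (with `False` explicitly disabling it), otherwise inherit it from the parent object, defaulting to `False`. A minimal standalone sketch of that pattern - the `Node` class and its attributes are illustrative, not the generated bindings themselves:

```python
class Node(object):
    def __init__(self, parent=None, **kwargs):
        self._parent = parent
        # an explicit argument wins; False disables the helper outright
        helper = kwargs.pop("path_helper", None)
        if helper is False:
            self._path_helper = False
        elif helper is not None:
            self._path_helper = helper
        elif parent is not None:
            # otherwise inherit whatever the parent resolved
            self._path_helper = getattr(parent, "_path_helper", False)
        else:
            self._path_helper = False
```

Only the root of a tree needs to be given the helper explicitly; every child constructed with a `parent` reference picks it up automatically, which is why the generated classes only look at `self._parent` when no keyword argument is present.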
+ nfd.write(""" + def _path(self): + if hasattr(self, "_parent"): + return self._parent._path()+[self._yang_name] + else: + return %s\n""" % path.split("/")[1:]) + node = {} + + # For each element, write out a getter and setter method - with the doc + # string of the element within the model. + for i in elements: + c_str = classes[i["name"]] + description_str = "" + if i["description"]: + description_str = "\n\n YANG Description: %s" \ + % i["description"].decode('utf-8').encode('ascii', 'ignore') + nfd.write(''' + def _get_%s(self): + """ + Getter method for %s, mapped from YANG variable %s (%s)%s + """ + return self.__%s + ''' % (i["name"], i["name"], i["path"], i["origtype"], + description_str, i["name"])) + + nfd.write(''' + def _set_%s(self,v): + """ + Setter method for %s, mapped from YANG variable %s (%s) + If this variable is read-only (config: false) in the + source YANG file, then _set_%s is considered as a private + method. Backends looking to populate this variable should + do so via calling thisObj._set_%s() directly.%s + """''' % (i["name"], i["name"], i["path"], + i["origtype"], i["name"], i["name"], + description_str)) + nfd.write(""" + try: + t = %s(v,%s)""" % (c_str["type"], c_str["arg"])) + nfd.write(""" + except (TypeError, ValueError): + raise ValueError(\"\"\"%s must be of a type compatible with %s\"\"\") + self.__%s = t\n""" % (i["name"], c_str["arg"], i["name"])) + nfd.write(" if hasattr(self, '_set'):\n") + nfd.write(" self._set()\n") + + # When we want to return a value to its default, the unset method can + # be used. Generally, this is done in a choice where one branch needs to + # be set to the default, but may be used wherever re-initialisation of + # the object is required. + nfd.write(""" + def _unset_%s(self): + self.__%s = %s(%s)\n\n""" % (i["name"], i["name"], + c_str["type"], c_str["arg"],)) + + # When an element is read-only, write out the _set and _get methods, but + # we don't actually make the property object accessible.
This ensures that + # where backends are populating the model, then they can do so via the + # _set_X method - but a 'normal' user can't just do container.X = 10. + for i in elements: + rw = True + if not i["config"]: + rw = False + elif not parent_cfg: + rw = False + elif keyval and i["yang_name"] in keyval: + rw = False + + if not rw: + nfd.write(""" %s = property(_get_%s)\n""" % (i["name"], i["name"])) + else: + nfd.write(""" %s = property(_get_%s, _set_%s)\n""" % + (i["name"], i["name"], i["name"])) + nfd.write("\n") + + # Store a list of the choices that are included within this module such that + # we can enforce each branch. + if choices: + nfd.write(" __choices__ = %s" % repr(choices)) + nfd.write("""\n %s\n""" % elements_str) + nfd.write("\n") + + if ctx.opts.split_class_dir: + nfd.close() + + return None + + +def build_elemtype(ctx, et, prefix=False): + # Build a dictionary which defines the type for the element. This is used + # both in the case that a typedef needs to be built, as well as on a + # per-leaf basis. + cls = None + pattern_stmt = et.search_one('pattern') if not et.search_one('pattern') \ + is None else False + range_stmt = et.search_one('range') if not et.search_one('range') \ + is None else False + length_stmt = et.search_one('length') if not et.search_one('length') \ + is None else False + + # Determine whether there are any restrictions that are placed on this leaf, + # and build a dictionary of the different restrictions to be placed on the + # type.
+ restrictions = {} + if pattern_stmt: + restrictions['pattern'] = pattern_stmt.arg + + if length_stmt: + if "|" in length_stmt.arg: + restrictions['length'] = [i.replace(' ', '') for i in + length_stmt.arg.split("|")] + else: + restrictions['length'] = [length_stmt.arg] + + if range_stmt: + # Complex ranges are separated by pipes + if "|" in range_stmt.arg: + restrictions['range'] = [i.replace(' ', '') for i in + range_stmt.arg.split("|")] + else: + restrictions['range'] = [range_stmt.arg] + + # Build RestrictedClassTypes based on the compiled dictionary and the + # underlying base type. + if len(restrictions): + if 'length' in restrictions or 'pattern' in restrictions: + cls = "restricted-%s" % (et.arg) + elemtype = { + "native_type": + """RestrictedClassType(base_type=%s, restriction_dict=%s)""" + % (class_map[et.arg]["native_type"], repr(restrictions)), + "restriction_dict": restrictions, + "parent_type": et.arg, + "base_type": False, + } + elif 'range' in restrictions: + cls = "restricted-%s" % et.arg + elemtype = { + "native_type": + """RestrictedClassType(base_type=%s, restriction_dict=%s)""" + % (class_map[et.arg]["native_type"], repr(restrictions)), + "restriction_dict": restrictions, + "parent_type": et.arg, + "base_type": False, + } + + # Handle all other types of leaves that are not restricted classes. + if cls is None: + cls = "leaf" + # Enumerations are built as RestrictedClasses where the value that is + # provided to the class is checked against the keys of a dictionary.
+ if et.arg == "enumeration": + enumeration_dict = {} + for enum in et.search('enum'): + enumeration_dict[unicode(enum.arg)] = {} + val = enum.search_one('value') + if val is not None: + enumeration_dict[unicode(enum.arg)]["value"] = int(val.arg) + elemtype = {"native_type": """RestrictedClassType(base_type=unicode, \ + restriction_type="dict_key", \ + restriction_arg=%s,)""" % + (enumeration_dict), + "restriction_argument": enumeration_dict, + "restriction_type": "dict_key", + "parent_type": "string", + "base_type": False} + # Map decimal64 to a RestrictedPrecisionDecimalType - this is there to + # ensure that the fraction-digits argument can be implemented. Note that + # fraction-digits is a mandatory argument. + elif et.arg == "decimal64": + fd_stmt = et.search_one('fraction-digits') + if fd_stmt is not None: + cls = "restricted-decimal64" + elemtype = {"native_type": + """RestrictedPrecisionDecimalType(precision=%s)""" % + fd_stmt.arg, "base_type": False, + "parent_type": "decimal64"} + else: + elemtype = class_map[et.arg] + # Handle unions - build a list of the supported types that are under the + # union. + elif et.arg == "union": + elemtype = [] + for uniontype in et.search('type'): + elemtype_s = copy.deepcopy(build_elemtype(ctx, uniontype)) + elemtype_s[1]["yang_type"] = uniontype.arg + elemtype.append(elemtype_s) + cls = "union" + # Map leafrefs to a ReferenceType, handling the referenced path, and + # whether require-instance is set. When xpathhelper is not specified, then + # no such mapping is done - at this point, we solely map to a string. 
+ elif et.arg == "leafref": + path_stmt = et.search_one('path') + if path_stmt is None: + raise ValueError("leafref specified with no path statement") + require_instance = \ + class_bool_map[et.search_one('require-instance').arg] \ + if et.search_one('require-instance') \ + is not None else False + if ctx.opts.use_xpathhelper: + elemtype = {"native_type": "ReferenceType", + "referenced_path": path_stmt.arg, + "parent_type": "string", + "base_type": False, + "require_instance": require_instance} + cls = "leafref" + else: + elemtype = { + "native_type": "unicode", + "parent_type": "string", + "base_type": False, + } + # Handle identityrefs, but check whether there is a valid base where this + # has been specified. + elif et.arg == "identityref": + base_stmt = et.search_one('base') + if base_stmt is None: + raise ValueError("identityref specified with no base statement") + try: + elemtype = class_map[base_stmt.arg] + except KeyError: + sys.stderr.write("FATAL: identityref with an unknown base\n") + if DEBUG: + pp.pprint(class_map.keys()) + pp.pprint(et.arg) + pp.pprint(base_stmt.arg) + sys.exit(127) + else: + # For all other cases, then we should be able to look up directly in the + # class_map for the defined type, since these are not 'derived' types + # at this point. In the case that we are referencing a type that is a + # typedef, then this has been added to the class_map. 
+ try: + elemtype = class_map[et.arg] + except KeyError: + passed = False + if prefix: + try: + tmp_name = "%s:%s" % (prefix, et.arg) + elemtype = class_map[tmp_name] + passed = True + except KeyError: + pass + if passed is False: + sys.stderr.write("FATAL: unmapped type (%s)\n" % (et.arg)) + if DEBUG: + pp.pprint(class_map.keys()) + pp.pprint(et.arg) + pp.pprint(prefix) + sys.exit(127) + if isinstance(elemtype, list): + cls = "leaf-union" + elif "class_override" in elemtype: + # this is used to propagate the fact that in some cases the + # native type needs to be dynamically built (e.g., leafref) + cls = elemtype["class_override"] + return (cls, elemtype) + + +def find_absolute_default_type(default_type, default_value, elemname): + if not isinstance(default_type, list): + return default_type + + for i in default_type: + if not i[1]["base_type"]: + test_type = class_map[i[1]["parent_type"]] + else: + test_type = i[1] + try: + tmp = test_type["pytype"](default_value) + default_type = test_type + break + except ValueError: + pass + return find_absolute_default_type(default_type, default_value, elemname) + + +def get_element(ctx, fd, element, module, parent, path, + parent_cfg=True, choice=False, register_paths=True): + # Handle mapping of an individual element within the model. This function + # produces a dictionary that can then be mapped into the relevant code that + # dynamically generates a class. + + this_object = [] + default = False + has_children = False + create_list = False + + elemdescr = element.search_one('description') + if elemdescr is None: + elemdescr = False + else: + elemdescr = elemdescr.arg + + # If the element has an i_children attribute then this is a container, list, + # leaf-list or choice.
Alternatively, it can be the 'input' or 'output' + # substmts of an RPC + if hasattr(element, 'i_children'): + if element.keyword in ["container", "list", "input", "output"]: + has_children = True + elif element.keyword in ["leaf-list"]: + create_list = True + + # Fixup the path when within a choice, because this iteration believes that + # we are under a new container, but this does not exist in the path. + if element.keyword in ["choice"]: + path_parts = path.split("/") + npath = "" + for i in range(0, len(path_parts) - 1): + npath += "%s/" % path_parts[i] + npath = npath.rstrip("/") + else: + npath = path + + # Create an element for a container. + if element.i_children: + chs = element.i_children + get_children(ctx, fd, chs, module, element, npath, parent_cfg=parent_cfg, + choice=choice, register_paths=register_paths) + + elemdict = { + "name": safe_name(element.arg), "origtype": element.keyword, + "class": element.keyword, + "path": safe_name(npath), "config": True, + "description": elemdescr, + "yang_name": element.arg, + "choice": choice, + "register_paths": register_paths, + } + # Handle the different cases of class name, this depends on whether we + # were asked to split the bindings into a directory structure or not. + if ctx.opts.split_class_dir: + # If we are dealing with split classes, then rather than naming the + # class based on a unique intra-file name, we must import + # the relative path to the module.class + if element.arg == parent.arg: + modname = safe_name(element.arg) + "_" + else: + modname = safe_name(element.arg) + elemdict["type"] = "%s.%s" % (modname, safe_name(element.arg)) + + else: + # Otherwise, give a unique name for the class within the dictionary. + elemdict["type"] = "yc_%s_%s_%s" % (safe_name(element.arg), + safe_name(module.arg), + safe_name(path.replace("/", "_"))) + + # Deal with specific cases for list - such as the key and how it is + # ordered.
+ if element.keyword == "list": + elemdict["key"] = safe_name(element.search_one("key").arg) \ + if element.search_one("key") is not None else False + user_ordered = element.search_one('ordered-by') + elemdict["user_ordered"] = True if user_ordered is not None \ + and user_ordered.arg.upper() == "USER" else False + this_object.append(elemdict) + has_children = True + + # Deal with the cases that the attribute does not have children. + if not has_children: + if element.keyword in ["leaf-list"]: + create_list = True + cls, elemtype = copy.deepcopy(build_elemtype(ctx, + element.search_one('type'))) + + # Determine what the default for the leaf should be where there are + # multiple available. + # Algorithm: + # - build a tree that is rooted on this class. + # - perform a breadth-first search - the first node found + # - that has the "default" leaf set, then we take this + # as the value for the default + + # then starting at the selected default node, traverse + # until we find a node that is declared to be a base_type + elemdefault = element.search_one('default') + default_type = False + quote_arg = False + if elemdefault is not None: + elemdefault = elemdefault.arg + default_type = elemtype + if isinstance(elemtype, list): + # this is a union, we should check whether any of the types + # immediately has a default + for i in elemtype: + if "default" in i[1]: + elemdefault = i[1]["default"] + default_type = i[1] + # default_type = i[1] + # mapped_elemtype = i[1] + elif "default" in elemtype: + # if the actual type defines the default, then we need to maintain + # this + elemdefault = elemtype["default"] + default_type = elemtype + + # we need to indicate that the default type for the class_map + # is str + tmp_class_map = copy.deepcopy(class_map) + tmp_class_map["enumeration"] = {"parent_type": "string"} + + if not default_type: + if isinstance(elemtype, list): + # this type has multiple parents + for i in elemtype: + if "parent_type" in i[1]: + if 
isinstance(i[1]["parent_type"], list): + to_visit = [j for j in i[1]["parent_type"]] + else: + to_visit = [i[1]["parent_type"]] + elif "parent_type" in elemtype: + if isinstance(elemtype["parent_type"], list): + to_visit = [i for i in elemtype["parent_type"]] + else: + to_visit = [elemtype["parent_type"]] + + checked = list() + while to_visit: + check = to_visit.pop(0) + if check not in checked: + checked.append(check) + if "parent_type" in tmp_class_map[check]: + if isinstance(tmp_class_map[check]["parent_type"], list): + to_visit.extend(tmp_class_map[check]["parent_type"]) + else: + to_visit.append(tmp_class_map[check]["parent_type"]) + + # checked now has the breadth-first search result + if elemdefault is None: + for option in checked: + if "default" in tmp_class_map[option]: + elemdefault = tmp_class_map[option]["default"] + default_type = tmp_class_map[option] + break + + if elemdefault is not None: + # we now need to check whether there's a need to + # find out what the base type is for this type + # we really expect a linear chain here. + + # if we have a tuple as the type here, this means that + # the default was set at a level where there was not + # a single option for the type. 
check the default + # against each option, to get a to a single default_type + if isinstance(default_type, list): + default_type = find_absolute_default_type(default_type, elemdefault, + element.arg) + + if not default_type["base_type"]: + if "parent_type" in default_type: + if isinstance(default_type["parent_type"], list): + to_visit = [i for i in default_type["parent_type"]] + else: + to_visit = [default_type["parent_type"]] + checked = list() + while to_visit: + check = to_visit.pop(0) # remove from the top of stack - depth first + if check not in checked: + checked.append(check) + if "parent_type" in tmp_class_map[check]: + if isinstance(tmp_class_map[check]["parent_type"], list): + to_visit.extend(tmp_class_map[check]["parent_type"]) + else: + to_visit.append(tmp_class_map[check]["parent_type"]) + default_type = tmp_class_map[checked.pop()] + if not default_type["base_type"]: + raise TypeError("default type was not a base type") + + # Set the default type based on what was determined above about the + # correct value to set. + if default_type: + quote_arg = default_type["quote_arg"] if "quote_arg" in \ + default_type else False + default_type = default_type["native_type"] + + elemconfig = class_bool_map[element.search_one('config').arg] if \ + element.search_one('config') else True + + elemname = safe_name(element.arg) + + # Deal with the cases that there is a requirement to create a list - these + # are leaf lists. There is some special handling for leaf-lists to ensure + # that the references are correctly created. 
+ if create_list: + if not cls == "leafref": + cls = "leaf-list" + + if isinstance(elemtype, list): + c = 0 + allowed_types = [] + for subtype in elemtype: + # nested union within a leaf-list type + if isinstance(subtype, tuple): + if subtype[0] == "leaf-union": + for subelemtype in subtype[1]["native_type"]: + allowed_types.append(subelemtype) + else: + if isinstance(subtype[1]["native_type"], list): + allowed_types.extend(subtype[1]["native_type"]) + else: + allowed_types.append(subtype[1]["native_type"]) + else: + allowed_types.append(subtype["native_type"]) + else: + allowed_types = elemtype["native_type"] + else: + cls = "leafref-list" + allowed_types = { + "native_type": elemtype["native_type"], + "referenced_path": elemtype["referenced_path"], + "require_instance": elemtype["require_instance"], + } + elemntype = {"class": cls, "native_type": ("TypedListType", + allowed_types)} + + else: + if cls == "union" or cls == "leaf-union": + elemtype = {"class": cls, "native_type": ("UnionType", elemtype)} + elemntype = elemtype["native_type"] + + # Build the dictionary for the element with the relevant meta-data + # specified within it. + elemdict = { + "name": elemname, "type": elemntype, + "origtype": element.search_one('type').arg, "path": + safe_name(path), + "class": cls, "default": elemdefault, + "config": elemconfig, "defaulttype": default_type, + "quote_arg": quote_arg, + "description": elemdescr, "yang_name": element.arg, + "choice": choice, + "register_paths": register_paths, + } + if cls == "leafref": + elemdict["referenced_path"] = elemtype["referenced_path"] + elemdict["require_instance"] = elemtype["require_instance"] + + # In cases where there is a set of interesting extensions specified, + # then build a dictionary of these extension values to provide with the + # specific leaf for this element.
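The leaf-list branch above flattens union members into a single list of permitted types. A simplified sketch, using the internal `(class, info-dict)` tuple convention seen in this patch (the concrete type names in the test data are placeholders):

```python
def flatten_allowed_types(elemtype):
    # Flatten the type information for a leaf-list into the list of types
    # its members may take, expanding nested unions ("leaf-union" tuples)
    # as the branch above does. Illustrative simplification only.
    if not isinstance(elemtype, list):
        # single type: TypedListType takes the native type directly
        return elemtype["native_type"]
    allowed = []
    for subtype in elemtype:
        cls, info = subtype
        native = info["native_type"]
        if cls == "leaf-union" or isinstance(native, list):
            allowed.extend(native)   # expand nested union members
        else:
            allowed.append(native)
    return allowed
```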
+ if element.substmts is not None and \ + ctx.opts.pybind_interested_exts is not None: + extensions = {} + for ext in element.substmts: + if ext.keyword[0] in ctx.opts.pybind_interested_exts: + if not ext.keyword[0] in extensions: + extensions[ext.keyword[0]] = {} + extensions[ext.keyword[0]][ext.keyword[1]] = ext.arg + if len(extensions): + elemdict["extensions"] = extensions + + this_object.append(elemdict) + return this_object diff --git a/pybind.py b/pybind.py deleted file mode 100644 index de1d3af1..00000000 --- a/pybind.py +++ /dev/null @@ -1,1537 +0,0 @@ -""" -Copyright 2015, Rob Shakir (rjs@jive.com, rjs@rob.sh) - -This project has been supported by: - * Jive Communications, Inc. - * BT plc. - -Licensed under the Apache License, Version 2.0 (the "License"); -you may not use this file except in compliance with the License. -You may obtain a copy of the License at - - http://www.apache.org/licenses/LICENSE-2.0 - -Unless required by applicable law or agreed to in writing, software -distributed under the License is distributed on an "AS IS" BASIS, -WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -See the License for the specific language governing permissions and -limitations under the License. -""" - -import optparse -import sys -import re -import string -import numpy as np -import decimal -import copy -import os -from bitarray import bitarray -from lib.yangtypes import safe_name, YANGBool - -from pyang import plugin -from pyang import statements - -DEBUG = True -if DEBUG: - import pprint - pp = pprint.PrettyPrinter(indent=2) - -# YANG is quite flexible in terms of what it allows as input to a boolean -# value, this map is used to provide a mapping of these values to the python -# True and False boolean instances. 
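The extension-gathering loop above keys on the `(module, keyword)` tuple that pyang uses for extension statements. A standalone sketch, with a toy `Stmt` stand-in for pyang's statement objects and a hypothetical extension name; unlike the original loop, this version explicitly skips plain (non-tuple keyword) statements:

```python
def gather_extensions(substmts, interested_exts):
    # Build {extension-module: {keyword: argument}} for the modules named
    # in interested_exts, mirroring the loop above.
    extensions = {}
    for ext in substmts:
        if not isinstance(ext.keyword, tuple):
            continue  # a plain YANG statement, not an extension
        modname, extname = ext.keyword
        if modname in interested_exts:
            extensions.setdefault(modname, {})[extname] = ext.arg
    return extensions

class Stmt(object):
    # Minimal stand-in for a pyang statement (illustrative only).
    def __init__(self, keyword, arg):
        self.keyword, self.arg = keyword, arg
```

With `--interesting-extension=oc-ext`, a leaf carrying a hypothetical `oc-ext:telemetry-on-change "true";` substatement would yield `{"oc-ext": {"telemetry-on-change": "true"}}` in the generated class's extension dictionary.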
-class_bool_map = { - 'false': False, - 'False': False, - 'true': True, - 'True': True, -} - -class_map = { - # this map is dynamically built upon but defines how we take - # a YANG type and translate it into a native Python class - # along with other attributes that are required for this mapping. - # - # key: the name of the YANG type - # native_type: the Python class that is used to support this - # YANG type natively. - # map (optional): a map to take input values and translate them - # into valid values of the type. - # base_type: whether the class can be used as - # class(*args, **kwargs) in Python, or whether it is a - # derived class (such as is created based on a typedef, - # or for types that cannot be supported natively, such - # as enumeration, or a string - # with a restriction placed on it) - # quote_arg (opt): whether the argument to this class' __init__ needs to - # be quoted (e.g., str("hello")) in the code that is - # output. - # pytype (opt): A reference to the actual type that is used, this is - # used where we infer types, such as for an input - # value to a union since we need to actually compare - # the value against the __init__ method and see whether - # it works. - # parent_type (opt): for "derived" types, then we store what the enclosed - # type is such that we can create instances where - # required e.g., a restricted string will have a - # parent_type of a string. this can be a list if the - # type is a union. - # restriction ...: where the type is a restricted type, then the - # (optional) class_map dict entry can store more information about - # the type of restriction. this is generally used when - # we need to re-initialise an instance of the class, - # such as in the setter methods of containers. - # Other types may add their own types to this dictionary that have - # meaning only for themselves. For example, a ReferenceType can add the - # path it references, and whether the require-instance keyword was set - # or not. 
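The boolean map above can be exercised directly; `yang_bool` is a hypothetical convenience wrapper (not part of pyangbind) showing how an absent statement falls back to a default, as the `elemconfig` lookup later in this file does:

```python
class_bool_map = {
    'false': False,
    'False': False,
    'true': True,
    'True': True,
}

def yang_bool(arg, default=True):
    # Map a YANG boolean argument onto a Python bool; arg is None when the
    # statement is absent, in which case the given default applies.
    if arg is None:
        return default
    return class_bool_map[arg]
```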
- # - 'boolean': { - "native_type": "YANGBool", - "map": class_bool_map, - "base_type": True, - "quote_arg": True, - "pytype": YANGBool - }, - 'binary': { - "native_type": "bitarray", - "base_type": True, - "quote_arg": True, - "pytype": bitarray - }, - 'uint8': { - "native_type": "np.uint8", - "base_type": True, - "pytype": np.uint8 - }, - 'uint16': { - "native_type": "np.uint16", - "base_type": True, - "pytype": np.uint16 - }, - 'uint32': { - "native_type": "np.uint32", - "base_type": True, - "pytype": np.uint32 - }, - 'uint64': { - "native_type": "np.uint64", - "base_type": True, - "pytype": np.uint64}, - 'string': { - "native_type": "unicode", - "base_type": True, - "quote_arg": True, - "pytype": unicode - }, - 'decimal64': { - "native_type": "Decimal", - "base_type": True, - "pytype": decimal.Decimal - }, - 'empty': { - "native_type": "YANGBool", - "map": class_bool_map, - "base_type": True, - "quote_arg": True, - "pytype": YANGBool - }, - 'int8': { - "native_type": "np.int8", - "base_type": True, - "pytype": np.int8 - }, - 'int16': { - "native_type": "np.int16", - "base_type": True, - "pytype": np.int16 - }, - 'int32': { - "native_type": "np.int32", - "base_type": True, - "pytype": np.int32 - }, - 'int64': { - "native_type": "np.int64", - "base_type": True, - "pytype": np.int64 - }, -} - -# We have a set of types which support "range" statements in RFC6020. This -# list determins types that should be allowed to have a "range" argument. -INT_RANGE_TYPES = ["uint8", "uint16", "uint32", "uint64", - "int8", "int16", "int32", "int64"] - - -# Base machinery to support operation as a plugin to pyang. -def pyang_plugin_init(): - plugin.register_plugin(PyangBindClass()) - - -class PyangBindClass(plugin.PyangPlugin): - def add_output_format(self, fmts): - # Add the 'pybind' output format to pyang. - self.multiple_modules = True - fmts['pybind'] = self - - def emit(self, ctx, modules, fd): - # When called, call the build_pyangbind function. 
- build_pybind(ctx, modules, fd) - - def add_opts(self, optparser): - # Add pyangbind specific operations to pyang. These are documented in the - # options, but are essentially divided into three sets. - # * xpathhelper - How pyangbind should deal with xpath expressions. - # This module is documented in lib/xpathhelper and describes how - # to support registration, updates, and retrieval of xpaths. - # * class output - whether a single file should be created, or whether - # a hierarchy of python modules should be created. The latter is - # preferable when one has large trees being compiled. - # * extensions - support for YANG extensions that pyangbind should look - # for, and add as a dictionary with each element. - optlist = [ - optparse.make_option("--use-xpathhelper", - dest="use_xpathhelper", - action="store_true", - help="""Use the xpathhelper module to - resolve leafrefs"""), - optparse.make_option("--split-class-dir", - metavar="DIR", - dest="split_class_dir", - help="""Split the code output into - multiple directories"""), - optparse.make_option("--pybind-class-dir", - metavar="DIR", - dest="pybind_class_dir", - help="""Path in which the pyangbind - 'lib' directionary can be found - - assumed to be the local - directory if this option - is not specified"""), - optparse.make_option("--interesting-extension", - metavar="EXTENSION-MODULE", - default=[], - action="append", - type=str, - dest="pybind_interested_exts", - help="""A set of extensions that - are interesting and should be - stored with the class. They - can be accessed through the - "extension_dict()" argument. 
- Multiple arguments can be - specified."""), - optparse.make_option("--use-extmethods", - dest="use_extmethods", - action="store_true", - help="""Allow a path-keyed dictionary - to be used to specify methods - related to a particular class"""), - optparse.make_option("--build-rpcs", - dest="build_rpcs", - action="store_true", - help="""Generate class bindings for - the input and output of RPCs - defined in each module. These - are placed at the root of - each module"""), - ] - g = optparser.add_option_group("pyangbind output specific options") - g.add_options(optlist) - - -# Core function to build the pyangbind output - starting with building the -# dependencies - and then working through the instantiated tree that pyang has -# already parsed. -def build_pybind(ctx, modules, fd): - # Restrict the output of the plugin to only the modules that are supplied - # to pyang. More modules are parsed by pyangbind to resolve typedefs and - # identities. - module_d = {} - for mod in modules: - module_d[mod.arg] = mod - pyang_called_modules = module_d.keys() - - # Bail if there are pyang errors, since this certainly means that the - # pyangbind output will fail - unless these are solely due to imports that - # we provided but then unused. - if len(ctx.errors): - for e in ctx.errors: - if not e[1] in ["UNUSED_IMPORT", "PATTERN_ERROR"]: - sys.stderr.write("FATAL: pyangbind cannot build module that pyang" + - " has found errors with.\n") - sys.exit(127) - - # Build the common set of imports that all pyangbind files needs - ctx.pybind_common_hdr = "" - if ctx.opts.pybind_class_dir: - # If we were asked to include a different directory for the library, then - # the header needs to extend the python system path to be able to include - # files from this directory. 
- libdir = os.path.abspath(ctx.opts.pybind_class_dir) - ctx.pybind_common_hdr += """import sys\n""" - ctx.pybind_common_hdr += """sys.path.append("%s")\n""" % libdir - ctx.pybind_common_hdr += "\n" - - ctx.pybind_common_hdr += "from operator import attrgetter\n" - if ctx.opts.use_xpathhelper: - ctx.pybind_common_hdr += "import lib.xpathhelper as xpathhelper\n" - ctx.pybind_common_hdr += """from lib.yangtypes import """ - ctx.pybind_common_hdr += """RestrictedPrecisionDecimalType, """ - ctx.pybind_common_hdr += """RestrictedClassType, TypedListType\n""" - ctx.pybind_common_hdr += """from lib.yangtypes import YANGBool, """ - ctx.pybind_common_hdr += """YANGListType, YANGDynClass, ReferenceType\n""" - ctx.pybind_common_hdr += """from lib.base import PybindBase\n""" - ctx.pybind_common_hdr += """from decimal import Decimal\n""" - ctx.pybind_common_hdr += """import numpy as np\n""" - ctx.pybind_common_hdr += """from bitarray import bitarray\n""" - - if not ctx.opts.split_class_dir: - fd.write(ctx.pybind_common_hdr) - else: - ctx.pybind_split_basepath = os.path.abspath(ctx.opts.split_class_dir) - if not os.path.exists(ctx.pybind_split_basepath): - os.makedirs(ctx.pybind_split_basepath) - - # Determine all modules, and submodules that are needed, along with the - # prefix that is used for it. We need to ensure that we understand all of the - # prefixes that might be used to reference an identity or a typedef. 
- all_mods = [] - for module in modules: - local_module_prefix = module.search_one('prefix') - if local_module_prefix is None: - local_module_prefix = \ - module.search_one('belongs-to').search_one('prefix') - if local_module_prefix is None: - raise AttributeError("A module (%s) must have a prefix or parent " + - "module") - local_module_prefix = local_module_prefix.arg - else: - local_module_prefix = local_module_prefix.arg - mods = [(local_module_prefix, module)] - # 'include' statements specify the submodules of the existing module - - # that also need to be parsed. - for i in module.search('include'): - subm = ctx.get_module(i.arg) - if subm is not None: - mods.append((local_module_prefix, subm)) - # 'import' statements specify the other modules that this module will - # reference. - for j in module.search('import'): - mod = ctx.get_module(j.arg) - if mod is not None: - imported_module_prefix = j.search_one('prefix').arg - mods.append((imported_module_prefix, mod)) - modules.append(mod) - all_mods.extend(mods) - - # remove duplicates from the list (same module and prefix) - new_all_mods = [] - for mod in all_mods: - if mod not in new_all_mods: - new_all_mods.append(mod) - all_mods = new_all_mods - - # Build a list of the 'typedef' and 'identity' statements that are included - # in the modules supplied. - defn = {} - for defnt in ['typedef', 'identity']: - defn[defnt] = {} - for m in all_mods: - t = find_definitions(defnt, ctx, m[1], m[0]) - for k in t: - if k not in defn[defnt]: - defn[defnt][k] = t[k] - - # Build the identities and typedefs (these are added to the class_map which - # is globally referenced). 
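Two small invariants in the code above are worth isolating: duplicates are removed while preserving first-seen order (processing order matters for definition resolution), and when typedef/identity dictionaries are merged, the first definition of a name wins. A minimal sketch of both:

```python
def dedup_preserve_order(mods):
    # Remove duplicate (prefix, module) pairs, keeping first-seen order,
    # as the new_all_mods loop above does.
    seen = []
    for mod in mods:
        if mod not in seen:
            seen.append(mod)
    return seen

def merge_definitions(per_module_defs):
    # Merge per-module typedef/identity dicts; the first definition of a
    # name wins, mirroring the "if k not in defn[defnt]" check above.
    merged = {}
    for d in per_module_defs:
        for k, v in d.items():
            if k not in merged:
                merged[k] = v
    return merged
```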
- build_identities(ctx, defn['identity']) - build_typedefs(ctx, defn['typedef']) - - # Iterate through the tree which pyang has built, solely for the modules - # that pyang was asked to build - for modname in pyang_called_modules: - module = module_d[modname] - mods = [module] - for i in module.search('include'): - subm = ctx.get_module(i.arg) - if subm is not None: - mods.append(subm) - for m in mods: - children = [ch for ch in module.i_children - if ch.keyword in statements.data_definition_keywords] - get_children(ctx, fd, children, m, m) - - if ctx.opts.build_rpcs: - rpcs = [ch for ch in module.i_children - if ch.keyword == 'rpc'] - # Build RPCs specifically under the module name, since this - # can be used as a proxy for the namespace. - if len(rpcs): - get_children(ctx, fd, rpcs, module, module, register_paths=False, - path="/%s_rpc" % (safe_name(module.arg))) - - -def build_identities(ctx, defnd): - # Build dicionaries which determine how identities work. Essentially, an - # identity is modelled such that it is a dictionary where the keys of that - # dictionary are the valid values for an identityref. - unresolved_idc = {} - for i in defnd: - unresolved_idc[i] = 0 - unresolved_ids = defnd.keys() - error_ids = [] - identity_d = {} - - # The order of an identity being built is important. Find those identities - # that either have no "base" statement, or have a known base statement, and - # queue these to be processed first. - while len(unresolved_ids): - ident = unresolved_ids.pop(0) - base = defnd[ident].search_one('base') - reprocess = False - if base is None and not unicode(ident) in identity_d: - identity_d[unicode(ident)] = {} - else: - # the identity has a base, so we need to check whether it - # exists already - if unicode(base.arg) in identity_d: - base_id = unicode(base.arg) - # if it did, then we can now define the value - we want to - # define it as both the resolved value (i.e., with the prefix) - # and the unresolved value. 
- if ":" in ident: - prefix, value = ident.split(":") - prefix, value = unicode(prefix), unicode(value) - if value not in identity_d[base_id]: - identity_d[base_id][value] = {} - if value not in identity_d: - identity_d[value] = {} - # check whether the base existed with the prefix that was - # used for this value too, as long as the base_id is not - # already resolved - if ":" not in base_id: - resolved_base = unicode("%s:%s" % (prefix, base_id)) - if resolved_base not in identity_d: - reprocess = True - else: - identity_d[resolved_base][ident] = {} - identity_d[resolved_base][value] = {} - if ident not in identity_d[base_id]: - identity_d[base_id][ident] = {} - if ident not in identity_d: - identity_d[ident] = {} - else: - reprocess = True - - if reprocess: - # Fall-out from the loop of resolving the identity. If we've looped - # around many times, we can't find a base for the identity, which means - # it is invalid. - if unresolved_idc[ident] > 1000: - sys.stderr.write("could not find a match for %s base: %s\n" % - (ident, base.arg)) - error_ids.append(ident) - else: - unresolved_ids.append(ident) - unresolved_idc[ident] += 1 - - # Remove those identities that do not have any members. This would remove - # identities that are solely bases, but have no other members. However, this - # is a problem if particular modules are compiled. - # for potential_identity in identity_d.keys(): - # if len(identity_d[potential_identity]) == 0: - # del identity_d[potential_identity] - - if error_ids: - raise TypeError("could not resolve identities %s" % error_ids) - - # Add entries to the class_map such that this identity can be referenced by - # elements that use this identity ref. 
- for i in identity_d: - id_type = {"native_type": """RestrictedClassType(base_type=unicode, """ + - """restriction_type="dict_key", """ + - """restriction_arg=%s,)""" % identity_d[i], - "restriction_argument": identity_d[i], - "restriction_type": "dict_key", - "parent_type": "string", - "base_type": False} - class_map[i] = id_type - - -def build_typedefs(ctx, defnd): - # Build the type definitions that are specified within a model. Since - # typedefs are essentially derived from existing types, order of processing - # is important - we need to go through and build the types in order where - # they have a known 'type'. - unresolved_tc = {} - for i in defnd: - unresolved_tc[i] = 0 - unresolved_t = defnd.keys() - error_ids = [] - known_types = class_map.keys() - known_types.append('enumeration') - known_types.append('leafref') - process_typedefs_ordered = [] - while len(unresolved_t): - - t = unresolved_t.pop(0) - base_t = defnd[t].search_one('type') - if base_t.arg == "union": - subtypes = [i for i in base_t.search('type')] - elif base_t.arg == "identityref": - subtypes = [base_t.search_one('base')] - else: - subtypes = [base_t] - - any_unknown = False - for i in subtypes: - if i.arg not in known_types: - any_unknown = True - if not any_unknown: - process_typedefs_ordered.append((t, defnd[t])) - known_types.append(t) - else: - unresolved_tc[t] += 1 - if unresolved_tc[t] > 1000: - # Take a similar approach to the resolution of identities. If we have a - # typedef that has a type in it that is not found after many iterations - # then we should bail. - error_ids.append(t) - sys.stderr.write("could not find a match for %s type -> %s\n" % - (t, [i.arg for i in subtypes])) - else: - unresolved_t.append(t) - - if error_ids: - raise TypeError("could not resolve typedefs %s" % error_ids) - - # Process the types that we built above. 
- for i_tuple in process_typedefs_ordered: - item = i_tuple[1] - type_name = i_tuple[0] - mapped_type = False - restricted_arg = False - # Copy the class_map entry - this is done so that we do not alter the - # existing instance in memory as we add to it. - cls, elemtype = copy.deepcopy(build_elemtype(ctx, item.search_one('type'))) - known_types = class_map.keys() - # Enumeration is a native type, but is not natively supported - # in the class_map, and hence we append it here. - known_types.append("enumeration") - known_types.append("leafref") - - # Don't allow duplicate definitions of types - if type_name in known_types: - raise TypeError("Duplicate definition of %s" % type_name) - default_stmt = item.search_one('default') - - # 'elemtype' is a list when the type includes a union, so we need to go - # through and build a type definition that supports multiple types. - if not isinstance(elemtype, list): - restricted = False - # Map the original type to the new type, parsing the additional arguments - # that may be specified, for example, a new default, a pattern that must - # be matched, or a length (stored in the restriction_argument, and - # restriction_type class_map variables). 
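The typedef ordering above is the same deferred-resolution pattern as identities: a typedef is only processed once every type it is built from is known. A compact sketch with hypothetical typedef names:

```python
def order_typedefs(typedefs, known_types):
    # typedefs: {name: [names of the types it is built from]}. Defer a
    # typedef until all of its member types are known, as the loop above
    # does; the result is a safe processing order.
    ordered, queue = [], list(typedefs)
    attempts = {t: 0 for t in typedefs}
    known = list(known_types)
    while queue:
        t = queue.pop(0)
        if all(dep in known for dep in typedefs[t]):
            ordered.append(t)
            known.append(t)   # later typedefs may now depend on t
        else:
            attempts[t] += 1
            if attempts[t] > 1000:
                raise TypeError("could not resolve typedef %r" % t)
            queue.append(t)
    return ordered
```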
- class_map[type_name] = {"base_type": False} - class_map[type_name]["native_type"] = elemtype["native_type"] - if "parent_type" in elemtype: - class_map[type_name]["parent_type"] = elemtype["parent_type"] - else: - yang_type = item.search_one('type').arg - if yang_type not in known_types: - raise TypeError("typedef specified a native type that was not " + - "supported") - class_map[type_name]["parent_type"] = yang_type - if default_stmt is not None: - class_map[type_name]["default"] = default_stmt.arg - if "referenced_path" in elemtype: - class_map[type_name]["referenced_path"] = elemtype["referenced_path"] - class_map[type_name]["class_override"] = "leafref" - if "require_instance" in elemtype: - class_map[type_name]["require_instance"] = elemtype["require_instance"] - if "restriction_type" in elemtype: - class_map[type_name]["restriction_type"] = \ - elemtype["restriction_type"] - class_map[type_name]["restriction_argument"] = \ - elemtype["restriction_argument"] - if "quote_arg" in elemtype: - class_map[type_name]["quote_arg"] = elemtype["quote_arg"] - else: - # Handle a typedef that is a union - extended the class_map arguments - # to be a list that is parsed by the relevant dynamic type generation - # function. - native_type = [] - parent_type = [] - default = False if default_stmt is None else default_stmt.arg - for i in elemtype: - if isinstance(i[1]["native_type"], list): - native_type.extend(i[1]["native_type"]) - else: - native_type.append(i[1]["native_type"]) - if i[1]["yang_type"] in known_types: - parent_type.append(i[1]["yang_type"]) - else: - msg = "typedef in a union specified a native type that was not" - msg += "supported (%s in %s)" % (i[1]["yang_type"], item.arg) - raise TypeError(msg) - if "default" in i[1] and not default: - # When multiple 'default' values are specified within a union that - # is within a typedef, then pyangbind will choose the first one. 
- q = True if "quote_arg" in i[1] else False - default = (i[1]["default"], q) - class_map[type_name] = {"native_type": native_type, "base_type": False, - "parent_type": parent_type} - if default: - class_map[type_name]["default"] = default[0] - class_map[type_name]["quote_default"] = default[1] - - -def find_child_definitions(obj, defn, prefix, definitions): - for i in obj.search(defn): - if i.arg in definitions: - sys.stderr.write("WARNING: duplicate definition of %s" % i.arg) - else: - definitions["%s:%s" % (prefix, i.arg)] = i - definitions[i.arg] = i - - for ch in obj.search('grouping'): - if ch.i_children: - find_child_definitions(ch, defn, prefix, definitions) - - return definitions - - -def find_definitions(defn, ctx, module, prefix): - # Find the statements within a module that map to a particular type of - # statement, for instance - find typedefs, or identities, and reutrn them - # as a dictionary to the calling function. - mod = ctx.get_module(module.arg) - if mod is None: - raise AttributeError("expected to be able to find module %s, " % - (module.arg) + "but could not") - definitions = {} - defin = find_child_definitions(mod, defn, prefix, definitions) - return defin - - -def get_children(ctx, fd, i_children, module, parent, path=str(), - parent_cfg=True, choice=False, register_paths=True): - # Iterative function that is called for all elements that have childen - # data nodes in the tree. This function resolves those nodes into the - # relevant leaf, or container/list configuration and outputs the python - # code that corresponds to it to the relevant file. parent_cfg is used to - # ensure that where a parent container was set to config false, this is - # inherited by all elements below it; and choice is used to store whether - # these leaves are within a choice or not. 
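`find_child_definitions` above records each definition under both its prefixed and bare name and recurses into groupings. A runnable sketch with a toy `ToyStmt` stand-in for pyang statements (the module and typedef names in the test are placeholders):

```python
class ToyStmt(object):
    # Minimal stand-in for a pyang statement (illustrative only).
    def __init__(self, keyword, arg, children=None):
        self.keyword, self.arg = keyword, arg
        self.i_children = children or []

    def search(self, kw):
        return [c for c in self.i_children if c.keyword == kw]

def find_child_definitions(obj, defn, prefix, definitions):
    # Collect 'typedef'/'identity' statements under both their prefixed
    # ("oc:name") and bare ("name") keys, recursing into groupings, as
    # the function above does.
    for i in obj.search(defn):
        if i.arg not in definitions:
            definitions["%s:%s" % (prefix, i.arg)] = i
            definitions[i.arg] = i
    for ch in obj.search('grouping'):
        if ch.i_children:
            find_child_definitions(ch, defn, prefix, definitions)
    return definitions
```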
- used_types, elements = [], [] - choices = False - - # When pyangbind was asked to split classes, then we need to create the - # relevant directories for the modules to be created into. In this case - # even though fd might be a valid file handle, we ignore it. - if ctx.opts.split_class_dir: - if path == "": - fpath = ctx.pybind_split_basepath + "/__init__.py" - else: - pparts = path.split("/") - npath = "/" - - # Check that we don't have the problem of containers that are nested - # with the same name - for i in range(1, len(pparts)): - if i > 0 and pparts[i] == pparts[i - 1]: - pname = safe_name(pparts[i]) + "_" - elif i == 1 and pparts[i] == module.arg: - pname = safe_name(pparts[i]) + "_" - else: - pname = safe_name(pparts[i]) - npath += pname + "/" - - bpath = ctx.pybind_split_basepath + npath - if not os.path.exists(bpath): - os.makedirs(bpath) - fpath = bpath + "/__init__.py" - if not os.path.exists(fpath): - try: - nfd = open(fpath, 'w') - except IOError, m: - raise IOError("could not open pyangbind output file (%s)" % m) - nfd.write(ctx.pybind_common_hdr) - else: - try: - nfd = open(fpath, 'a') - except IOError, w: - raise IOError("could not open pyangbind output file (%s)" % m) - else: - # If we weren't asked to split the files, then just use the file handle - # provided. - nfd = fd - - if parent_cfg: - # The first time we find a container that has config false set on it - # then we need to hand this down the tree - we don't need to look if - # parent_cfg has already been set to False as we need to inherit. - parent_config = parent.search_one('config') - if parent_config is not None: - parent_config = parent_config.arg - if parent_config.upper() == "FALSE": - # this container is config false - parent_cfg = False - - # When we are asked to split the classes into modules, then we need to find - # all elements that have their own class within this container, and make sure - # that they are imported. 
Additionally, we need to find the elements that are - # within a case, and ensure that these are built with the corresponding - # choice specified. - if ctx.opts.split_class_dir: - import_req = [] - - for ch in i_children: - if ch.keyword == "choice": - for choice_ch in ch.i_children: - # these are case statements - for case_ch in choice_ch.i_children: - elements += get_element(ctx, fd, case_ch, module, parent, - path + "/" + ch.arg, parent_cfg=parent_cfg, - choice=(ch.arg, choice_ch.arg), register_paths=register_paths) - else: - elements += get_element(ctx, fd, ch, module, parent, path + "/" + ch.arg, - parent_cfg=parent_cfg, choice=choice, register_paths=register_paths) - - if ctx.opts.split_class_dir: - if hasattr(ch, "i_children") and len(ch.i_children): - import_req.append(ch.arg) - - # Write out the import statements if needed. - if ctx.opts.split_class_dir: - if len(import_req): - for im in import_req: - if im == parent.arg: - im += "_" - nfd.write("""import %s\n""" % safe_name(im)) - - # 'container', 'module', 'list' and 'submodule' all have their own classes - # generated. - if parent.keyword in ["container", "module", "list", "submodule", "input", - "output", "rpc"]: - if ctx.opts.split_class_dir: - nfd.write("class %s(PybindBase):\n" % safe_name(parent.arg)) - else: - if not path == "": - nfd.write("class yc_%s_%s_%s(PybindBase):\n" % (safe_name(parent.arg), - safe_name(module.arg), safe_name(path.replace("/", "_")))) - else: - nfd.write("class %s(PybindBase):\n" % safe_name(parent.arg)) - - # If the container is actually a list, then determine what the key value - # is and store this such that we can give a hint. - keyval = False - if parent.keyword == "list": - keyval = parent.search_one('key').arg if parent.search_one('key') \ - is not None else False - if keyval and " " in keyval: - keyval = keyval.split(" ") - else: - keyval = [keyval] - - # Auto-generate a docstring based on the description that is provided in - # the YANG module. 
This aims to provide readability to someone perusing the - # code that is generated. - parent_descr = parent.search_one('description') - if parent_descr is not None: - parent_descr = "\n\n YANG Description: %s" % \ - parent_descr.arg.decode('utf8').encode('ascii', 'ignore') - else: - parent_descr = "" - - # Add more helper text. - nfd.write(''' """ - This class was auto-generated by the PythonClass plugin for PYANG - from YANG module %s - based on the path %s. Each member element of - the container is represented as a class variable - with a specific - YANG type.%s - """\n''' % (module.arg, (path if not path == "" else "/%s" % parent.arg), - parent_descr)) - else: - raise TypeError("unhandled keyword with children %s" % parent.keyword) - - elements_str = "" - if len(elements) == 0: - nfd.write(" pass\n") - else: - # We want to prevent a user from creating new attributes on a class that - # are not allowed within the data model - this uses the __slots__ magic - # variable of the class to restrict anyone from adding to these classes. - # Doing so gives an AttributeError when a user tries to specify something - # that was not in the model. - elements_str = "_pyangbind_elements = {" - slots_str = " __slots__ = ('_pybind_generated_by', '_path_helper'," - slots_str += " '_yang_name', '_extmethods', " - for i in elements: - slots_str += "'__%s'," % i["name"] - elements_str += "'%s': %s, " % (i["name"], i["name"]) - slots_str += ")\n" - elements_str += "}\n" - nfd.write(slots_str + "\n") - # Store the real name of the element - since we often get values that are - # not allowed in python as identifiers, but we need the real-name when - # creating instance documents (e.g., peer-group is not valid due to '-'). - nfd.write(" _yang_name = '%s'\n" % (parent.arg)) - - choices = {} - choice_attrs = [] - classes = {} - for i in elements: - # Loop through the elements and build a string that corresponds to the - # class that is going to be created. 
In all cases (thus far) this uses - # the YANGDynClass helper function to generate a dynamic type. This - # can extend the base type that is provided, and does this to give us - # some attributes that base classes such as int(), or str() don't have - - # but YANG needs (such as a default value, the original YANG name, any - # extension that were provided with the leaf, etc.). - class_str = {} - if "default" in i and not i["default"] is None: - default_arg = "\"%s\"" % (i["default"]) if i["quote_arg"] else "%s" \ - % i["default"] - - if i["class"] == "leaf-list": - # Map a leaf-list to the type specified in the class map. This is a - # TypedList (see lib.yangtypes) with a particular set of types allowed. - class_str["name"] = "__%s" % (i["name"]) - class_str["type"] = "YANGDynClass" - class_str["arg"] = "base=" - if isinstance(i["type"]["native_type"][1], list): - allowed_type = "[" - for subtype in i["type"]["native_type"][1]: - allowed_type += "%s," % subtype - allowed_type += "]" - else: - allowed_type = "%s" % (i["type"]["native_type"][1]) - class_str["arg"] += "%s(allowed_type=%s)" % \ - (i["type"]["native_type"][0], allowed_type) - if "default" in i and not i["default"] is None: - class_str["arg"] += ", default=%s(%s)" % (i["defaulttype"], - default_arg) - elif i["class"] == "list": - # Map a list to YANGList class - this is dynamically derived by the - # YANGListType function to have the relevant characteristics, such as - # whether it is ordered by the user. 
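A toy sketch of the dynamic-typing idea above (this is a stand-in for illustration, not the real `YANGDynClass` helper): `type()` derives a class from a native Python base and attaches the YANG metadata that plain `int` or `str` cannot carry.

```python
def make_yang_type(base, yang_name, default):
  # Derive a new class from 'base' and hang YANG metadata off it;
  # instances still behave exactly like the underlying native type.
  attrs = {"_yang_name": yang_name, "_default": default}
  return type("Yang_%s" % base.__name__, (base,), attrs)


Mtu = make_yang_type(int, "mtu", 1500)
v = Mtu(9000)
# v behaves as an int (arithmetic, comparisons) but also carries its
# YANG name and default value as class attributes.
```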
- class_str["name"] = "__%s" % (i["name"]) - class_str["type"] = "YANGDynClass" - class_str["arg"] = "base=YANGListType(" - class_str["arg"] += "%s,%s" % ("\"%s\"" % i["key"] if i["key"] - else False, i["type"]) - class_str["arg"] += ", yang_name=\"%s\", parent=self" \ - % (i["yang_name"]) - class_str["arg"] += ", is_container='list', user_ordered=%s" \ - % i["user_ordered"] - class_str["arg"] += ", path_helper=self._path_helper" - if i["choice"]: - class_str["arg"] += ", choice=%s" % repr(choice) - class_str["arg"] += ")" - elif i["class"] == "union" or i["class"] == "leaf-union": - # A special mapped type where there is a union that just includes - # leaves this is mapped to a particular Union type, and valid types - # within it. The dynamically generated class will determine whether - # the input can be mapped to the types included in the union. - class_str["name"] = "__%s" % (i["name"]) - class_str["type"] = "YANGDynClass" - class_str["arg"] = "base=[" - for u in i["type"][1]: - if isinstance(u[1]["native_type"], list): - for su_native_type in u[1]["native_type"]: - class_str["arg"] += "%s," % su_native_type - else: - class_str["arg"] += "%s," % u[1]["native_type"] - class_str["arg"] += "]" - if "default" in i and not i["default"] is None: - class_str["arg"] += ", default=%s(%s)" % (i["defaulttype"], - default_arg) - elif i["class"] == "leafref": - # A leafref, pyangbind uses the special ReferenceType which performs a - # lookup against the path_helper class provided. 
- class_str["name"] = "__%s" % (i["name"]) - class_str["type"] = "YANGDynClass" - class_str["arg"] = "base=%s" % i["type"] - class_str["arg"] += "(referenced_path='%s'" % i["referenced_path"] - class_str["arg"] += ", caller=self._path() + ['%s'], " \ - % (i["yang_name"]) - class_str["arg"] += "path_helper=self._path_helper, " - class_str["arg"] += "require_instance=%s)" % (i["require_instance"]) - elif i["class"] == "leafref-list": - # Deal with the special case of a list of leafrefs, since the - # ReferenceType has different arguments that need to be provided to the - # class to properly initialise. - class_str["name"] = "__%s" % (i["name"]) - class_str["type"] = "YANGDynClass" - class_str["arg"] = "base=%s" % i["type"]["native_type"][0] - class_str["arg"] += "(allowed_type=%s(referenced_path='%s'," \ - % (i["type"]["native_type"][1]["native_type"], - i["type"]["native_type"][1]["referenced_path"]) - class_str["arg"] += "caller=self._path() + ['%s'], " % i["yang_name"] - class_str["arg"] += "path_helper=self._path_helper, " - class_str["arg"] += "require_instance=%s))" % \ - (i["type"]["native_type"][1]["require_instance"]) - else: - # Generically handle all other classes with the 'standard' mappings. 
- class_str["name"] = "__%s" % (i["name"]) - class_str["type"] = "YANGDynClass" - if isinstance(i["type"], list): - class_str["arg"] = "base=[" - for u in i["type"]: - class_str["arg"] += "%s," % u - class_str["arg"] += "]" - else: - class_str["arg"] = "base=%s" % i["type"] - if "default" in i and not i["default"] is None: - class_str["arg"] += ", default=%s(%s)" % (i["defaulttype"], - default_arg) - if i["class"] == "container": - class_str["arg"] += ", is_container='container'" - elif i["class"] == "list": - class_str["arg"] += ", is_container='list'" - elif i["class"] == "leaf-list": - class_str["arg"] += ", is_leaf=False" - else: - class_str["arg"] += ", is_leaf=True" - if class_str["arg"]: - class_str["arg"] += ", yang_name=\"%s\"" % i["yang_name"] - class_str["arg"] += ", parent=self" - if i["choice"]: - class_str["arg"] += ", choice=%s" % repr(i["choice"]) - choice_attrs.append(i["name"]) - if not i["choice"][0] in choices: - choices[i["choice"][0]] = {} - if not i["choice"][1] in choices[i["choice"][0]]: - choices[i["choice"][0]][i["choice"][1]] = [] - choices[i["choice"][0]][i["choice"][1]].append(i["name"]) - class_str["arg"] += ", path_helper=self._path_helper" - class_str["arg"] += ", extmethods=self._extmethods" - class_str["arg"] += ", register_paths=%s" % i["register_paths"] - if "extensions" in i: - class_str["arg"] += ", extensions=%s" % i["extensions"] - if keyval and i["yang_name"] in keyval: - class_str["arg"] += ", is_keyval=True" - classes[i["name"]] = class_str - - # TODO: get and set methods currently have errors that are reported that - # are a bit ugly. The intention here is to act like an immutable type - - # such that new class instances are created each time that the value is - # set. - - # Generic class __init__, set up the path_helper if asked to. 
- nfd.write(""" - _pybind_generated_by = 'container' - - def __init__(self, *args, **kwargs):\n""") - if ctx.opts.use_xpathhelper: - nfd.write(""" - helper = kwargs.pop("path_helper", None) - if helper is False: - self._path_helper = False - elif helper is not None and isinstance(helper, xpathhelper.YANGPathHelper): - self._path_helper = helper - elif hasattr(self, "_parent"): - helper = getattr(self._parent, "_path_helper", False) - self._path_helper = helper - else: - self._path_helper = False\n""") - else: - nfd.write(""" - self._path_helper = False\n""") - - if ctx.opts.use_extmethods: - nfd.write(""" - extmethods = kwargs.pop("extmethods", None) - if extmethods is False: - self._extmethods = False - elif extmethods is not None and isinstance(extmethods, dict): - self._extmethods = extmethods - elif hasattr(self, "_parent"): - extmethods = getattr(self._parent, "_extmethods", None) - self._extmethods = extmethods - else: - self._extmethods = False\n""") - else: - nfd.write(""" - self._extmethods = False\n""") - - # Write out the classes that are stored locally as self.__foo where - # foo is the safe YANG name. - for c in classes: - nfd.write(" self.%s = %s(%s)\n" % (classes[c]["name"], - classes[c]["type"], classes[c]["arg"])) - # Don't accept arguments to a container/list/submodule class - nfd.write(""" - if args: - if len(args) > 1: - raise TypeError("cannot create a YANG container with >1 argument") - all_attr = True - for e in self._pyangbind_elements: - if not hasattr(args[0], e): - all_attr = False - break - if not all_attr: - raise ValueError("Supplied object did not have the correct attributes") - for e in self._pyangbind_elements: - setmethod = getattr(self, "_set_%s" % e) - setmethod(getattr(args[0], e))\n""") - - # A generic method to provide a path() method on each container, that gives - # a path in the form of a list that describes the nodes in the hierarchy. 
- nfd.write(""" - def _path(self): - if hasattr(self, "_parent"): - return self._parent._path()+[self._yang_name] - else: - return %s\n""" % path.split("/")[1:]) - node = {} - - # For each element, write out a getter and setter method - with the doc - # string of the element within the model. - for i in elements: - c_str = classes[i["name"]] - description_str = "" - if i["description"]: - description_str = "\n\n YANG Description: %s" \ - % i["description"].decode('utf-8').encode('ascii', 'ignore') - nfd.write(''' - def _get_%s(self): - """ - Getter method for %s, mapped from YANG variable %s (%s)%s - """ - return self.__%s - ''' % (i["name"], i["name"], i["path"], i["origtype"], - description_str, i["name"])) - - nfd.write(''' - def _set_%s(self,v): - """ - Setter method for %s, mapped from YANG variable %s (%s) - If this variable is read-only (config: false) in the - source YANG file, then _set_%s is considered as a private - method. Backends looking to populate this variable should - do so via calling thisObj._set_%s() directly.%s - """''' % (i["name"], i["name"], i["path"], - i["origtype"], i["name"], i["name"], - description_str)) - nfd.write(""" - try: - t = %s(v,%s)""" % (c_str["type"], c_str["arg"])) - nfd.write(""" - except (TypeError, ValueError): - raise ValueError(\"\"\"%s must be of a type compatible with %s\"\"\") - self.__%s = t\n""" % (i["name"], c_str["arg"], i["name"])) - nfd.write(" if hasattr(self, '_set'):\n") - nfd.write(" self._set()\n") - - # When we want to return a value to its default, the unset method can - # be used. Generally, this is done in a choice where one branch needs to - # be set to the default, but may be used wherever re-initialiation of - # the object is required. - nfd.write(""" - def _unset_%s(self): - self.__%s = %s(%s)\n\n""" % (i["name"], i["name"], - c_str["type"], c_str["arg"],)) - - # When an element is read-only, write out the _set and _get methods, but - # we don't actually make the property object accessible. 
This ensures that - # where backends are populating the model, then they can do so via the - # _set_X method - but a 'normal' user can't just do container.X = 10. - for i in elements: - rw = True - if not i["config"]: - rw = False - elif not parent_cfg: - rw = False - elif keyval and i["yang_name"] in keyval: - rw = False - - if not rw: - nfd.write(""" %s = property(_get_%s)\n""" % (i["name"], i["name"])) - else: - nfd.write(""" %s = property(_get_%s, _set_%s)\n""" % - (i["name"], i["name"], i["name"])) - nfd.write("\n") - - # Store a list of the choices that are included within this module such that - # we can enforce each branch. - if choices: - nfd.write(" __choices__ = %s" % repr(choices)) - nfd.write("""\n %s\n""" % elements_str) - nfd.write("\n") - - if ctx.opts.split_class_dir: - nfd.close() - - return None - - -def build_elemtype(ctx, et, prefix=False): - # Build a dictionary which defines the type for the element. This is used - # both in the case that a typedef needs to be built, as well as on per-list - # basis. - cls = None - pattern_stmt = et.search_one('pattern') if not et.search_one('pattern') \ - is None else False - range_stmt = et.search_one('range') if not et.search_one('range') \ - is None else False - length_stmt = et.search_one('length') if not et.search_one('length') \ - is None else False - - # Determine whether there are any restrictions that are placed on this leaf, - # and build a dictionary of the different restrictions to be placed on the - # type.
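The read-only pattern above - both `_get_X` and `_set_X` exist, but only the getter is wired into `property()` for `config false` leaves - can be sketched with a hypothetical `Counters` class:

```python
class Counters(object):
  def __init__(self):
    self.__in_octets = 0

  def _get_in_octets(self):
    return self.__in_octets

  def _set_in_octets(self, v):
    # backends populating operational state call this method directly
    self.__in_octets = int(v)

  # no setter is supplied to property(), so 'normal' assignment fails
  in_octets = property(_get_in_octets)


c = Counters()
c._set_in_octets(42)    # backend path: allowed
try:
  c.in_octets = 100     # user path: raises AttributeError
except AttributeError:
  pass
```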
- restrictions = {} - if pattern_stmt: - restrictions['pattern'] = pattern_stmt.arg - - if length_stmt: - if "|" in length_stmt.arg: - restrictions['length'] = [i.replace(' ', '') for i in - length_stmt.arg.split("|")] - else: - restrictions['length'] = [length_stmt.arg] - - if range_stmt: - # Complex ranges are separated by pipes - if "|" in range_stmt.arg: - restrictions['range'] = [i.replace(' ', '') for i in - range_stmt.arg.split("|")] - else: - restrictions['range'] = [range_stmt.arg] - - # Build RestrictedClassTypes based on the compiled dictionary and the - # underlying base type. - if len(restrictions): - if 'length' in restrictions or 'pattern' in restrictions: - cls = "restricted-%s" % (et.arg) - elemtype = { - "native_type": - """RestrictedClassType(base_type=%s, restriction_dict=%s)""" - % (class_map[et.arg]["native_type"], repr(restrictions)), - "restriction_dict": restrictions, - "parent_type": et.arg, - "base_type": False, - } - elif 'range' in restrictions: - cls = "restricted-%s" % et.arg - elemtype = { - "native_type": - """RestrictedClassType(base_type=%s, restriction_dict=%s)""" - % (class_map[et.arg]["native_type"], repr(restrictions)), - "restriction_dict": restrictions, - "parent_type": et.arg, - "base_type": False, - } - - # Handle all other types of leaves that are not restricted classes. - if cls is None: - cls = "leaf" - # Enumerations are built as RestrictedClasses where the value that is - # provided to the class is checked against the keys of a dictionary.
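A miniature version of this restriction checking (the real implementation is the `RestrictedClassType` referenced in the generated code; this hedged sketch handles only `range` entries of the "lo..hi" form):

```python
def check_range(value, restrictions):
  # Accept the value only if it falls inside one of the "lo..hi"
  # range strings parsed out of the YANG 'range' statement.
  for r in restrictions.get("range", []):
    lo, hi = r.split("..")
    if int(lo) <= int(value) <= int(hi):
      return int(value)
  raise ValueError("%s does not match range %s" %
                   (value, restrictions.get("range")))


check_range(20, {"range": ["1..100"]})                # accepted
check_range(150, {"range": ["1..100", "140..160"]})   # second range matches
```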
- if et.arg == "enumeration": - enumeration_dict = {} - for enum in et.search('enum'): - enumeration_dict[unicode(enum.arg)] = {} - val = enum.search_one('value') - if val is not None: - enumeration_dict[unicode(enum.arg)]["value"] = int(val.arg) - elemtype = {"native_type": """RestrictedClassType(base_type=unicode, \ - restriction_type="dict_key", \ - restriction_arg=%s,)""" % - (enumeration_dict), - "restriction_argument": enumeration_dict, - "restriction_type": "dict_key", - "parent_type": "string", - "base_type": False} - # Map decimal64 to a RestrictedPrecisionDecimalType - this is there to - # ensure that the fraction-digits argument can be implemented. Note that - # fraction-digits is a mandatory argument. - elif et.arg == "decimal64": - fd_stmt = et.search_one('fraction-digits') - if fd_stmt is not None: - cls = "restricted-decimal64" - elemtype = {"native_type": - """RestrictedPrecisionDecimalType(precision=%s)""" % - fd_stmt.arg, "base_type": False, - "parent_type": "decimal64"} - else: - elemtype = class_map[et.arg] - # Handle unions - build a list of the supported types that are under the - # union. - elif et.arg == "union": - elemtype = [] - for uniontype in et.search('type'): - elemtype_s = copy.deepcopy(build_elemtype(ctx, uniontype)) - elemtype_s[1]["yang_type"] = uniontype.arg - elemtype.append(elemtype_s) - cls = "union" - # Map leafrefs to a ReferenceType, handling the referenced path, and - # whether require-instance is set. When xpathhelper is not specified, then - # no such mapping is done - at this point, we solely map to a string. 
- elif et.arg == "leafref": - path_stmt = et.search_one('path') - if path_stmt is None: - raise ValueError("leafref specified with no path statement") - require_instance = \ - class_bool_map[et.search_one('require-instance').arg] \ - if et.search_one('require-instance') \ - is not None else False - if ctx.opts.use_xpathhelper: - elemtype = {"native_type": "ReferenceType", - "referenced_path": path_stmt.arg, - "parent_type": "string", - "base_type": False, - "require_instance": require_instance} - cls = "leafref" - else: - elemtype = { - "native_type": "unicode", - "parent_type": "string", - "base_type": False, - } - # Handle identityrefs, but check whether there is a valid base where this - # has been specified. - elif et.arg == "identityref": - base_stmt = et.search_one('base') - if base_stmt is None: - raise ValueError("identityref specified with no base statement") - try: - elemtype = class_map[base_stmt.arg] - except KeyError: - sys.stderr.write("FATAL: identityref with an unknown base\n") - if DEBUG: - pp.pprint(class_map.keys()) - pp.pprint(et.arg) - pp.pprint(base_stmt.arg) - sys.exit(127) - else: - # For all other cases, then we should be able to look up directly in the - # class_map for the defined type, since these are not 'derived' types - # at this point. In the case that we are referencing a type that is a - # typedef, then this has been added to the class_map. 
- try: - elemtype = class_map[et.arg] - except KeyError: - passed = False - if prefix: - try: - tmp_name = "%s:%s" % (prefix, et.arg) - elemtype = class_map[tmp_name] - passed = True - except: - pass - if passed is False: - sys.stderr.write("FATAL: unmapped type (%s)\n" % (et.arg)) - if DEBUG: - pp.pprint(class_map.keys()) - pp.pprint(et.arg) - pp.pprint(prefix) - sys.exit(127) - if isinstance(elemtype, list): - cls = "leaf-union" - elif "class_override" in elemtype: - # this is used to propagate the fact that in some cases the - # native type needs to be dynamically built (e.g., leafref) - cls = elemtype["class_override"] - return (cls, elemtype) - - -def find_absolute_default_type(default_type, default_value, elemname): - if not isinstance(default_type, list): - return default_type - - for i in default_type: - if not i[1]["base_type"]: - test_type = class_map[i[1]["parent_type"]] - else: - test_type = i[1] - try: - tmp = test_type["pytype"](default_value) - default_type = test_type - break - except ValueError: - pass - return find_absolute_default_type(default_type, default_value, elemname) - - -def get_element(ctx, fd, element, module, parent, path, - parent_cfg=True, choice=False, register_paths=True): - # Handle mapping of an individual element within the model. This function - # produces a dictionary that can then be mapped into the relevant code that - # dynamically generates a class. - - this_object = [] - default = False - has_children = False - create_list = False - - elemdescr = element.search_one('description') - if elemdescr is None: - elemdescr = False - else: - elemdescr = elemdescr.arg - - # If the element has an i_children attribute then this is a container, list - # leaf-list or choice.
Alternatively, it can be the 'input' or 'output' - # substmts of an RPC - if hasattr(element, 'i_children'): - if element.keyword in ["container", "list", "input", "output"]: - has_children = True - elif element.keyword in ["leaf-list"]: - create_list = True - - # Fixup the path when within a choice, because this iteration believes that - # we are under a new container, but this does not exist in the path. - if element.keyword in ["choice"]: - path_parts = path.split("/") - npath = "" - for i in range(0, len(path_parts) - 1): - npath += "%s/" % path_parts[i] - npath.rstrip("/") - else: - npath = path - - # Create an element for a container. - if element.i_children: - chs = element.i_children - get_children(ctx, fd, chs, module, element, npath, parent_cfg=parent_cfg, - choice=choice, register_paths=register_paths) - - elemdict = { - "name": safe_name(element.arg), "origtype": element.keyword, - "class": element.keyword, - "path": safe_name(npath), "config": True, - "description": elemdescr, - "yang_name": element.arg, - "choice": choice, - "register_paths": register_paths, - } - # Handle the different cases of class name, this depends on whether we - # were asked to split the bindings into a directory structure or not. - if ctx.opts.split_class_dir: - # If we were dealing with split classes, then rather than naming the - # class based on a unique intra-file name - and rather we must import - # the relative path to the module.class - if element.arg == parent.arg: - modname = safe_name(element.arg) + "_" - else: - modname = safe_name(element.arg) - elemdict["type"] = "%s.%s" % (modname, safe_name(element.arg)) - - else: - # Otherwise, give a unique name for the class within the dictionary. - elemdict["type"] = "yc_%s_%s_%s" % (safe_name(element.arg), - safe_name(module.arg), - safe_name(path.replace("/", "_"))) - - # Deal with specific cases for list - such as the key and how it is - # ordered.
- if element.keyword == "list": - elemdict["key"] = safe_name(element.search_one("key").arg) \ - if element.search_one("key") is not None else False - user_ordered = element.search_one('ordered-by') - elemdict["user_ordered"] = True if user_ordered is not None \ - and user_ordered.arg.upper() == "USER" else False - this_object.append(elemdict) - has_children = True - - # Deal with the cases that the attribute does not have children. - if not has_children: - if element.keyword in ["leaf-list"]: - create_list = True - cls, elemtype = copy.deepcopy(build_elemtype(ctx, - element.search_one('type'))) - - # Determine what the default for the leaf should be where there are - # multiple available. - # Algorithm: - # - build a tree that is rooted on this class. - # - perform a breadth-first search - the first node found - # - that has the "default" leaf set, then we take this - # as the value for the default - - # then starting at the selected default node, traverse - # until we find a node that is declared to be a base_type - elemdefault = element.search_one('default') - default_type = False - quote_arg = False - if elemdefault is not None: - elemdefault = elemdefault.arg - default_type = elemtype - if isinstance(elemtype, list): - # this is a union, we should check whether any of the types - # immediately has a default - for i in elemtype: - if "default" in i[1]: - elemdefault = i[1]["default"] - default_type = i[1] - # default_type = i[1] - # mapped_elemtype = i[1] - elif "default" in elemtype: - # if the actual type defines the default, then we need to maintain - # this - elemdefault = elemtype["default"] - default_type = elemtype - - # we need to indicate that the default type for the class_map - # is str - tmp_class_map = copy.deepcopy(class_map) - tmp_class_map["enumeration"] = {"parent_type": "string"} - - if not default_type: - if isinstance(elemtype, list): - # this type has multiple parents - for i in elemtype: - if "parent_type" in i[1]: - if 
isinstance(i[1]["parent_type"], list): - to_visit = [j for j in i[1]["parent_type"]] - else: - to_visit = [i[1]["parent_type"]] - elif "parent_type" in elemtype: - if isinstance(elemtype["parent_type"], list): - to_visit = [i for i in elemtype["parent_type"]] - else: - to_visit = [elemtype["parent_type"]] - - checked = list() - while to_visit: - check = to_visit.pop(0) - if check not in checked: - checked.append(check) - if "parent_type" in tmp_class_map[check]: - if isinstance(tmp_class_map[check]["parent_type"], list): - to_visit.extend(tmp_class_map[check]["parent_type"]) - else: - to_visit.append(tmp_class_map[check]["parent_type"]) - - # checked now has the breadth-first search result - if elemdefault is None: - for option in checked: - if "default" in tmp_class_map[option]: - elemdefault = tmp_class_map[option]["default"] - default_type = tmp_class_map[option] - break - - if elemdefault is not None: - # we now need to check whether there's a need to - # find out what the base type is for this type - # we really expect a linear chain here. - - # if we have a tuple as the type here, this means that - # the default was set at a level where there was not - # a single option for the type. 
check the default - # against each option, to get to a single default_type - if isinstance(default_type, list): - default_type = find_absolute_default_type(default_type, elemdefault, - element.arg) - - if not default_type["base_type"]: - if "parent_type" in default_type: - if isinstance(default_type["parent_type"], list): - to_visit = [i for i in default_type["parent_type"]] - else: - to_visit = [default_type["parent_type"]] - checked = list() - while to_visit: - check = to_visit.pop(0) # remove from the top of stack - depth first - if check not in checked: - checked.append(check) - if "parent_type" in tmp_class_map[check]: - if isinstance(tmp_class_map[check]["parent_type"], list): - to_visit.extend(tmp_class_map[check]["parent_type"]) - else: - to_visit.append(tmp_class_map[check]["parent_type"]) - default_type = tmp_class_map[checked.pop()] - if not default_type["base_type"]: - raise TypeError("default type was not a base type") - - # Set the default type based on what was determined above about the - # correct value to set. - if default_type: - quote_arg = default_type["quote_arg"] if "quote_arg" in \ - default_type else False - default_type = default_type["native_type"] - - elemconfig = class_bool_map[element.search_one('config').arg] if \ - element.search_one('config') else True - - elemname = safe_name(element.arg) - - # Deal with the cases that there is a requirement to create a list - these - # are leaf lists. There is some special handling for leaf-lists to ensure - # that the references are correctly created.
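The breadth-first walk over `parent_type` links described above can be sketched with a hypothetical `class_map` fragment (the type names are invented for illustration); the first ancestor that carries a `default` supplies the value:

```python
class_map = {
  "percentage": {"parent_type": "uint8", "base_type": False},
  "uint8": {"parent_type": "int", "base_type": False, "default": 0},
  "int": {"base_type": True},
}


def resolve_default(typename):
  to_visit, checked = [typename], []
  while to_visit:
    check = to_visit.pop(0)       # FIFO pop gives breadth-first order
    if check in checked:
      continue
    checked.append(check)
    entry = class_map[check]
    if "default" in entry:
      return entry["default"]
    parents = entry.get("parent_type", [])
    to_visit.extend(parents if isinstance(parents, list) else [parents])
  return None
```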
- if create_list: - if not cls == "leafref": - cls = "leaf-list" - - if isinstance(elemtype, list): - c = 0 - allowed_types = [] - for subtype in elemtype: - # nested union within a leaf-list type - if isinstance(subtype, tuple): - if subtype[0] == "leaf-union": - for subelemtype in subtype[1]["native_type"]: - allowed_types.append(subelemtype) - else: - if isinstance(subtype[1]["native_type"], list): - allowed_types.extend(subtype[1]["native_type"]) - else: - allowed_types.append(subtype[1]["native_type"]) - else: - allowed_types.append(subtype["native_type"]) - else: - allowed_types = elemtype["native_type"] - else: - cls = "leafref-list" - allowed_types = { - "native_type": elemtype["native_type"], - "referenced_path": elemtype["referenced_path"], - "require_instance": elemtype["require_instance"], - } - elemntype = {"class": cls, "native_type": ("TypedListType", - allowed_types)} - - else: - if cls == "union" or cls == "leaf-union": - elemtype = {"class": cls, "native_type": ("UnionType", elemtype)} - elemntype = elemtype["native_type"] - - # Build the dictionary for the element with the relevant meta-data - # specified within it. - elemdict = { - "name": elemname, "type": elemntype, - "origtype": element.search_one('type').arg, "path": - safe_name(path), - "class": cls, "default": elemdefault, - "config": elemconfig, "defaulttype": default_type, - "quote_arg": quote_arg, - "description": elemdescr, "yang_name": element.arg, - "choice": choice, - "register_paths": register_paths, - } - if cls == "leafref": - elemdict["referenced_path"] = elemtype["referenced_path"] - elemdict["require_instance"] = elemtype["require_instance"] - - # In cases where there there are a set of interesting extensions specified - # then build a dictionary of these extension values to provide with the - # specific leaf for this element. 
- if element.substmts is not None and \ - ctx.opts.pybind_interested_exts is not None: - extensions = {} - for ext in element.substmts: - if ext.keyword[0] in ctx.opts.pybind_interested_exts: - if not ext.keyword[0] in extensions: - extensions[ext.keyword[0]] = {} - extensions[ext.keyword[0]][ext.keyword[1]] = ext.arg - if len(extensions): - elemdict["extensions"] = extensions - - this_object.append(elemdict) - return this_object diff --git a/pybind.py b/pybind.py new file mode 120000 index 00000000..b4567c33 --- /dev/null +++ b/pybind.py @@ -0,0 +1 @@ +pyangbind/plugin/pybind.py \ No newline at end of file diff --git a/requirements.txt b/requirements.txt index 5d0d3c77..a8f3955c 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,3 +1,4 @@ numpy pyang bitarray +lxml diff --git a/setup.py b/setup.py new file mode 100644 index 00000000..0c75f2d6 --- /dev/null +++ b/setup.py @@ -0,0 +1,42 @@ +from setuptools import setup, find_packages +from codecs import open +from os import path + +thisdir = path.abspath(path.dirname(__file__)) + +with open(path.join(thisdir, "README.rst"), encoding='utf-8') as readme: + long_description = readme.read() + +setup( + name='pyangbind', + + # PyangBind uses the same versioning approach as OpenConfig - see + # http://www.openconfig.net/file-cabinet/Semantic_Versioning_for_OpenConfig.pdf?attredirects=0&d=1 + version='0.2.0', + + description="PyangBind is a plugin for pyang which converts YANG data" + \ + "models into a Python class hierarchy, such that Python " + \ + "can be used to manipulate data that conforms with a YANG" + \ + " model.", + long_description=long_description, + + url="https://github.com/robshakir/pyangbind", + + author="Rob Shakir", + author_email="rjs@rob.sh", + + license="Apache", + classifiers=[ + 'Development Status :: 4 - Beta', + 'Intended Audience :: Telecommunications Industry', + 'Intended Audience :: Developers', + 'Topic :: Software Development :: Code Generators', + 'License :: OSI Approved :: Apache
Software License', + 'Programming Language :: Python :: 2.7', + 'Programming Language :: Python :: 2 :: Only' + ], + include_package_data=True, + keywords="yang pyang", + packages=find_packages(exclude=['lib']), + install_requires=['numpy', 'pyang', 'bitarray', 'lxml'], +) diff --git a/tests/binary/lib b/tests/binary/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/binary/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/binary/run.py b/tests/binary/run.py index 784e32a2..d803a22c 100755 --- a/tests/binary/run.py +++ b/tests/binary/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import binary as b from bitarray import bitarray diff --git a/tests/boolean-empty/lib b/tests/boolean-empty/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/boolean-empty/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/boolean-empty/run.py b/tests/boolean-empty/run.py index 3d4c0586..452f0d3c 100755 --- a/tests/boolean-empty/run.py +++ b/tests/boolean-empty/run.py @@ -21,6 +21,9 @@ def main(): if o in ["-k", 
"--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -29,8 +32,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import boolean_empty as b diff --git a/tests/choice/lib b/tests/choice/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/choice/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/choice/run.py b/tests/choice/run.py index 2ff0f2d5..3eacd0c4 100755 --- a/tests/choice/run.py +++ b/tests/choice/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: keepfiles = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, 
pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import choice t = choice() diff --git a/tests/config-false/lib b/tests/config-false/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/config-false/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/config-false/run.py b/tests/config-false/run.py index d9afbf70..83629270 100755 --- a/tests/config-false/run.py +++ b/tests/config-false/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: keepfiles = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,12 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import config_false diff --git a/tests/decimal64/lib b/tests/decimal64/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/decimal64/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/decimal64/run.py b/tests/decimal64/run.py index 614d9d5b..ade06f53 100755 --- a/tests/decimal64/run.py +++ b/tests/decimal64/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if 
\ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import decimal_ as d from decimal import Decimal diff --git a/tests/enumeration/lib b/tests/enumeration/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/enumeration/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/enumeration/run.py b/tests/enumeration/run.py index cab75d46..d4023565 100755 --- a/tests/enumeration/run.py +++ b/tests/enumeration/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o 
%s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import enumeration t = enumeration() diff --git a/tests/identityref/lib b/tests/identityref/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/identityref/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/identityref/run.py b/tests/identityref/run.py index 48135da9..974578d3 100755 --- a/tests/identityref/run.py +++ b/tests/identityref/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: keepfiles = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,10 +31,12 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind \ - -p %s \ - -o %s/bindings.py %s/%s.yang" % (pyangpath, pyangbindpath, - this_dir, this_dir, this_dir, TESTNAME)) + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import identityref diff --git a/tests/include-import/lib b/tests/include-import/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/include-import/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/include-import/run.py b/tests/include-import/run.py index 50101872..ffafa803 100755 --- a/tests/include-import/run.py +++ b/tests/include-import/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: keepfiles = True + pythonpath = 
os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,10 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind \ - -p %s \ - -o %s/bindings.py %s/%s.yang" % (pyangpath, pyangbindpath, - this_dir, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import include_import diff --git a/tests/int/lib b/tests/int/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/int/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/int/run.py b/tests/int/run.py index d622777b..b5499840 100755 --- a/tests/int/run.py +++ b/tests/int/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % 
this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import int_ as i diff --git a/tests/leaf-list/lib b/tests/leaf-list/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/leaf-list/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/leaf-list/run.py b/tests/leaf-list/run.py index 7ce94443..5a570b92 100755 --- a/tests/leaf-list/run.py +++ b/tests/leaf-list/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,14 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) + from bindings import leaflist leaflist_instance = leaflist() diff --git a/tests/list/lib b/tests/list/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/list/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/list/run.py b/tests/list/run.py index d40b4e9b..68e9328f 100755 --- a/tests/list/run.py +++ b/tests/list/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = 
os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import list_ as l diff --git a/tests/nested-containers/lib b/tests/nested-containers/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/nested-containers/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/nested-containers/run.py b/tests/nested-containers/run.py index bf006cda..9174a1a3 100755 --- a/tests/nested-containers/run.py +++ b/tests/nested-containers/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, 
TESTNAME) + os.system(cmd) from bindings import nested diff --git a/tests/openconfig-bgp-juniper/lib b/tests/openconfig-bgp-juniper/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/openconfig-bgp-juniper/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/openconfig-bgp-juniper/run.py b/tests/openconfig-bgp-juniper/run.py index a5753d8f..4b19218b 100755 --- a/tests/openconfig-bgp-juniper/run.py +++ b/tests/openconfig-bgp-juniper/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import openconfig_bgp_juniper diff --git a/tests/rpc/lib b/tests/rpc/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/rpc/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/rpc/run.py b/tests/rpc/run.py index 94313ac9..a486208d 100755 --- a/tests/rpc/run.py +++ b/tests/rpc/run.py @@ -20,13 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True - pyangpath = os.environ.get('PYANGPATH') if \ - os.environ.get('PYANGPATH') is not None else False - pyangbindpath = os.environ.get('PYANGBINDPATH') if \ - 
os.environ.get('PYANGBINDPATH') is not None else False - assert pyangpath is not False, "could not find path to pyang" - assert pyangbindpath is not False, "could not resolve pyangbind directory" - + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -35,12 +31,17 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system(("%s --plugindir %s -f pybind --split-class-dir=%s/bindings " + - "--build-rpc --use-xpathhelper " + - "--pybind-class-dir=%s %s/%s.yang") % (pyangpath, pyangbindpath, - this_dir, pyangbindpath, this_dir, TESTNAME)) - from lib.xpathhelper import YANGPathHelper + cmd = "%s "% pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind " + cmd += " --split-class-dir=%s/bindings" % this_dir + cmd += " -p %s" % this_dir + cmd += " --use-xpathhelper --build-rpc" + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) + + from pyangbind.lib.xpathhelper import YANGPathHelper ph = YANGPathHelper() import_error = None diff --git a/tests/run.sh b/tests/run.sh index 4b406a5d..b4b01587 100755 --- a/tests/run.sh +++ b/tests/run.sh @@ -1,7 +1,39 @@ #!/bin/bash +# Find where we are meant to run the tests TESTDIR="$(cd -P "$(dirname "${BASH_SOURCE[0]}")" && pwd)" + +# Build a virtual environment to do the tests in +cd $TESTDIR/.. +rm -rf $TESTDIR/pyvirtualenv $TESTDIR/../dist $TESTDIR/../build $TESTDIR/../pyangbind.egg-info + +echo "RUNNING PACKAGING..." +python setup.py bdist_wheel sdist > /dev/null +if [ $? -ne 0 ]; then + echo "RESULT: CANNOT RUN TESTS, BROKEN PACKAGING" + exit +fi + +echo "CREATING VIRTUALENV..." +virtualenv $TESTDIR/pyvirtualenv > /dev/null +if [ $? 
-ne 0 ]; then + echo "RESULT: CANNOT RUN TESTS, BROKEN VIRTUALENV" + exit +fi + +source $TESTDIR/pyvirtualenv/bin/activate + +echo "INSTALLING MODULE..." +$TESTDIR/pyvirtualenv/bin/pip install $TESTDIR/../dist/*.whl > /dev/null +if [ $? -ne 0 ]; then + echo "RESULT: CANNOT RUN TESTS, BROKEN INSTALL" + exit +fi + +export PATH_TO_PYBIND_TEST_PYTHON="$TESTDIR/pyvirtualenv/bin/python" + $TESTDIR/yang_tests.sh $TESTDIR/xpath/xpath_tests.sh $TESTDIR/serialise/serialise_tests.sh +rm -rf $TESTDIR/pyvirtualenv $TESTDIR/../dist $TESTDIR/../build $TESTDIR/../pyangbind.egg-info diff --git a/tests/serialise/json/lib b/tests/serialise/json/lib deleted file mode 120000 index a5bc7439..00000000 --- a/tests/serialise/json/lib +++ /dev/null @@ -1 +0,0 @@ -../../../lib \ No newline at end of file diff --git a/tests/serialise/json/run.py b/tests/serialise/json/run.py index 7d64e809..e9bc77ae 100755 --- a/tests/serialise/json/run.py +++ b/tests/serialise/json/run.py @@ -4,7 +4,7 @@ import sys import getopt import json -from lib.serialise import pybindJSONEncoder +from pyangbind.lib.serialise import pybindJSONEncoder TESTNAME = "serialise-json" @@ -22,6 +22,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -30,8 +33,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s "% pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " 
%s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import serialise_json js = serialise_json() diff --git a/tests/serialise/serialise_tests.sh b/tests/serialise/serialise_tests.sh index 80b3e690..3a49f41c 100755 --- a/tests/serialise/serialise_tests.sh +++ b/tests/serialise/serialise_tests.sh @@ -6,7 +6,14 @@ FAIL=0 TESTDIR="$(cd -P "$(dirname "${BASH_SOURCE[0]}")" && pwd)" export PYANGBINDPATH=$TESTDIR/../.. export PYANGPATH=`which pyang` -export XPATHLIBDIR=$TESTDIR/../../lib/ +#export XPATHLIBDIR=$TESTDIR/../../lib/ + +if [ -z "$PATH_TO_PYBIND_TEST_PYTHON" ]; then + echo "INFO: Testing against system pyangbind library" + PYPATH=`which python` +else + PYPATH=$PATH_TO_PYBIND_TEST_PYTHON +fi if [ $# -eq 0 ]; then FAIL=0 @@ -14,9 +21,9 @@ if [ $# -eq 0 ]; then if [ -e $i/run.py ]; then echo "TESTING $i..." if [ "$DEL" = true ]; then - $i/run.py > /dev/null + $PYPATH $i/run.py > /dev/null else - $i/run.py -k > /dev/null + $PYPATH $i/run.py -k > /dev/null fi if [ $? -ne 0 ]; then echo "TEST FAILED $i"; @@ -34,10 +41,10 @@ else if [ -e $i/run.py ]; then echo "TESTING $i..." if [ "$DEL" = true ]; then - $i/run.py + $PYPATH $i/run.py #> /dev/null else - $i/run.py -k + $PYPATH $i/run.py -k #> /dev/null fi if [ $? 
-ne 0 ]; then diff --git a/tests/split-classes/lib b/tests/split-classes/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/split-classes/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/split-classes/run.py b/tests/split-classes/run.py index 48d7c63c..333f454e 100755 --- a/tests/split-classes/run.py +++ b/tests/split-classes/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: keepfiles = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,9 +31,14 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system(("%s --plugindir %s -f pybind --split-class-dir=%s/bindings " + - "--pybind-class-dir=%s %s/%s.yang") % (pyangpath, pyangbindpath, - this_dir, pyangbindpath, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind " + cmd += " --split-class-dir=%s/bindings" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import split_classes diff --git a/tests/string/lib b/tests/string/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/string/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/string/run.py b/tests/string/run.py index 96843ef9..25dd2ac3 100755 --- a/tests/string/run.py +++ b/tests/string/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = 
os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import string test_instance = string() diff --git a/tests/typedef/lib b/tests/typedef/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/typedef/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/typedef/run.py b/tests/typedef/run.py index 78bfca52..868a4f77 100755 --- a/tests/typedef/run.py +++ b/tests/typedef/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,10 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind \ - -p %s \ - -o %s/bindings.py %s/%s.yang" % (pyangpath, pyangbindpath, - this_dir, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) 
from bindings import typedef t = typedef() diff --git a/tests/uint/lib b/tests/uint/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/uint/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/uint/run.py b/tests/uint/run.py index 8f770078..29f6fbd2 100755 --- a/tests/uint/run.py +++ b/tests/uint/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import uint diff --git a/tests/union/lib b/tests/union/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/union/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/union/run.py b/tests/union/run.py index c1cff938..0daffdc6 100755 --- a/tests/union/run.py +++ b/tests/union/run.py @@ -20,6 +20,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if \ os.environ.get('PYANGPATH') is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -28,8 +31,13 @@ 
def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system("%s --plugindir %s -f pybind -o %s/bindings.py %s/%s.yang" % - (pyangpath, pyangbindpath, this_dir, this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import union import numpy diff --git a/tests/xpath/00_pathhelper_base.py b/tests/xpath/00_pathhelper_base.py index 71cd9406..9ecc2cf6 100755 --- a/tests/xpath/00_pathhelper_base.py +++ b/tests/xpath/00_pathhelper_base.py @@ -113,8 +113,5 @@ def t4_retr_obj_error(tree=False): "an XPathError") if __name__ == '__main__': - import_path = os.path.realpath(os.path.dirname(os.path.realpath(__file__)) + - "/../../lib") - sys.path.insert(0, import_path) - from xpathhelper import YANGPathHelper, XPathError + from pyangbind.lib.xpathhelper import YANGPathHelper, XPathError main() diff --git a/tests/xpath/01-list_leaflist/lib b/tests/xpath/01-list_leaflist/lib deleted file mode 120000 index a5bc7439..00000000 --- a/tests/xpath/01-list_leaflist/lib +++ /dev/null @@ -1 +0,0 @@ -../../../lib \ No newline at end of file diff --git a/tests/xpath/01-list_leaflist/run.py b/tests/xpath/01-list_leaflist/run.py index aa7d01a2..9b6be54c 100755 --- a/tests/xpath/01-list_leaflist/run.py +++ b/tests/xpath/01-list_leaflist/run.py @@ -19,6 +19,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if os.environ.get('PYANGPATH') \ is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -27,9 +30,15 @@ def main(): assert pyangbindpath is not False, "could not 
resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system(("%s --plugindir %s -f pybind -o %s/bindings.py " + - "--use-xpathhelper %s/%s.yang") % (pyangpath, pyangbindpath, this_dir, - this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " --use-xpathhelper" + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + print cmd + os.system(cmd) from bindings import list_tc01 as ytest @@ -301,8 +310,5 @@ def t7_leaflist_of_leafrefs(yobj, tree=False): "was not correctly set (%s -> %s, expected %s)" % (b[0], passed, b[1]) if __name__ == '__main__': - import_path = os.path.realpath(os.path.dirname(os.path.realpath(__file__)) + - "/../../../") - sys.path.insert(0, import_path) - from lib.xpathhelper import YANGPathHelper, XPathError + from pyangbind.lib.xpathhelper import YANGPathHelper, XPathError main() diff --git a/tests/xpath/02-static_ptr/lib b/tests/xpath/02-static_ptr/lib deleted file mode 120000 index a5bc7439..00000000 --- a/tests/xpath/02-static_ptr/lib +++ /dev/null @@ -1 +0,0 @@ -../../../lib \ No newline at end of file diff --git a/tests/xpath/02-static_ptr/run.py b/tests/xpath/02-static_ptr/run.py index 75ae71a7..4cda4287 100755 --- a/tests/xpath/02-static_ptr/run.py +++ b/tests/xpath/02-static_ptr/run.py @@ -19,6 +19,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if os.environ.get('PYANGPATH') \ is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -27,9 +30,14 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system(("%s --plugindir %s -f pybind -o 
%s/bindings.py " + - "--use-xpathhelper %s/%s.yang") % (pyangpath, pyangbindpath, this_dir, - this_dir, TESTNAME)) + + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " --use-xpathhelper" + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import ptr_tc02 as ytest @@ -63,10 +71,6 @@ def t1_listkey(yobj, tree=False): if del_tree: del tree - if __name__ == '__main__': - import_path = os.path.realpath(os.path.dirname(os.path.realpath(__file__)) + - "/../../..") - sys.path.insert(0, import_path) - from lib.xpathhelper import YANGPathHelper, XPathError + from pyangbind.lib.xpathhelper import YANGPathHelper, XPathError main() diff --git a/tests/xpath/03-current/lib b/tests/xpath/03-current/lib deleted file mode 120000 index a5bc7439..00000000 --- a/tests/xpath/03-current/lib +++ /dev/null @@ -1 +0,0 @@ -../../../lib \ No newline at end of file diff --git a/tests/xpath/03-current/run.py b/tests/xpath/03-current/run.py index a1962413..8d0d9531 100755 --- a/tests/xpath/03-current/run.py +++ b/tests/xpath/03-current/run.py @@ -19,6 +19,9 @@ def main(): if o in ["-k", "--keepfiles"]: k = True + pythonpath = os.environ.get("PATH_TO_PYBIND_TEST_PYTHON") if \ + os.environ.get('PATH_TO_PYBIND_TEST_PYTHON') is not None \ + else sys.executable pyangpath = os.environ.get('PYANGPATH') if os.environ.get('PYANGPATH') \ is not None else False pyangbindpath = os.environ.get('PYANGBINDPATH') if \ @@ -27,10 +30,14 @@ def main(): assert pyangbindpath is not False, "could not resolve pyangbind directory" this_dir = os.path.dirname(os.path.realpath(__file__)) - os.system(("%s --plugindir %s -f pybind -o %s/bindings.py " + - "--use-xpathhelper %s/%s.yang") % (pyangpath, pyangbindpath, this_dir, - this_dir, TESTNAME)) + cmd = "%s " % pythonpath + cmd += "%s --plugindir %s/pyangbind/plugin" % (pyangpath, pyangbindpath) + cmd += " -f 
pybind -o %s/bindings.py" % this_dir + cmd += " -p %s" % this_dir + cmd += " --use-xpathhelper" + cmd += " %s/%s.yang" % (this_dir, TESTNAME) + os.system(cmd) from bindings import current_tc03 yhelper = YANGPathHelper() yobj = current_tc03(path_helper=yhelper) @@ -69,10 +76,6 @@ def t1_currentref(yobj, tree=False): if del_tree: del yhelper - if __name__ == '__main__': - import_path = os.path.realpath(os.path.dirname(os.path.realpath(__file__)) + - "/../../..") - sys.path.insert(0, import_path) - from lib.xpathhelper import YANGPathHelper, XPathError + from pyangbind.lib.xpathhelper import YANGPathHelper, XPathError main() diff --git a/tests/xpath/lib b/tests/xpath/lib deleted file mode 120000 index 58677ddb..00000000 --- a/tests/xpath/lib +++ /dev/null @@ -1 +0,0 @@ -../../lib \ No newline at end of file diff --git a/tests/xpath/xpath_tests.sh b/tests/xpath/xpath_tests.sh index c387bd25..daca30ad 100755 --- a/tests/xpath/xpath_tests.sh +++ b/tests/xpath/xpath_tests.sh @@ -5,8 +5,15 @@ TESTDIR="$(cd -P "$(dirname "${BASH_SOURCE[0]}")" && pwd)" export PYANGBINDPATH=$TESTDIR/../.. export PYANGPATH=`which pyang` +if [ -z "$PATH_TO_PYBIND_TEST_PYTHON" ]; then + echo "INFO: Testing against system pyangbind library" + PYPATH=`which python` +else + PYPATH=$PATH_TO_PYBIND_TEST_PYTHON +fi + echo "RUNNING BASE" -/usr/bin/env python $TESTDIR/00_pathhelper_base.py >/dev/null +$PYPATH $TESTDIR/00_pathhelper_base.py >/dev/null if [ $? -ne 0 ]; then echo "RESULT: CANNOT RUN TESTS, BROKEN PLUGIN" exit @@ -17,9 +24,9 @@ if [ $# -eq 0 ]; then for i in `find $TESTDIR -mindepth 1 -maxdepth 1 -type d`; do echo "TESTING $i..." if [ "$DEL" = true ]; then - $i/run.py > /dev/null + $PYPATH $i/run.py > /dev/null else - $i/run.py -k > /dev/null + $PYPATH $i/run.py -k > /dev/null fi if [ $? -ne 0 ]; then echo "TEST FAILED $i"; @@ -35,9 +42,9 @@ else for i in "$@"; do echo "TESTING $i..." 
if [ "$DEL" = true ]; then - $i/run.py + $PYPATH $i/run.py else - $i/run.py -k + $PYPATH $i/run.py -k fi if [ $? -ne 0 ]; then echo "TEST FAILED $i"; diff --git a/tests/yang_tests.sh b/tests/yang_tests.sh index 498baad0..e88a7aac 100755 --- a/tests/yang_tests.sh +++ b/tests/yang_tests.sh @@ -8,8 +8,15 @@ export PYANGBINDPATH=$TESTDIR/.. export PYANGPATH=`which pyang` export XPATHLIBDIR=$TESTDIR/../lib/ +if [ -z "$PATH_TO_PYBIND_TEST_PYTHON" ]; then + echo "INFO: Testing against system pyangbind library" + PYPATH=`which python` +else + PYPATH=$PATH_TO_PYBIND_TEST_PYTHON +fi + echo "RUNNING BASE" -$PYANGPATH --plugindir $PYANGBINDPATH -f pybind $TESTDIR/base-test.yang -o /tmp/chkplugin.pyang >/dev/null +$PYPATH $PYANGPATH --plugindir $PYANGBINDPATH/pyangbind/plugin -f pybind $TESTDIR/base-test.yang -o /tmp/chkplugin.pyang >/dev/null if [ $? -ne 0 ]; then echo "RESULT: CANNOT RUN TESTS, BROKEN PLUGIN" exit @@ -22,9 +29,9 @@ if [ $# -eq 0 ]; then if [ -e $i/run.py ]; then echo "TESTING $i..." if [ "$DEL" = true ]; then - $i/run.py > /dev/null + $PYPATH $i/run.py > /dev/null else - $i/run.py -k > /dev/null + $PYPATH $i/run.py -k > /dev/null fi if [ $? -ne 0 ]; then echo "TEST FAILED $i"; @@ -42,9 +49,9 @@ else if [ -e $i/run.py ]; then echo "TESTING $i..." if [ "$DEL" = true ]; then - $i/run.py + $PYPATH $i/run.py else - $i/run.py -k + $PYPATH $i/run.py -k fi if [ $? -ne 0 ]; then echo "TEST FAILED $i";
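The same interpreter-selection boilerplate recurs in every `tests/*/run.py` hunk above: prefer the interpreter named by `PATH_TO_PYBIND_TEST_PYTHON` (exported by `tests/run.sh` to point at the throwaway virtualenv's python), and fall back to `sys.executable` otherwise. A minimal sketch of that pattern — the function name `test_python` is illustrative, not from the patch, and the double `os.environ.get(...) if ... is not None else ...` in the diff is collapsed to a single `get` with a default, which behaves identically:

```python
import os
import sys


def test_python():
    """Return the interpreter used to invoke pyang for binding generation.

    tests/run.sh exports PATH_TO_PYBIND_TEST_PYTHON pointing at the
    virtualenv's python; when it is unset, fall back to the interpreter
    currently running this script.
    """
    return os.environ.get("PATH_TO_PYBIND_TEST_PYTHON", sys.executable)


# Each run.py then prefixes the pyang invocation with this interpreter:
#   cmd = "%s %s --plugindir %s/pyangbind/plugin ..." % (pythonpath, pyangpath, pyangbindpath)
pythonpath = test_python()
```

This keeps the generated-bindings step and the test assertions running inside the freshly built virtualenv, so the tests exercise the installed wheel rather than the working tree.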