Add introspection rules support
This patch introduces a simple JSON-based DSL for rules that run on introspected data. Conditions and actions are provided via new plugin entry points. The PUT operation on a rule is not included in this patch and can be added later; likewise, not all planned conditions and actions are included here and will follow in later patches. Implements: blueprint rules Change-Id: If4d17b5f1462d03879cb4c2ff4e5cb3ea364b697
This commit is contained in:
parent
f21eb0ab8c
commit
eb9b3da67a
@@ -175,8 +175,63 @@ Writing a Plugin
  ``ironic_inspector.hooks.node_not_found`` namespace and enable it in the
  configuration file (``processing.node_not_found_hook`` option).

* **ironic-inspector** allows more condition types to be added for
  `Introspection Rules`_. Inherit the ``RuleConditionPlugin`` class defined in
  the ironic_inspector.plugins.base_ module and override at least the following
  method:

  ``check(node_info,field,params,**)``
      called to check that the condition holds for a given field. The field
      value is provided as the ``field`` argument, ``params`` is a dictionary
      defined at the time of condition creation. Returns a boolean value.

  The following methods and attributes may also be overridden:

  ``validate(params,**)``
      called to validate the parameters provided during condition creation.
      The default implementation requires the keys listed in
      ``REQUIRED_PARAMS`` (and only them).

  ``REQUIRED_PARAMS``
      contains the set of required parameters used by the default
      implementation of ``validate``; defaults to the ``value`` parameter.

  ``ALLOW_NONE``
      if set to ``True``, missing fields are passed as ``None`` values
      instead of failing the condition. Defaults to ``False``.

  Make your plugin a setuptools entry point under the
  ``ironic_inspector.rules.conditions`` namespace.

* **ironic-inspector** allows more action types to be added for `Introspection
  Rules`_. Inherit the ``RuleActionPlugin`` class defined in the
  ironic_inspector.plugins.base_ module and override at least the following
  method:

  ``apply(node_info,params,**)``
      called to apply the action.

  The following methods and attributes may also be overridden:

  ``rollback(node_info,params,**)``
      called to clean up when the conditions were not met.
      The default implementation does nothing.

  ``validate(params,**)``
      called to validate the parameters provided during action creation.
      The default implementation requires the keys listed in
      ``REQUIRED_PARAMS`` (and only them).

  ``REQUIRED_PARAMS``
      contains the set of required parameters used by the default
      implementation of ``validate``; defaults to no parameters.

  Make your plugin a setuptools entry point under the
  ``ironic_inspector.rules.actions`` namespace.

.. note::
    The ``**`` argument is needed so that we can add optional arguments
    without breaking out-of-tree plugins. Please make sure to include and
    ignore it.

.. _ironic_inspector.plugins.base: https://github.com/openstack/ironic-inspector/blob/master/ironic_inspector/plugins/base.py
.. _Introspection Rules: https://github.com/openstack/ironic-inspector#introspection-rules
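To make the two plugin interfaces above concrete, here is a minimal sketch of an out-of-tree condition and action. The class names and the ``values``/``note`` parameters are hypothetical examples, not part of this patch; only the base classes, their method signatures and the ``node_info`` helpers are taken from ``ironic_inspector.plugins.base`` and ``ironic_inspector.node_cache`` as changed here::

    # Hypothetical out-of-tree plugins (sketch only).
    from ironic_inspector.plugins import base


    class InCondition(base.RuleConditionPlugin):
        """Pass when the field value is in a provided list (example only)."""

        REQUIRED_PARAMS = {'values'}  # checked by WithValidation.validate

        def check(self, node_info, field, params, **kwargs):
            # 'field' is the value found by the JSON path from the rule,
            # 'params' is the dictionary stored with the condition.
            return field in params['values']


    class ExtraNoteAction(base.RuleActionPlugin):
        """Set /extra/note on the Ironic node (example only)."""

        REQUIRED_PARAMS = {'note'}

        def apply(self, node_info, params, **kwargs):
            node_info.patch([{'op': 'add', 'path': '/extra/note',
                              'value': params['note']}])

        def rollback(self, node_info, params, **kwargs):
            # Undo the effect when the conditions no longer match; skip the
            # patch if the attribute was never set.
            try:
                node_info.get_by_path('/extra/note')
            except KeyError:
                return
            node_info.patch([{'op': 'remove', 'path': '/extra/note'}])

Such classes would then be registered in the plugin package's ``setup.cfg`` under the ``ironic_inspector.rules.conditions`` and ``ironic_inspector.rules.actions`` entry point namespaces, respectively.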
HTTP-API.rst (65 lines changed)
@@ -67,6 +67,70 @@ Response:

Response body: JSON dictionary with introspection data

Introspection Rules
~~~~~~~~~~~~~~~~~~~

See `Introspection Rules documentation`_ for details.

All these API endpoints require the X-Auth-Token header with a Keystone token
for authentication.

* ``POST /v1/rules`` create a new introspection rule.

  Request body: JSON dictionary with keys:

  * ``conditions`` rule conditions, see `Introspection Rules documentation`_
  * ``actions`` rule actions, see `Introspection Rules documentation`_
  * ``description`` (optional) human-readable description
  * ``uuid`` (optional) rule UUID, autogenerated if missing

  Response

  * 200 - OK
  * 400 - bad request

  Response body: JSON dictionary with the introspection rule representation
  (the same as above with the UUID filled in).
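For illustration, a rule like the "Successful Rule" from the functional test added in this patch can be created with a short Python snippet. This is a sketch only: it assumes Inspector listens on the default ``http://127.0.0.1:5050`` and that a valid Keystone token is already available in ``token``::

    import json

    import requests

    INSPECTOR_URL = 'http://127.0.0.1:5050'  # assumed default endpoint
    token = '<keystone token>'               # obtained elsewhere

    rule = {
        'description': 'Successful Rule',
        'conditions': [
            {'op': 'ge', 'field': 'memory_mb', 'value': 256},
            {'op': 'ge', 'field': 'local_gb', 'value': 1},
        ],
        'actions': [
            {'action': 'set-attribute', 'path': '/extra/rule_success',
             'value': 'yes'},
        ],
    }

    resp = requests.post(INSPECTOR_URL + '/v1/rules',
                         headers={'X-Auth-Token': token},
                         data=json.dumps(rule))
    resp.raise_for_status()
    # The response contains the full rule representation, including the
    # autogenerated UUID when one was not supplied.
    print(resp.json()['uuid'])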
* ``GET /v1/rules`` list all introspection rules.

  Response

  * 200 - OK

  Response body: JSON dictionary with the key ``rules`` - a list of short rule
  representations. A short rule representation is a JSON dictionary with keys:

  * ``uuid`` rule UUID
  * ``description`` human-readable description
  * ``links`` list of HTTP links, use the one with ``rel=self`` to get the
    full rule details

* ``DELETE /v1/rules`` delete all introspection rules.

  Response

  * 204 - OK

* ``GET /v1/rules/<UUID>`` get one introspection rule by its ``<UUID>``.

  Response

  * 200 - OK
  * 404 - not found

  Response body: JSON dictionary with the introspection rule representation
  (see ``POST /v1/rules`` above).

* ``DELETE /v1/rules/<UUID>`` delete one introspection rule by its ``<UUID>``.

  Response

  * 204 - OK
  * 404 - not found

.. _Introspection Rules documentation: https://github.com/openstack/ironic-inspector#introspection-rules

Ramdisk Callback
~~~~~~~~~~~~~~~~

@@ -169,3 +233,4 @@ Version History

**1.0** version of API at the moment of introducing versioning.
**1.1** adds endpoint to retrieve stored introspection data.
**1.2** adds endpoints for manipulating introspection rules.
README.rst (55 lines changed)
@@ -277,6 +277,61 @@ Node States

before Nova becomes aware of available nodes after issuing this command.
Use ``nova hypervisor-stats`` command output to check it.

Introspection Rules
~~~~~~~~~~~~~~~~~~~

Inspector supports a simple JSON-based DSL to define rules to run during
introspection. Inspector provides an API to manage such rules, and will run
them automatically after running all processing hooks.

A rule consists of conditions to check and actions to run. If the conditions
evaluate to true on the introspection data, the actions are run on the node.
All actions have "rollback actions" associated with them, which are run when
the conditions evaluate to false. This way introspection can safely be rerun.

Available conditions and actions are defined by plugins and can be extended,
see CONTRIBUTING.rst_ for details. See `HTTP API`_ for the specific calls used
to define introspection rules.

Conditions
^^^^^^^^^^

A condition is represented by an object with the following fields:

``op`` the type of comparison operation; the default available operators are
``eq``, ``le``, ``ge``, ``ne``, ``lt`` and ``gt``.

``field`` a `JSON path <http://goessner.net/articles/JsonPath/>`_ to the field
in the introspection data to use in the comparison.

``multiple`` how to treat situations where the ``field`` query returns
multiple results (e.g. the field contains a list); the available options are:

* ``any`` (the default) require any to match,
* ``all`` require all to match,
* ``first`` require the first to match.

All other fields are passed to the condition plugin, e.g. numeric comparison
operations require a ``value`` field to compare against.
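For example, the following condition (shown as a Python literal) combines a JSON path that can match several values with the ``multiple`` field. The ``disks[*].size_gb`` path is a hypothetical field, not one guaranteed by this patch; only ``op``, ``field``, ``multiple`` and ``value`` come from the DSL described above::

    # Hypothetical sketch: passes only if every matched disk size is >= 100.
    condition = {
        'op': 'ge',                    # numeric comparison plugin
        'field': 'disks[*].size_gb',   # JSON path that may yield several values
        'multiple': 'all',             # every matched value must satisfy 'ge'
        'value': 100,                  # passed through to the condition plugin
    }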
Actions
^^^^^^^

An action is represented by an object with the following fields:

``action`` the type of action. Possible values are defined by plugins.

All other fields are passed to the action plugin.

The default available actions are:

* ``fail`` fail introspection. Requires a ``message`` parameter for the
  failure message.

* ``set-attribute`` sets an attribute on an Ironic node. Requires a ``path``
  field, which is the path to the attribute as used by Ironic (e.g.
  ``/properties/something``), and a ``value`` to set.
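Putting conditions and actions together, a complete rule that refuses nodes with too little RAM could look as follows. This is a hypothetical example (the threshold and message are made up); it would be submitted as the JSON body of ``POST /v1/rules``, see `HTTP API`_::

    # Hypothetical rule: fail introspection for nodes reporting < 1 GiB RAM.
    rule = {
        'description': 'Refuse nodes with too little RAM',
        'conditions': [
            {'op': 'lt', 'field': 'memory_mb', 'value': 1024},
        ],
        'actions': [
            {'action': 'fail', 'message': 'Not enough RAM for deployment'},
        ],
    }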
Setting IPMI Credentials
~~~~~~~~~~~~~~~~~~~~~~~~
@ -7,6 +7,37 @@ export IRONIC_API_VERSION=${IRONIC_API_VERSION:-latest}
|
||||
# Copied from devstack
|
||||
PRIVATE_NETWORK_NAME=${PRIVATE_NETWORK_NAME:-"private"}
|
||||
|
||||
successful_rule=$(mktemp)
|
||||
cat > "$successful_rule" << EOM
|
||||
{
|
||||
"description": "Successful Rule",
|
||||
"conditions": [
|
||||
{"op": "ge", "field": "memory_mb", "value": 256},
|
||||
{"op": "ge", "field": "local_gb", "value": 1}
|
||||
],
|
||||
"actions": [
|
||||
{"action": "set-attribute", "path": "/extra/rule_success",
|
||||
"value": "yes"}
|
||||
]
|
||||
}
|
||||
EOM
|
||||
|
||||
failing_rule=$(mktemp)
|
||||
cat > "$failing_rule" << EOM
|
||||
{
|
||||
"description": "Failing Rule",
|
||||
"conditions": [
|
||||
{"op": "lt", "field": "memory_mb", "value": 42},
|
||||
{"op": "eq", "field": "local_gb", "value": 0}
|
||||
],
|
||||
"actions": [
|
||||
{"action": "set-attribute", "path": "/extra/rule_success",
|
||||
"value": "no"},
|
||||
{"action": "fail", "message": "This rule should not have run"}
|
||||
]
|
||||
}
|
||||
EOM
|
||||
|
||||
expected_cpus=$(openstack flavor show baremetal -f value -c vcpus)
|
||||
expected_memory_mb=$(openstack flavor show baremetal -f value -c ram)
|
||||
expected_cpu_arch=$(openstack flavor show baremetal -f value -c properties | sed "s/.*cpu_arch='\([^']*\)'.*/\1/")
|
||||
@ -21,6 +52,20 @@ if [ -z "$ironic_url" ]; then
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# NOTE(dtantsur): it's hard to get JSON field from Ironic client output, using
|
||||
# HTTP API and JQ instead.
|
||||
|
||||
function curl_ir {
|
||||
local token=$(keystone token-get | grep ' id ' | tr '|' ' ' | awk '{ print $2; }')
|
||||
curl -H "X-Auth-Token: $token" -X $1 "$ironic_url/$2"
|
||||
}
|
||||
|
||||
function curl_ins {
|
||||
local token=$(keystone token-get | grep ' id ' | tr '|' ' ' | awk '{ print $2; }')
|
||||
local args=${3:-}
|
||||
curl -f -H "X-Auth-Token: $token" -X $1 $args "http://127.0.0.1:5050/$2"
|
||||
}
|
||||
|
||||
nodes=$(ironic node-list | tail -n +4 | head -n -1 | tr '|' ' ' | awk '{ print $1; }')
|
||||
if [ -z "$nodes" ]; then
|
||||
echo "No nodes found in Ironic"
|
||||
@ -34,6 +79,10 @@ for uuid in $nodes; do
|
||||
ironic node-set-provision-state $uuid manage
|
||||
done
|
||||
|
||||
curl_ins DELETE v1/rules
|
||||
curl_ins POST v1/rules "--data-binary @$successful_rule"
|
||||
curl_ins POST v1/rules "--data-binary @$failing_rule"
|
||||
|
||||
for uuid in $nodes; do
|
||||
ironic node-set-provision-state $uuid inspect
|
||||
done
|
||||
@ -63,17 +112,7 @@ while true; do
|
||||
fi
|
||||
done
|
||||
|
||||
# NOTE(dtantsur): it's hard to get JSON field from Ironic client output, using
|
||||
# HTTP API and JQ instead.
|
||||
token=$(keystone token-get | grep ' id ' | tr '|' ' ' | awk '{ print $2; }')
|
||||
|
||||
function curl_ir {
|
||||
curl -H "X-Auth-Token: $token" -X $1 "$ironic_url/$2"
|
||||
}
|
||||
|
||||
function curl_ins {
|
||||
curl -H "X-Auth-Token: $token" -X $1 "http://127.0.0.1:5050/$2"
|
||||
}
|
||||
curl_ins DELETE v1/rules
|
||||
|
||||
function test_swift {
|
||||
# Basic sanity check of the data stored in Swift
|
||||
@ -108,6 +147,13 @@ for uuid in $nodes; do
|
||||
exit 1
|
||||
fi
|
||||
|
||||
extra=$(echo $node_json | jq '.extra')
|
||||
echo Extra properties for $uuid: $extra
|
||||
if [ "$(echo $extra | jq -r '.rule_success')" != "yes" ]; then
|
||||
echo "Rule matching failed"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
openstack service list | grep swift && test_swift
|
||||
|
||||
for attempt in {1..12}; do
|
||||
|
@ -21,8 +21,11 @@ from oslo_config import cfg
|
||||
from oslo_db import options as db_opts
|
||||
from oslo_db.sqlalchemy import models
|
||||
from oslo_db.sqlalchemy import session as db_session
|
||||
from sqlalchemy import Column, Float, ForeignKey, String, Text
|
||||
from oslo_db.sqlalchemy import types as db_types
|
||||
from sqlalchemy import (Boolean, Column, DateTime, Float, ForeignKey, Integer,
|
||||
String, Text)
|
||||
from sqlalchemy.ext.declarative import declarative_base
|
||||
from sqlalchemy import orm
|
||||
|
||||
|
||||
Base = declarative_base(cls=models.ModelBase)
|
||||
@ -53,6 +56,53 @@ class Option(Base):
|
||||
value = Column(Text)
|
||||
|
||||
|
||||
class Rule(Base):
|
||||
__tablename__ = 'rules'
|
||||
uuid = Column(String(36), primary_key=True)
|
||||
created_at = Column(DateTime, nullable=False)
|
||||
description = Column(Text)
|
||||
# NOTE(dtantsur): in the future we might need to temporary disable a rule
|
||||
disabled = Column(Boolean, default=False)
|
||||
|
||||
conditions = orm.relationship('RuleCondition', lazy='joined',
|
||||
order_by='RuleCondition.id',
|
||||
cascade="all, delete-orphan")
|
||||
actions = orm.relationship('RuleAction', lazy='joined',
|
||||
order_by='RuleAction.id',
|
||||
cascade="all, delete-orphan")
|
||||
|
||||
|
||||
class RuleCondition(Base):
|
||||
__tablename__ = 'rule_conditions'
|
||||
id = Column(Integer, primary_key=True)
|
||||
rule = Column(String(36), ForeignKey('rules.uuid'))
|
||||
op = Column(String(255), nullable=False)
|
||||
multiple = Column(String(255), nullable=False)
|
||||
# NOTE(dtantsur): while all operations now require a field, I can also
|
||||
# imagine user-defined operations that do not, thus it's nullable.
|
||||
field = Column(Text)
|
||||
params = Column(db_types.JsonEncodedDict)
|
||||
|
||||
def as_dict(self):
|
||||
res = self.params.copy()
|
||||
res['op'] = self.op
|
||||
res['field'] = self.field
|
||||
return res
|
||||
|
||||
|
||||
class RuleAction(Base):
|
||||
__tablename__ = 'rule_actions'
|
||||
id = Column(Integer, primary_key=True)
|
||||
rule = Column(String(36), ForeignKey('rules.uuid'))
|
||||
action = Column(String(255), nullable=False)
|
||||
params = Column(db_types.JsonEncodedDict)
|
||||
|
||||
def as_dict(self):
|
||||
res = self.params.copy()
|
||||
res['action'] = self.action
|
||||
return res
|
||||
|
||||
|
||||
def init():
|
||||
"""Initialize the database."""
|
||||
if CONF.discoverd.database:
|
||||
|
@ -15,7 +15,6 @@ import eventlet
|
||||
eventlet.monkey_patch()
|
||||
|
||||
import functools
|
||||
import json
|
||||
import ssl
|
||||
import sys
|
||||
|
||||
@ -34,6 +33,7 @@ from ironic_inspector import introspect
|
||||
from ironic_inspector import node_cache
|
||||
from ironic_inspector.plugins import base as plugins_base
|
||||
from ironic_inspector import process
|
||||
from ironic_inspector import rules
|
||||
from ironic_inspector import utils
|
||||
|
||||
CONF = cfg.CONF
|
||||
@ -43,7 +43,7 @@ app = flask.Flask(__name__)
|
||||
LOG = log.getLogger('ironic_inspector.main')
|
||||
|
||||
MINIMUM_API_VERSION = (1, 0)
|
||||
CURRENT_API_VERSION = (1, 1)
|
||||
CURRENT_API_VERSION = (1, 2)
|
||||
_MIN_VERSION_HEADER = 'X-OpenStack-Ironic-Inspector-API-Minimum-Version'
|
||||
_MAX_VERSION_HEADER = 'X-OpenStack-Ironic-Inspector-API-Maximum-Version'
|
||||
_VERSION_HEADER = 'X-OpenStack-Ironic-Inspector-API-Version'
|
||||
@ -114,7 +114,7 @@ def add_version_headers(res):
|
||||
def api_root():
|
||||
# TODO(dtantsur): this endpoint only returns API version now, it's possible
|
||||
# we'll return something meaningful in addition later
|
||||
return '{}', 200, {'Content-Type': 'application/json'}
|
||||
return flask.jsonify({})
|
||||
|
||||
|
||||
@app.route('/v1/continue', methods=['POST'])
|
||||
@ -123,8 +123,7 @@ def api_continue():
|
||||
data = flask.request.get_json(force=True)
|
||||
LOG.debug("/v1/continue got JSON %s", data)
|
||||
|
||||
res = process.process(data)
|
||||
return json.dumps(res), 200, {'Content-Type': 'applications/json'}
|
||||
return flask.jsonify(process.process(data))
|
||||
|
||||
|
||||
@app.route('/v1/introspection/<uuid>', methods=['GET', 'POST'])
|
||||
@ -163,7 +162,7 @@ def api_introspection_data(uuid):
|
||||
utils.check_auth(flask.request)
|
||||
if CONF.processing.store_data == 'swift':
|
||||
res = swift.get_introspection_data(uuid)
|
||||
return res, 200, {'Content-Type': 'applications/json'}
|
||||
return res, 200, {'Content-Type': 'application/json'}
|
||||
else:
|
||||
return error_response(_('Inspector is not configured to store data. '
|
||||
'Set the [processing] store_data '
|
||||
@ -171,6 +170,51 @@ def api_introspection_data(uuid):
|
||||
code=404)
|
||||
|
||||
|
||||
def rule_repr(rule, short):
|
||||
result = rule.as_dict(short=short)
|
||||
result['links'] = [{
|
||||
'href': flask.url_for('api_rule', uuid=result['uuid']),
|
||||
'rel': 'self'
|
||||
}]
|
||||
return result
|
||||
|
||||
|
||||
@app.route('/v1/rules', methods=['GET', 'POST', 'DELETE'])
|
||||
@convert_exceptions
|
||||
def api_rules():
|
||||
utils.check_auth(flask.request)
|
||||
|
||||
if flask.request.method == 'GET':
|
||||
res = [rule_repr(rule, short=True) for rule in rules.get_all()]
|
||||
return flask.jsonify(rules=res)
|
||||
elif flask.request.method == 'DELETE':
|
||||
rules.delete_all()
|
||||
return '', 204
|
||||
else:
|
||||
body = flask.request.get_json(force=True)
|
||||
if body.get('uuid') and not uuidutils.is_uuid_like(body['uuid']):
|
||||
raise utils.Error(_('Invalid UUID value'), code=400)
|
||||
|
||||
rule = rules.create(conditions_json=body.get('conditions', []),
|
||||
actions_json=body.get('actions', []),
|
||||
uuid=body.get('uuid'),
|
||||
description=body.get('description'))
|
||||
return flask.jsonify(rule_repr(rule, short=False))
|
||||
|
||||
|
||||
@app.route('/v1/rules/<uuid>', methods=['GET', 'DELETE'])
|
||||
@convert_exceptions
|
||||
def api_rule(uuid):
|
||||
utils.check_auth(flask.request)
|
||||
|
||||
if flask.request.method == 'GET':
|
||||
rule = rules.get(uuid)
|
||||
return flask.jsonify(rule_repr(rule, short=False))
|
||||
else:
|
||||
rules.delete(uuid)
|
||||
return '', 204
|
||||
|
||||
|
||||
@app.errorhandler(404)
|
||||
def handle_404(error):
|
||||
return error_response(error, code=404)
|
||||
|
@ -251,6 +251,23 @@ class NodeInfo(object):
|
||||
self.ironic.port.delete(port.uuid)
|
||||
del ports[port.address]
|
||||
|
||||
def get_by_path(self, path):
|
||||
"""Get field value by ironic-style path (e.g. /extra/foo).
|
||||
|
||||
:param path: path to a field
|
||||
:returns: field value
|
||||
:raises: KeyError if field was not found
|
||||
"""
|
||||
path = path.strip('/')
|
||||
try:
|
||||
if '/' in path:
|
||||
prop, key = path.split('/', 1)
|
||||
return getattr(self.node(), prop)[key]
|
||||
else:
|
||||
return getattr(self.node(), path)
|
||||
except AttributeError:
|
||||
raise KeyError(path)
|
||||
|
||||
|
||||
def add_node(uuid, **attributes):
|
||||
"""Store information about a node under introspection.
|
||||
|
@ -17,8 +17,9 @@ import abc
|
||||
|
||||
from oslo_config import cfg
|
||||
import six
|
||||
from stevedore import driver
|
||||
from stevedore import named
|
||||
import stevedore
|
||||
|
||||
from ironic_inspector.common.i18n import _
|
||||
|
||||
|
||||
CONF = cfg.CONF
|
||||
@ -71,8 +72,93 @@ class ProcessingHook(object): # pragma: no cover
|
||||
"""
|
||||
|
||||
|
||||
class WithValidation(object):
|
||||
REQUIRED_PARAMS = set()
|
||||
"""Set with names of required parameters."""
|
||||
|
||||
OPTIONAL_PARAMS = set()
|
||||
"""Set with names of optional parameters."""
|
||||
|
||||
def validate(self, params, **kwargs):
|
||||
"""Validate params passed during creation.
|
||||
|
||||
Default implementation checks for presence of fields from
|
||||
REQUIRED_PARAMS and fails for unexpected fields (not from
|
||||
REQUIRED_PARAMS + OPTIONAL_PARAMS).
|
||||
|
||||
:param params: params as a dictionary
|
||||
:param kwargs: used for extensibility without breaking existing plugins
|
||||
:raises: ValueError on validation failure
|
||||
"""
|
||||
passed = {k for k, v in params.items() if v is not None}
|
||||
missing = self.REQUIRED_PARAMS - passed
|
||||
unexpected = passed - self.REQUIRED_PARAMS - self.OPTIONAL_PARAMS
|
||||
|
||||
msg = []
|
||||
if missing:
|
||||
msg.append(_('missing required parameter(s): %s')
|
||||
% ', '.join(missing))
|
||||
if unexpected:
|
||||
msg.append(_('unexpected parameter(s): %s')
|
||||
% ', '.join(unexpected))
|
||||
|
||||
if msg:
|
||||
raise ValueError('; '.join(msg))
|
||||
|
||||
|
||||
@six.add_metaclass(abc.ABCMeta)
|
||||
class RuleConditionPlugin(WithValidation): # pragma: no cover
|
||||
"""Abstract base class for rule condition plugins."""
|
||||
|
||||
REQUIRED_PARAMS = {'value'}
|
||||
|
||||
ALLOW_NONE = False
|
||||
"""Whether this condition accepts None when field is not found."""
|
||||
|
||||
@abc.abstractmethod
|
||||
def check(self, node_info, field, params, **kwargs):
|
||||
"""Check if condition holds for a given field.
|
||||
|
||||
:param node_info: NodeInfo object
|
||||
:param field: field value
|
||||
:param params: parameters as a dictionary, changing it here will change
|
||||
what will be stored in database
|
||||
:param kwargs: used for extensibility without breaking existing plugins
|
||||
:raises ValueError: on unacceptable field value
|
||||
:returns: True if check succeeded, otherwise False
|
||||
"""
|
||||
|
||||
|
||||
@six.add_metaclass(abc.ABCMeta)
|
||||
class RuleActionPlugin(WithValidation): # pragma: no cover
|
||||
"""Abstract base class for rule action plugins."""
|
||||
|
||||
@abc.abstractmethod
|
||||
def apply(self, node_info, params, **kwargs):
|
||||
"""Run action on successful rule match.
|
||||
|
||||
:param node_info: NodeInfo object
|
||||
:param params: parameters as a dictionary
|
||||
:param kwargs: used for extensibility without breaking existing plugins
|
||||
:raises: utils.Error on failure
|
||||
"""
|
||||
|
||||
def rollback(self, node_info, params, **kwargs):
|
||||
"""Rollback action effects from previous run on a failed match.
|
||||
|
||||
Default implementation does nothing.
|
||||
|
||||
:param node_info: NodeInfo object
|
||||
:param params: parameters as a dictionary
|
||||
:param kwargs: used for extensibility without breaking existing plugins
|
||||
:raises: utils.Error on failure
|
||||
"""
|
||||
|
||||
|
||||
_HOOKS_MGR = None
|
||||
_NOT_FOUND_HOOK_MGR = None
|
||||
_CONDITIONS_MGR = None
|
||||
_ACTIONS_MGR = None
|
||||
|
||||
|
||||
def processing_hooks_manager(*args):
|
||||
@ -85,7 +171,7 @@ def processing_hooks_manager(*args):
|
||||
names = [x.strip()
|
||||
for x in CONF.processing.processing_hooks.split(',')
|
||||
if x.strip()]
|
||||
_HOOKS_MGR = named.NamedExtensionManager(
|
||||
_HOOKS_MGR = stevedore.NamedExtensionManager(
|
||||
'ironic_inspector.hooks.processing',
|
||||
names=names,
|
||||
invoke_on_load=True,
|
||||
@ -99,8 +185,28 @@ def node_not_found_hook_manager(*args):
|
||||
if _NOT_FOUND_HOOK_MGR is None:
|
||||
name = CONF.processing.node_not_found_hook
|
||||
if name:
|
||||
_NOT_FOUND_HOOK_MGR = driver.DriverManager(
|
||||
_NOT_FOUND_HOOK_MGR = stevedore.DriverManager(
|
||||
'ironic_inspector.hooks.node_not_found',
|
||||
name=name)
|
||||
|
||||
return _NOT_FOUND_HOOK_MGR
|
||||
|
||||
|
||||
def rule_conditions_manager():
|
||||
"""Create a Stevedore extension manager for conditions in rules."""
|
||||
global _CONDITIONS_MGR
|
||||
if _CONDITIONS_MGR is None:
|
||||
_CONDITIONS_MGR = stevedore.ExtensionManager(
|
||||
'ironic_inspector.rules.conditions',
|
||||
invoke_on_load=True)
|
||||
return _CONDITIONS_MGR
|
||||
|
||||
|
||||
def rule_actions_manager():
|
||||
"""Create a Stevedore extension manager for actions in rules."""
|
||||
global _ACTIONS_MGR
|
||||
if _ACTIONS_MGR is None:
|
||||
_ACTIONS_MGR = stevedore.ExtensionManager(
|
||||
'ironic_inspector.rules.actions',
|
||||
invoke_on_load=True)
|
||||
return _ACTIONS_MGR
|
||||
|
@ -33,3 +33,11 @@ class ExampleProcessingHook(base.ProcessingHook): # pragma: no cover
|
||||
|
||||
def example_not_found_hook(introspection_data, **kwargs):
|
||||
LOG.debug('Processing node not found %s', introspection_data)
|
||||
|
||||
|
||||
class ExampleRuleAction(base.RuleActionPlugin): # pragma: no cover
|
||||
def apply(self, node_info, params, **kwargs):
|
||||
LOG.debug('apply action to %s: %s', node_info.uuid, params)
|
||||
|
||||
def rollback(self, node_info, params, **kwargs):
|
||||
LOG.debug('rollback action to %s: %s', node_info.uuid, params)
|
||||
|
ironic_inspector/plugins/rules.py (new file, 92 lines)
@@ -0,0 +1,92 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
# implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
"""Standard plugins for rules API."""
|
||||
|
||||
import operator
|
||||
|
||||
from oslo_log import log
|
||||
|
||||
from ironic_inspector.plugins import base
|
||||
from ironic_inspector import utils
|
||||
|
||||
|
||||
LOG = log.getLogger(__name__)
|
||||
|
||||
|
||||
def coerce(value, expected):
|
||||
if isinstance(expected, float):
|
||||
return float(value)
|
||||
elif isinstance(expected, int):
|
||||
return int(value)
|
||||
else:
|
||||
return value
|
||||
|
||||
|
||||
class SimpleCondition(base.RuleConditionPlugin):
|
||||
op = None
|
||||
|
||||
def check(self, node_info, field, params, **kwargs):
|
||||
value = params['value']
|
||||
return self.op(coerce(field, value), value)
|
||||
|
||||
|
||||
class EqCondition(SimpleCondition):
|
||||
op = operator.eq
|
||||
|
||||
|
||||
class LtCondition(SimpleCondition):
|
||||
op = operator.lt
|
||||
|
||||
|
||||
class GtCondition(SimpleCondition):
|
||||
op = operator.gt
|
||||
|
||||
|
||||
class LeCondition(SimpleCondition):
|
||||
op = operator.le
|
||||
|
||||
|
||||
class GeCondition(SimpleCondition):
|
||||
op = operator.ge
|
||||
|
||||
|
||||
class NeCondition(SimpleCondition):
|
||||
op = operator.ne
|
||||
|
||||
|
||||
class FailAction(base.RuleActionPlugin):
|
||||
REQUIRED_PARAMS = {'message'}
|
||||
|
||||
def apply(self, node_info, params, **kwargs):
|
||||
raise utils.Error(params['message'])
|
||||
|
||||
|
||||
class SetAttributeAction(base.RuleActionPlugin):
|
||||
REQUIRED_PARAMS = {'path', 'value'}
|
||||
# TODO(dtantsur): proper validation of path
|
||||
|
||||
def apply(self, node_info, params, **kwargs):
|
||||
node_info.patch([{'op': 'add', 'path': params['path'],
|
||||
'value': params['value']}])
|
||||
|
||||
def rollback(self, node_info, params, **kwargs):
|
||||
try:
|
||||
node_info.get_by_path(params['path'])
|
||||
except KeyError:
|
||||
LOG.debug('Field %(path)s was not set on node %(node)s, '
|
||||
'no need for rollback',
|
||||
{'path': params['path'], 'node': node_info.uuid})
|
||||
return
|
||||
|
||||
node_info.patch([{'op': 'remove', 'path': params['path']}])
|
@ -23,6 +23,7 @@ from ironic_inspector.common import swift
|
||||
from ironic_inspector import firewall
|
||||
from ironic_inspector import node_cache
|
||||
from ironic_inspector.plugins import base as plugins_base
|
||||
from ironic_inspector import rules
|
||||
from ironic_inspector import utils
|
||||
|
||||
CONF = cfg.CONF
|
||||
@ -166,6 +167,9 @@ def _process_node(node, introspection_data, node_info):
|
||||
ironic = utils.get_client()
|
||||
firewall.update_filters(ironic)
|
||||
|
||||
node_info.invalidate_cache()
|
||||
rules.apply(node_info, introspection_data)
|
||||
|
||||
resp = {'uuid': node.uuid}
|
||||
|
||||
if node_info.options.get('new_ipmi_credentials'):
|
||||
|
ironic_inspector/rules.py (new file, 381 lines)
@@ -0,0 +1,381 @@
|
||||
# Copyright 2015 Red Hat, Inc.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
"""Support for introspection rules."""
|
||||
|
||||
import jsonpath_rw as jsonpath
|
||||
import jsonschema
|
||||
from oslo_db import exception as db_exc
|
||||
from oslo_log import log
|
||||
from oslo_utils import timeutils
|
||||
from oslo_utils import uuidutils
|
||||
from sqlalchemy import orm
|
||||
|
||||
from ironic_inspector.common.i18n import _, _LE, _LI
|
||||
from ironic_inspector import db
|
||||
from ironic_inspector.plugins import base as plugins_base
|
||||
from ironic_inspector import utils
|
||||
|
||||
|
||||
LOG = log.getLogger(__name__)
|
||||
_CONDITIONS_SCHEMA = None
|
||||
_ACTIONS_SCHEMA = None
|
||||
|
||||
|
||||
def conditions_schema():
|
||||
global _CONDITIONS_SCHEMA
|
||||
if _CONDITIONS_SCHEMA is None:
|
||||
condition_plugins = [x.name for x in
|
||||
plugins_base.rule_conditions_manager()]
|
||||
_CONDITIONS_SCHEMA = {
|
||||
"title": "Inspector rule conditions schema",
|
||||
"type": "array",
|
||||
# we can have rules that always apply
|
||||
"minItems": 0,
|
||||
"items": {
|
||||
"type": "object",
|
||||
# field might become optional in the future, but not right now
|
||||
"required": ["op", "field"],
|
||||
"properties": {
|
||||
"op": {
|
||||
"description": "condition operator",
|
||||
"enum": condition_plugins
|
||||
},
|
||||
"field": {
|
||||
"description": "JSON path to field for matching",
|
||||
"type": "string"
|
||||
},
|
||||
"multiple": {
|
||||
"description": "how to treat multiple values",
|
||||
"enum": ["all", "any", "first"]
|
||||
},
|
||||
},
|
||||
# other properties are validated by plugins
|
||||
"additionalProperties": True
|
||||
}
|
||||
}
|
||||
|
||||
return _CONDITIONS_SCHEMA
|
||||
|
||||
|
||||
def actions_schema():
|
||||
global _ACTIONS_SCHEMA
|
||||
if _ACTIONS_SCHEMA is None:
|
||||
action_plugins = [x.name for x in
|
||||
plugins_base.rule_actions_manager()]
|
||||
_ACTIONS_SCHEMA = {
|
||||
"title": "Inspector rule actions schema",
|
||||
"type": "array",
|
||||
"minItems": 1,
|
||||
"items": {
|
||||
"type": "object",
|
||||
"required": ["action"],
|
||||
"properties": {
|
||||
"action": {
|
||||
"description": "action to take",
|
||||
"enum": action_plugins
|
||||
},
|
||||
},
|
||||
# other properties are validated by plugins
|
||||
"additionalProperties": True
|
||||
}
|
||||
}
|
||||
|
||||
return _ACTIONS_SCHEMA
|
||||
|
||||
|
||||
class IntrospectionRule(object):
|
||||
"""High-level class representing an introspection rule."""
|
||||
|
||||
def __init__(self, uuid, conditions, actions, description):
|
||||
"""Create rule object from database data."""
|
||||
self._uuid = uuid
|
||||
self._conditions = conditions
|
||||
self._actions = actions
|
||||
self._description = description
|
||||
|
||||
def as_dict(self, short=False):
|
||||
result = {
|
||||
'uuid': self._uuid,
|
||||
'description': self._description,
|
||||
}
|
||||
|
||||
if not short:
|
||||
result['conditions'] = [c.as_dict() for c in self._conditions]
|
||||
result['actions'] = [a.as_dict() for a in self._actions]
|
||||
|
||||
return result
|
||||
|
||||
@property
|
||||
def description(self):
|
||||
return self._description or self._uuid
|
||||
|
||||
def check_conditions(self, node_info, data):
|
||||
"""Check if conditions are true for a given node.
|
||||
|
||||
:param node_info: a NodeInfo object
|
||||
:param data: introspection data
|
||||
:returns: True if conditions match, otherwise False
|
||||
"""
|
||||
LOG.debug('Checking rule "%(descr)s" on node %(uuid)s',
|
||||
{'descr': self.description, 'uuid': node_info.uuid})
|
||||
ext_mgr = plugins_base.rule_conditions_manager()
|
||||
for cond in self._conditions:
|
||||
field_values = jsonpath.parse(cond.field).find(data)
|
||||
field_values = [x.value for x in field_values]
|
||||
cond_ext = ext_mgr[cond.op].obj
|
||||
|
||||
if not field_values:
|
||||
if cond_ext.ALLOW_NONE:
|
||||
LOG.debug('Field with JSON path %(path)s was not found in '
|
||||
'data for node %(uuid)s',
|
||||
{'path': cond.field, 'uuid': node_info.uuid})
|
||||
field_values = [None]
|
||||
else:
|
||||
LOG.info(_LI('Field with JSON path %(path)s was not found '
|
||||
'in data for node %(uuid)s, rule "%(rule)s" '
|
||||
'will not be applied'),
|
||||
{'path': cond.field, 'uuid': node_info.uuid,
|
||||
'rule': self.description})
|
||||
return False
|
||||
|
||||
for value in field_values:
|
||||
result = cond_ext.check(node_info, value, cond.params)
|
||||
if (cond.multiple == 'first'
|
||||
or (cond.multiple == 'all' and not result)
|
||||
or (cond.multiple == 'any' and result)):
|
||||
break
|
||||
|
||||
if not result:
|
||||
LOG.info(_LI('Rule "%(rule)s" will not be applied to node '
|
||||
'%(uuid)s: condition %(field)s %(op)s %(params)s '
|
||||
'failed'),
|
||||
{'rule': self.description, 'uuid': node_info.uuid,
|
||||
'field': cond.field, 'op': cond.op,
|
||||
'params': cond.params})
|
||||
return False
|
||||
|
||||
LOG.info(_LI('Rule "%(rule)s" will be applied to node %(uuid)s'),
|
||||
{'rule': self.description, 'uuid': node_info.uuid})
|
||||
return True
|
||||
|
||||
def apply_actions(self, node_info, rollback=False):
|
||||
"""Run actions on a node.
|
||||
|
||||
:param node_info: NodeInfo instance
|
||||
:param rollback: if True, rollback actions are executed
|
||||
"""
|
||||
if rollback:
|
||||
method = 'rollback'
|
||||
else:
|
||||
method = 'apply'
|
||||
|
||||
LOG.debug('Running %(what)s actions for rule "%(rule)s" '
|
||||
'on node %(node)s',
|
||||
{'what': method, 'rule': self.description,
|
||||
'node': node_info.uuid})
|
||||
|
||||
ext_mgr = plugins_base.rule_actions_manager()
|
||||
for act in self._actions:
|
||||
LOG.debug('Running %(what)s action `%(action)s %(params)s` for '
|
||||
'node %(node)s',
|
||||
{'action': act.action, 'params': act.params,
|
||||
'node': node_info.uuid, 'what': method})
|
||||
ext = ext_mgr[act.action].obj
|
||||
getattr(ext, method)(node_info, act.params)
|
||||
|
||||
LOG.debug('Successfully applied %(what)s to node %(node)s',
|
||||
{'what': 'rollback actions' if rollback else 'actions',
|
||||
'node': node_info.uuid})
|
||||
|
||||
|
||||
def create(conditions_json, actions_json, uuid=None,
|
||||
description=None):
|
||||
"""Create a new rule in database.
|
||||
|
||||
:param conditions_json: list of dicts with the following keys:
|
||||
* op - operator
|
||||
* field - JSON path to field to compare
|
||||
Other keys are stored as is.
|
||||
:param actions_json: list of dicts with the following keys:
|
||||
* action - action type
|
||||
Other keys are stored as is.
|
||||
:param uuid: rule UUID, will be generated if empty
|
||||
:param description: human-readable rule description
|
||||
:returns: new IntrospectionRule object
|
||||
:raises: utils.Error on failure
|
||||
"""
|
||||
uuid = uuid or uuidutils.generate_uuid()
|
||||
LOG.debug('Creating rule %(uuid)s with description "%(descr)s", '
|
||||
'conditions %(conditions)s and actions %(actions)s',
|
||||
{'uuid': uuid, 'descr': description,
|
||||
'conditions': conditions_json, 'actions': actions_json})
|
||||
|
||||
try:
|
||||
jsonschema.validate(conditions_json, conditions_schema())
|
||||
except jsonschema.ValidationError as exc:
|
||||
raise utils.Error(_('Validation failed for conditions: %s') % exc)
|
||||
|
||||
try:
|
||||
jsonschema.validate(actions_json, actions_schema())
|
||||
except jsonschema.ValidationError as exc:
|
||||
raise utils.Error(_('Validation failed for actions: %s') % exc)
|
||||
|
||||
cond_mgr = plugins_base.rule_conditions_manager()
|
||||
act_mgr = plugins_base.rule_actions_manager()
|
||||
|
||||
conditions = []
|
||||
for cond_json in conditions_json:
|
||||
field = cond_json['field']
|
||||
try:
|
||||
jsonpath.parse(field)
|
||||
except Exception as exc:
|
||||
raise utils.Error(_('Unable to parse field JSON path %(field)s: '
|
||||
'%(error)s') % {'field': field, 'error': exc})
|
||||
|
||||
plugin = cond_mgr[cond_json['op']].obj
|
||||
params = {k: v for k, v in cond_json.items()
|
||||
if k not in ('op', 'field', 'multiple')}
|
||||
try:
|
||||
plugin.validate(params)
|
||||
except ValueError as exc:
|
||||
raise utils.Error(_('Invalid parameters for operator %(op)s: '
|
||||
'%(error)s') %
|
||||
{'op': cond_json['op'], 'error': exc})
|
||||
|
||||
conditions.append((cond_json['field'], cond_json['op'],
|
||||
cond_json.get('multiple', 'any'), params))
|
||||
|
||||
actions = []
|
||||
for action_json in actions_json:
|
||||
plugin = act_mgr[action_json['action']].obj
|
||||
params = {k: v for k, v in action_json.items() if k != 'action'}
|
||||
try:
|
||||
plugin.validate(params)
|
||||
except ValueError as exc:
|
||||
raise utils.Error(_('Invalid parameters for action %(act)s: '
|
||||
'%(error)s') %
|
||||
{'act': action_json['action'], 'error': exc})
|
||||
|
||||
actions.append((action_json['action'], params))
|
||||
|
||||
try:
|
||||
with db.ensure_transaction() as session:
|
||||
rule = db.Rule(uuid=uuid, description=description,
|
||||
disabled=False, created_at=timeutils.utcnow())
|
||||
|
||||
for field, op, multiple, params in conditions:
|
||||
rule.conditions.append(db.RuleCondition(op=op, field=field,
|
||||
multiple=multiple,
|
||||
params=params))
|
||||
|
||||
for action, params in actions:
|
||||
rule.actions.append(db.RuleAction(action=action,
|
||||
params=params))
|
||||
|
||||
rule.save(session)
|
||||
except db_exc.DBDuplicateEntry as exc:
|
||||
LOG.error(_LE('Database integrity error %s when '
|
||||
'creating a rule'), exc)
|
||||
raise utils.Error(_('Rule with UUID %s already exists') % uuid,
|
||||
code=409)
|
||||
|
||||
LOG.info(_LI('Created rule %(uuid)s with description "%(descr)s"'),
|
||||
{'uuid': uuid, 'descr': description})
|
||||
return IntrospectionRule(uuid=uuid,
|
||||
conditions=rule.conditions,
|
||||
actions=rule.actions,
|
||||
description=description)
|
||||
|
||||
|
||||
def get(uuid):
|
||||
"""Get a rule by its UUID."""
|
||||
try:
|
||||
rule = db.model_query(db.Rule).filter_by(uuid=uuid).one()
|
||||
except orm.exc.NoResultFound:
|
||||
raise utils.Error(_('Rule %s was not found') % uuid, code=404)
|
||||
|
||||
return IntrospectionRule(uuid=rule.uuid, actions=rule.actions,
|
||||
conditions=rule.conditions,
|
||||
description=rule.description)
|
||||
|
||||
|
||||
def get_all():
|
||||
"""List all rules."""
|
||||
query = db.model_query(db.Rule).order_by(db.Rule.created_at)
|
||||
return [IntrospectionRule(uuid=rule.uuid, actions=rule.actions,
|
||||
conditions=rule.conditions,
|
||||
description=rule.description)
|
||||
for rule in query]
|
||||
|
||||
|
||||
def delete(uuid):
|
||||
"""Delete a rule by its UUID."""
|
||||
with db.ensure_transaction() as session:
|
||||
db.model_query(db.RuleAction,
|
||||
session=session).filter_by(rule=uuid).delete()
|
||||
db.model_query(db.RuleCondition,
|
||||
session=session) .filter_by(rule=uuid).delete()
|
||||
count = (db.model_query(db.Rule, session=session)
|
||||
.filter_by(uuid=uuid).delete())
|
||||
if not count:
|
||||
raise utils.Error(_('Rule %s was not found') % uuid, code=404)
|
||||
|
||||
LOG.info(_LI('Introspection rule %s was deleted'), uuid)
|
||||
|
||||
|
||||
def delete_all():
|
||||
"""Delete all rules."""
|
||||
with db.ensure_transaction() as session:
|
||||
db.model_query(db.RuleAction, session=session).delete()
|
||||
db.model_query(db.RuleCondition, session=session).delete()
|
||||
db.model_query(db.Rule, session=session).delete()
|
||||
|
||||
LOG.info(_LI('All introspection rules were deleted'))
|
||||
|
||||
|
||||
def apply(node_info, data):
|
||||
"""Apply rules to a node."""
|
||||
rules = get_all()
|
||||
if not rules:
|
||||
LOG.debug('No custom introspection rules to apply to node %s',
|
||||
node_info.uuid)
|
||||
return
|
||||
|
||||
LOG.debug('Applying custom introspection rules to node %s', node_info.uuid)
|
||||
|
||||
to_rollback = []
|
||||
to_apply = []
|
||||
for rule in rules:
|
||||
if rule.check_conditions(node_info, data):
|
||||
to_apply.append(rule)
|
||||
else:
|
||||
to_rollback.append(rule)
|
||||
|
||||
if to_rollback:
|
||||
LOG.debug('Running rollback actions on node %s', node_info.uuid)
|
||||
for rule in to_rollback:
|
||||
rule.apply_actions(node_info, rollback=True)
|
||||
else:
|
||||
LOG.debug('No rollback actions to apply on node %s', node_info.uuid)
|
||||
|
||||
if to_apply:
|
||||
LOG.debug('Running actions on node %s', node_info.uuid)
|
||||
for rule in to_apply:
|
||||
rule.apply_actions(node_info, rollback=False)
|
||||
else:
|
||||
LOG.debug('No actions to apply on node %s', node_info.uuid)
|
||||
|
||||
LOG.info(_LI('Successfully applied custom introspection rules to node %s'),
|
||||
node_info.uuid)
|
@ -25,6 +25,7 @@ import mock
|
||||
import requests
|
||||
|
||||
from ironic_inspector import main
|
||||
from ironic_inspector import rules
|
||||
from ironic_inspector.test import base
|
||||
from ironic_inspector import utils
|
||||
|
||||
@ -50,8 +51,11 @@ DEFAULT_SLEEP = 2
|
||||
|
||||
|
||||
class Base(base.NodeTest):
|
||||
ROOT_URL = 'http://127.0.0.1:5050'
|
||||
|
||||
def setUp(self):
|
||||
super(Base, self).setUp()
|
||||
rules.delete_all()
|
||||
|
||||
self.cli = utils.get_client()
|
||||
self.cli.reset_mock()
|
||||
@ -82,17 +86,19 @@ class Base(base.NodeTest):
|
||||
|
||||
self.node.power_state = 'power off'
|
||||
|
||||
def call(self, method, endpoint, data=None, expect_errors=False,
|
||||
def call(self, method, endpoint, data=None, expect_error=None,
|
||||
api_version=None):
|
||||
if data is not None:
|
||||
data = json.dumps(data)
|
||||
endpoint = 'http://127.0.0.1:5050' + endpoint
|
||||
endpoint = self.ROOT_URL + endpoint
|
||||
headers = {'X-Auth-Token': 'token'}
|
||||
if api_version:
|
||||
headers[main._VERSION_HEADER] = '%d.%d' % api_version
|
||||
res = getattr(requests, method.lower())(endpoint, data=data,
|
||||
headers=headers)
|
||||
if not expect_errors:
|
||||
if expect_error:
|
||||
self.assertEqual(expect_error, res.status_code)
|
||||
else:
|
||||
res.raise_for_status()
|
||||
return res
|
||||
|
||||
@ -111,6 +117,21 @@ class Base(base.NodeTest):
|
||||
def call_continue(self, data):
|
||||
return self.call('post', '/v1/continue', data=data).json()
|
||||
|
||||
def call_add_rule(self, data):
|
||||
return self.call('post', '/v1/rules', data=data).json()
|
||||
|
||||
def call_list_rules(self):
|
||||
return self.call('get', '/v1/rules').json()['rules']
|
||||
|
||||
def call_delete_rules(self):
|
||||
self.call('delete', '/v1/rules')
|
||||
|
||||
def call_delete_rule(self, uuid):
|
||||
self.call('delete', '/v1/rules/' + uuid)
|
||||
|
||||
def call_get_rule(self, uuid):
|
||||
return self.call('get', '/v1/rules/' + uuid).json()
|
||||
|
||||
|
||||
class Test(Base):
|
||||
def test_bmc(self):
|
||||
@ -163,6 +184,92 @@ class Test(Base):
|
||||
status = self.call_get_status(self.uuid)
|
||||
self.assertEqual({'finished': True, 'error': None}, status)
|
||||
|
||||
def test_rules_api(self):
|
||||
res = self.call_list_rules()
|
||||
self.assertEqual([], res)
|
||||
|
||||
rule = {'conditions': [],
|
||||
'actions': [{'action': 'fail', 'message': 'boom'}],
|
||||
'description': 'Cool actions'}
|
||||
res = self.call_add_rule(rule)
|
||||
self.assertTrue(res['uuid'])
|
||||
rule['uuid'] = res['uuid']
|
||||
rule['links'] = res['links']
|
||||
self.assertEqual(rule, res)
|
||||
|
||||
res = self.call('get', rule['links'][0]['href']).json()
|
||||
self.assertEqual(rule, res)
|
||||
|
||||
res = self.call_list_rules()
|
||||
self.assertEqual(rule['links'], res[0].pop('links'))
|
||||
self.assertEqual([{'uuid': rule['uuid'],
|
||||
'description': 'Cool actions'}],
|
||||
res)
|
||||
|
||||
res = self.call_get_rule(rule['uuid'])
|
||||
self.assertEqual(rule, res)
|
||||
|
||||
self.call_delete_rule(rule['uuid'])
|
||||
res = self.call_list_rules()
|
||||
self.assertEqual([], res)
|
||||
|
||||
links = rule.pop('links')
|
||||
del rule['uuid']
|
||||
for _ in range(3):
|
||||
self.call_add_rule(rule)
|
||||
|
||||
res = self.call_list_rules()
|
||||
self.assertEqual(3, len(res))
|
||||
|
||||
self.call_delete_rules()
|
||||
res = self.call_list_rules()
|
||||
self.assertEqual([], res)
|
||||
|
||||
self.call('get', links[0]['href'], expect_error=404)
|
||||
self.call('delete', links[0]['href'], expect_error=404)
|
||||
|
||||
def test_introspection_rules(self):
|
||||
self.node.extra['bar'] = 'foo'
|
||||
rules = [
|
||||
{
|
||||
'conditions': [
|
||||
{'field': 'memory_mb', 'op': 'eq', 'value': 12288},
|
||||
{'field': 'local_gb', 'op': 'gt', 'value': 400},
|
||||
{'field': 'local_gb', 'op': 'lt', 'value': 500},
|
||||
],
|
||||
'actions': [
|
||||
{'action': 'set-attribute', 'path': '/extra/foo',
|
||||
'value': 'bar'}
|
||||
]
|
||||
},
|
||||
{
|
||||
'conditions': [
|
||||
{'field': 'memory_mb', 'op': 'ge', 'value': 100500},
|
||||
],
|
||||
'actions': [
|
||||
{'action': 'set-attribute', 'path': '/extra/bar',
|
||||
'value': 'foo'},
|
||||
{'action': 'fail', 'message': 'boom'}
|
||||
]
|
||||
}
|
||||
]
|
||||
for rule in rules:
|
||||
self.call_add_rule(rule)
|
||||
|
||||
self.call_introspect(self.uuid)
|
||||
eventlet.greenthread.sleep(DEFAULT_SLEEP)
|
||||
self.call_continue(self.data)
|
||||
eventlet.greenthread.sleep(DEFAULT_SLEEP)
|
||||
|
||||
# clean up for second rule
|
||||
self.cli.node.update.assert_any_call(
|
||||
self.uuid,
|
||||
[{'op': 'remove', 'path': '/extra/bar'}])
|
||||
# applying first rule
|
||||
self.cli.node.update.assert_any_call(
|
||||
self.uuid,
|
||||
[{'op': 'add', 'path': '/extra/foo', 'value': 'bar'}])
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def mocked_server():
|
||||
|
@ -27,6 +27,7 @@ from ironic_inspector import node_cache
|
||||
from ironic_inspector.plugins import base as plugins_base
|
||||
from ironic_inspector.plugins import example as example_plugin
|
||||
from ironic_inspector import process
|
||||
from ironic_inspector import rules
|
||||
from ironic_inspector.test import base as test_base
|
||||
from ironic_inspector import utils
|
||||
from oslo_config import cfg
|
||||
@ -113,11 +114,11 @@ class TestApiContinue(BaseAPITest):
|
||||
def test_continue(self, process_mock):
|
||||
# should be ignored
|
||||
CONF.set_override('auth_strategy', 'keystone')
|
||||
process_mock.return_value = [42]
|
||||
process_mock.return_value = {'result': 42}
|
||||
res = self.app.post('/v1/continue', data='"JSON"')
|
||||
self.assertEqual(200, res.status_code)
|
||||
process_mock.assert_called_once_with("JSON")
|
||||
self.assertEqual(b'[42]', res.data)
|
||||
self.assertEqual({"result": 42}, json.loads(res.data.decode()))
|
||||
|
||||
@mock.patch.object(process, 'process', autospec=True)
|
||||
def test_continue_failed(self, process_mock):
|
||||
@ -181,6 +182,90 @@ class TestApiGetData(BaseAPITest):
|
||||
self.assertEqual(404, res.status_code)
|
||||
|
||||
|
||||
class TestApiRules(BaseAPITest):
|
||||
@mock.patch.object(rules, 'get_all')
|
||||
def test_get_all(self, get_all_mock):
|
||||
get_all_mock.return_value = [
|
||||
mock.Mock(spec=rules.IntrospectionRule,
|
||||
**{'as_dict.return_value': {'uuid': 'foo'}}),
|
||||
mock.Mock(spec=rules.IntrospectionRule,
|
||||
**{'as_dict.return_value': {'uuid': 'bar'}}),
|
||||
]
|
||||
|
||||
res = self.app.get('/v1/rules')
|
||||
self.assertEqual(200, res.status_code)
|
||||
self.assertEqual(
|
||||
{
|
||||
'rules': [{'uuid': 'foo',
|
||||
'links': [
|
||||
{'href': '/v1/rules/foo', 'rel': 'self'}
|
||||
]},
|
||||
{'uuid': 'bar',
|
||||
'links': [
|
||||
{'href': '/v1/rules/bar', 'rel': 'self'}
|
||||
]}]
|
||||
},
|
||||
json.loads(res.data.decode('utf-8')))
|
||||
get_all_mock.assert_called_once_with()
|
||||
for m in get_all_mock.return_value:
|
||||
m.as_dict.assert_called_with(short=True)
|
||||
|
||||
@mock.patch.object(rules, 'delete_all')
|
||||
def test_delete_all(self, delete_all_mock):
|
||||
res = self.app.delete('/v1/rules')
|
||||
self.assertEqual(204, res.status_code)
|
||||
delete_all_mock.assert_called_once_with()
|
||||
|
||||
@mock.patch.object(rules, 'create', autospec=True)
|
||||
def test_create(self, create_mock):
|
||||
data = {'uuid': self.uuid,
|
||||
'conditions': 'cond',
|
||||
'actions': 'act'}
|
||||
exp = data.copy()
|
||||
exp['description'] = None
|
||||
create_mock.return_value = mock.Mock(spec=rules.IntrospectionRule,
|
||||
**{'as_dict.return_value': exp})
|
||||
|
||||
res = self.app.post('/v1/rules', data=json.dumps(data))
|
||||
self.assertEqual(200, res.status_code)
|
||||
create_mock.assert_called_once_with(conditions_json='cond',
|
||||
actions_json='act',
|
||||
uuid=self.uuid,
|
||||
description=None)
|
||||
self.assertEqual(exp, json.loads(res.data.decode('utf-8')))
|
||||
|
||||
@mock.patch.object(rules, 'create', autospec=True)
|
||||
def test_create_bad_uuid(self, create_mock):
|
||||
data = {'uuid': 'foo',
|
||||
'conditions': 'cond',
|
||||
'actions': 'act'}
|
||||
|
||||
res = self.app.post('/v1/rules', data=json.dumps(data))
|
||||
self.assertEqual(400, res.status_code)
|
||||
|
||||
@mock.patch.object(rules, 'get')
|
||||
def test_get_one(self, get_mock):
|
||||
get_mock.return_value = mock.Mock(spec=rules.IntrospectionRule,
|
||||
**{'as_dict.return_value':
|
||||
{'uuid': 'foo'}})
|
||||
|
||||
res = self.app.get('/v1/rules/' + self.uuid)
|
||||
self.assertEqual(200, res.status_code)
|
||||
self.assertEqual({'uuid': 'foo',
|
||||
'links': [
|
||||
{'href': '/v1/rules/foo', 'rel': 'self'}
|
||||
]},
|
||||
json.loads(res.data.decode('utf-8')))
|
||||
get_mock.assert_called_once_with(self.uuid)
|
||||
get_mock.return_value.as_dict.assert_called_once_with(short=False)
|
||||
|
||||
@mock.patch.object(rules, 'delete')
|
||||
def test_delete_one(self, delete_mock):
|
||||
res = self.app.delete('/v1/rules/' + self.uuid)
|
||||
self.assertEqual(204, res.status_code)
|
||||
delete_mock.assert_called_once_with(self.uuid)
|
||||
|
||||
|
||||
class TestApiMisc(BaseAPITest):
|
||||
@mock.patch.object(node_cache, 'get_node', autospec=True)
|
||||
def test_404_expected(self, get_mock):
|
||||
|
@ -524,3 +524,20 @@ class TestUpdate(test_base.NodeTest):
|
||||
|
||||
self.ironic.port.delete.assert_called_once_with('0')
|
||||
self.assertEqual(['mac1'], list(self.node_info.ports()))
|
||||
|
||||
|
||||
class TestNodeCacheGetByPath(test_base.NodeTest):
|
||||
def setUp(self):
|
||||
super(TestNodeCacheGetByPath, self).setUp()
|
||||
self.node = mock.Mock(spec=['uuid', 'properties'],
|
||||
properties={'answer': 42},
|
||||
uuid=self.uuid)
|
||||
self.node_info = node_cache.NodeInfo(uuid=self.uuid, started_at=0,
|
||||
node=self.node)
|
||||
|
||||
def test_get_by_path(self):
|
||||
self.assertEqual(self.uuid, self.node_info.get_by_path('/uuid'))
|
||||
self.assertEqual(self.uuid, self.node_info.get_by_path('uuid'))
|
||||
self.assertEqual(42, self.node_info.get_by_path('/properties/answer'))
|
||||
self.assertRaises(KeyError, self.node_info.get_by_path, '/foo')
|
||||
self.assertRaises(KeyError, self.node_info.get_by_path, '/extra/foo')
|
||||
|
ironic_inspector/test/test_plugins_base.py (new file, 44 lines)
@@ -0,0 +1,44 @@
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
# implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
from ironic_inspector.plugins import base
|
||||
from ironic_inspector.test import base as test_base
|
||||
|
||||
|
||||
class WithValidation(base.WithValidation):
|
||||
REQUIRED_PARAMS = {'x'}
|
||||
OPTIONAL_PARAMS = {'y', 'z'}
|
||||
|
||||
|
||||
class TestWithValidation(test_base.BaseTest):
|
||||
def setUp(self):
|
||||
super(TestWithValidation, self).setUp()
|
||||
self.test = WithValidation()
|
||||
|
||||
def test_ok(self):
|
||||
for x in (1, 0, '', False, True):
|
||||
self.test.validate({'x': x})
|
||||
self.test.validate({'x': 'x', 'y': 42})
|
||||
self.test.validate({'x': 'x', 'y': 42, 'z': False})
|
||||
|
||||
def test_required_missing(self):
|
||||
err_re = 'missing required parameter\(s\): x'
|
||||
self.assertRaisesRegexp(ValueError, err_re, self.test.validate, {})
|
||||
self.assertRaisesRegexp(ValueError, err_re, self.test.validate,
|
||||
{'x': None})
|
||||
self.assertRaisesRegexp(ValueError, err_re, self.test.validate,
|
||||
{'y': 1, 'z': 2})
|
||||
|
||||
def test_unexpected(self):
|
||||
self.assertRaisesRegexp(ValueError, 'unexpected parameter\(s\): foo',
|
||||
self.test.validate, {'foo': 'bar', 'x': 42})
|
ironic_inspector/test/test_plugins_rules.py (new file, 116 lines)
@@ -0,0 +1,116 @@
# Copyright 2015 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Tests for introspection rules plugins."""

import mock

from ironic_inspector import node_cache
from ironic_inspector.plugins import rules as rules_plugins
from ironic_inspector.test import base as test_base
from ironic_inspector import utils


TEST_SET = [(42, 42), ('42', 42), ('4.2', 4.2),
            (42, 41), ('42', 41), ('4.2', 4.0),
            (41, 42), ('41', 42), ('4.0', 4.2)]


class TestSimpleConditions(test_base.BaseTest):
    def test_validate(self):
        cond = rules_plugins.SimpleCondition()
        cond.validate({'value': 42})
        self.assertRaises(ValueError, cond.validate, {})

    def _test(self, cond, expected, value, ref):
        self.assertIs(expected, cond.check(None, value, {'value': ref}))

    def test_eq(self):
        cond = rules_plugins.EqCondition()
        for values, expected in zip(TEST_SET, [True] * 3 + [False] * 6):
            self._test(cond, expected, *values)
        self._test(cond, True, 'foo', 'foo')
        self._test(cond, False, 'foo', 'bar')

    def test_ne(self):
        cond = rules_plugins.NeCondition()
        for values, expected in zip(TEST_SET, [False] * 3 + [True] * 6):
            self._test(cond, expected, *values)
        self._test(cond, False, 'foo', 'foo')
        self._test(cond, True, 'foo', 'bar')

    def test_gt(self):
        cond = rules_plugins.GtCondition()
        for values, expected in zip(TEST_SET, [False] * 3 + [True] * 3
                                    + [False] * 3):
            self._test(cond, expected, *values)

    def test_ge(self):
        cond = rules_plugins.GeCondition()
        for values, expected in zip(TEST_SET, [True] * 6 + [False] * 3):
            self._test(cond, expected, *values)

    def test_le(self):
        cond = rules_plugins.LeCondition()
        for values, expected in zip(TEST_SET, [True] * 3 + [False] * 3
                                    + [True] * 3):
            self._test(cond, expected, *values)

    def test_lt(self):
        cond = rules_plugins.LtCondition()
        for values, expected in zip(TEST_SET, [False] * 6 + [True] * 3):
            self._test(cond, expected, *values)


class TestFailAction(test_base.BaseTest):
    act = rules_plugins.FailAction()

    def test_validate(self):
        self.act.validate({'message': 'boom'})
        self.assertRaises(ValueError, self.act.validate, {})

    def test_apply(self):
        self.assertRaisesRegexp(utils.Error, 'boom',
                                self.act.apply, None, {'message': 'boom'})


class TestSetAttributeAction(test_base.NodeTest):
    act = rules_plugins.SetAttributeAction()
    params = {'path': '/extra/value', 'value': 42}

    def test_validate(self):
        self.act.validate(self.params)
        self.assertRaises(ValueError, self.act.validate, {'value': 42})
        self.assertRaises(ValueError, self.act.validate,
                          {'path': '/extra/value'})

    @mock.patch.object(node_cache.NodeInfo, 'patch')
    def test_apply(self, mock_patch):
        self.act.apply(self.node_info, self.params)
        mock_patch.assert_called_once_with([{'op': 'add',
                                             'path': '/extra/value',
                                             'value': 42}])

    @mock.patch.object(node_cache.NodeInfo, 'patch')
    def test_rollback_with_existing(self, mock_patch):
        self.node.extra = {'value': 'value'}
        self.act.rollback(self.node_info, self.params)
        mock_patch.assert_called_once_with([{'op': 'remove',
                                             'path': '/extra/value'}])

    @mock.patch.object(node_cache.NodeInfo, 'patch')
    def test_rollback_no_existing(self, mock_patch):
        self.node.extra = {}
        self.act.rollback(self.node_info, self.params)
        self.assertFalse(mock_patch.called)
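For orientation, a rough sketch of what an out-of-tree condition plugin could look like against the interface these tests exercise (a RuleConditionPlugin subclass whose check(node_info, field, params) returns a boolean, plus the ALLOW_NONE attribute seen on the mocks). The class below is hypothetical and not part of this patch:

# Illustration only, not part of this patch: a hypothetical out-of-tree
# condition built against the interface the mocks above rely on.
from ironic_inspector.plugins import base as plugins_base


class PrefixCondition(plugins_base.RuleConditionPlugin):
    """Hypothetical condition: field value starts with params['value']."""

    def check(self, node_info, field, params, **kwargs):
        # 'field' is the value pulled out of the introspection data,
        # 'params' is the dictionary the condition was created with.
        return str(field).startswith(str(params['value']))

Such a class would then be registered under the ironic_inspector.rules.conditions entry point namespace shown in setup.cfg below.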
420
ironic_inspector/test/test_rules.py
Normal file
@ -0,0 +1,420 @@
# Copyright 2015 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Tests for introspection rules."""

import mock

from ironic_inspector import db
from ironic_inspector import node_cache
from ironic_inspector.plugins import base as plugins_base
from ironic_inspector import rules
from ironic_inspector.test import base as test_base
from ironic_inspector import utils


class BaseTest(test_base.BaseTest):
    def setUp(self):
        super(BaseTest, self).setUp()
        self.uuid = 'uuid'
        self.conditions_json = [
            {'op': 'eq', 'field': 'memory_mb', 'value': 1024},
            {'op': 'eq', 'field': 'local_gb', 'value': 60},
        ]
        self.actions_json = [
            {'action': 'fail', 'message': 'boom!'}
        ]

        self.data = {
            'memory_mb': 1024,
            'local_gb': 42,
        }
        self.node_info = node_cache.NodeInfo(uuid=self.uuid, started_at=42)


class TestCreateRule(BaseTest):
    def test_only_actions(self):
        rule = rules.create([], self.actions_json)
        rule_json = rule.as_dict()

        self.assertTrue(rule_json.pop('uuid'))
        self.assertEqual({'description': None,
                          'conditions': [],
                          'actions': self.actions_json},
                         rule_json)

    def test_duplicate_uuid(self):
        rules.create([], self.actions_json, uuid=self.uuid)
        self.assertRaisesRegexp(utils.Error, 'already exists',
                                rules.create, [], self.actions_json,
                                uuid=self.uuid)

    def test_with_conditions(self):
        rule = rules.create(self.conditions_json, self.actions_json)
        rule_json = rule.as_dict()

        self.assertTrue(rule_json.pop('uuid'))
        self.assertEqual({'description': None,
                          'conditions': self.conditions_json,
                          'actions': self.actions_json},
                         rule_json)

    def test_invalid_condition(self):
        del self.conditions_json[0]['op']

        self.assertRaisesRegexp(utils.Error,
                                'Validation failed for conditions',
                                rules.create,
                                self.conditions_json, self.actions_json)

        self.conditions_json[0]['op'] = 'foobar'

        self.assertRaisesRegexp(utils.Error,
                                'Validation failed for conditions',
                                rules.create,
                                self.conditions_json, self.actions_json)

    def test_invalid_condition_field(self):
        self.conditions_json[0]['field'] = '!*!'

        self.assertRaisesRegexp(utils.Error,
                                'Unable to parse field JSON path',
                                rules.create,
                                self.conditions_json, self.actions_json)

    def test_invalid_condition_parameters(self):
        self.conditions_json[0]['foo'] = 'bar'

        self.assertRaisesRegexp(utils.Error,
                                'Invalid parameters for operator',
                                rules.create,
                                self.conditions_json, self.actions_json)

    def test_no_actions(self):
        self.assertRaisesRegexp(utils.Error,
                                'Validation failed for actions',
                                rules.create,
                                self.conditions_json, [])

    def test_invalid_action(self):
        del self.actions_json[0]['action']

        self.assertRaisesRegexp(utils.Error,
                                'Validation failed for actions',
                                rules.create,
                                self.conditions_json, self.actions_json)

        self.actions_json[0]['action'] = 'foobar'

        self.assertRaisesRegexp(utils.Error,
                                'Validation failed for actions',
                                rules.create,
                                self.conditions_json, self.actions_json)

    def test_invalid_action_parameters(self):
        self.actions_json[0]['foo'] = 'bar'

        self.assertRaisesRegexp(utils.Error,
                                'Invalid parameters for action',
                                rules.create,
                                self.conditions_json, self.actions_json)


class TestGetRule(BaseTest):
    def setUp(self):
        super(TestGetRule, self).setUp()
        rules.create(self.conditions_json, self.actions_json, uuid=self.uuid)

    def test_get(self):
        rule_json = rules.get(self.uuid).as_dict()

        self.assertTrue(rule_json.pop('uuid'))
        self.assertEqual({'description': None,
                          'conditions': self.conditions_json,
                          'actions': self.actions_json},
                         rule_json)

    def test_not_found(self):
        self.assertRaises(utils.Error, rules.get, 'foobar')

    def test_get_all(self):
        rules.create(self.conditions_json, self.actions_json, uuid='uuid2')
        self.assertEqual([self.uuid, 'uuid2'],
                         [r.as_dict()['uuid'] for r in rules.get_all()])


class TestDeleteRule(BaseTest):
    def setUp(self):
        super(TestDeleteRule, self).setUp()
        self.uuid2 = self.uuid + '-2'
        rules.create(self.conditions_json, self.actions_json, uuid=self.uuid)
        rules.create(self.conditions_json, self.actions_json, uuid=self.uuid2)

    def test_delete(self):
        rules.delete(self.uuid)

        self.assertEqual([(self.uuid2,)], db.model_query(db.Rule.uuid).all())
        self.assertFalse(db.model_query(db.RuleCondition)
                         .filter_by(rule=self.uuid).all())
        self.assertFalse(db.model_query(db.RuleAction)
                         .filter_by(rule=self.uuid).all())

    def test_delete_non_existing(self):
        self.assertRaises(utils.Error, rules.delete, 'foo')

    def test_delete_all(self):
        rules.delete_all()

        self.assertFalse(db.model_query(db.Rule).all())
        self.assertFalse(db.model_query(db.RuleCondition).all())
        self.assertFalse(db.model_query(db.RuleAction).all())


@mock.patch.object(plugins_base, 'rule_conditions_manager', autospec=True)
class TestCheckConditions(BaseTest):
    def setUp(self):
        super(TestCheckConditions, self).setUp()

        self.rule = rules.create(conditions_json=self.conditions_json,
                                 actions_json=self.actions_json)
        self.cond_mock = mock.Mock(spec=plugins_base.RuleConditionPlugin)
        self.cond_mock.ALLOW_NONE = False
        self.ext_mock = mock.Mock(spec=['obj'], obj=self.cond_mock)

    def test_ok(self, mock_ext_mgr):
        mock_ext_mgr.return_value.__getitem__.return_value = self.ext_mock
        self.cond_mock.check.return_value = True

        res = self.rule.check_conditions(self.node_info, self.data)

        self.cond_mock.check.assert_any_call(self.node_info, 1024,
                                             {'value': 1024})
        self.cond_mock.check.assert_any_call(self.node_info, 42,
                                             {'value': 60})
        self.assertEqual(len(self.conditions_json),
                         self.cond_mock.check.call_count)
        self.assertTrue(res)

    def test_no_field(self, mock_ext_mgr):
        mock_ext_mgr.return_value.__getitem__.return_value = self.ext_mock
        self.cond_mock.check.return_value = True
        del self.data['local_gb']

        res = self.rule.check_conditions(self.node_info, self.data)

        self.cond_mock.check.assert_called_once_with(self.node_info, 1024,
                                                     {'value': 1024})
        self.assertFalse(res)

    def test_no_field_none_allowed(self, mock_ext_mgr):
        mock_ext_mgr.return_value.__getitem__.return_value = self.ext_mock
        self.cond_mock.ALLOW_NONE = True
        self.cond_mock.check.return_value = True
        del self.data['local_gb']

        res = self.rule.check_conditions(self.node_info, self.data)

        self.cond_mock.check.assert_any_call(self.node_info, 1024,
                                             {'value': 1024})
        self.cond_mock.check.assert_any_call(self.node_info, None,
                                             {'value': 60})
        self.assertEqual(len(self.conditions_json),
                         self.cond_mock.check.call_count)
        self.assertTrue(res)

    def test_fail(self, mock_ext_mgr):
        mock_ext_mgr.return_value.__getitem__.return_value = self.ext_mock
        self.cond_mock.check.return_value = False

        res = self.rule.check_conditions(self.node_info, self.data)

        self.cond_mock.check.assert_called_once_with(self.node_info, 1024,
                                                     {'value': 1024})
        self.assertFalse(res)


class TestCheckConditionsMultiple(BaseTest):
    def setUp(self):
        super(TestCheckConditionsMultiple, self).setUp()

        self.conditions_json = [
            {'op': 'eq', 'field': 'interfaces[*].ip', 'value': '1.2.3.4'}
        ]

    def _build_data(self, ips):
        return {
            'interfaces': [
                {'ip': ip} for ip in ips
            ]
        }

    def test_default(self):
        rule = rules.create(conditions_json=self.conditions_json,
                            actions_json=self.actions_json)
        data_set = [
            (['1.1.1.1', '1.2.3.4', '1.3.2.2'], True),
            (['1.2.3.4'], True),
            (['1.1.1.1', '1.3.2.2'], False),
            (['1.2.3.4', '1.3.2.2'], True),
        ]
        for ips, result in data_set:
            data = self._build_data(ips)
            self.assertIs(result, rule.check_conditions(self.node_info, data),
                          data)

    def test_any(self):
        self.conditions_json[0]['multiple'] = 'any'
        rule = rules.create(conditions_json=self.conditions_json,
                            actions_json=self.actions_json)
        data_set = [
            (['1.1.1.1', '1.2.3.4', '1.3.2.2'], True),
            (['1.2.3.4'], True),
            (['1.1.1.1', '1.3.2.2'], False),
            (['1.2.3.4', '1.3.2.2'], True),
        ]
        for ips, result in data_set:
            data = self._build_data(ips)
            self.assertIs(result, rule.check_conditions(self.node_info, data),
                          data)

    def test_all(self):
        self.conditions_json[0]['multiple'] = 'all'
        rule = rules.create(conditions_json=self.conditions_json,
                            actions_json=self.actions_json)
        data_set = [
            (['1.1.1.1', '1.2.3.4', '1.3.2.2'], False),
            (['1.2.3.4'], True),
            (['1.1.1.1', '1.3.2.2'], False),
            (['1.2.3.4', '1.3.2.2'], False),
        ]
        for ips, result in data_set:
            data = self._build_data(ips)
            self.assertIs(result, rule.check_conditions(self.node_info, data),
                          data)

    def test_first(self):
        self.conditions_json[0]['multiple'] = 'first'
        rule = rules.create(conditions_json=self.conditions_json,
                            actions_json=self.actions_json)
        data_set = [
            (['1.1.1.1', '1.2.3.4', '1.3.2.2'], False),
            (['1.2.3.4'], True),
            (['1.1.1.1', '1.3.2.2'], False),
            (['1.2.3.4', '1.3.2.2'], True),
        ]
        for ips, result in data_set:
            data = self._build_data(ips)
            self.assertIs(result, rule.check_conditions(self.node_info, data),
                          data)


@mock.patch.object(plugins_base, 'rule_actions_manager', autospec=True)
class TestApplyActions(BaseTest):
    def setUp(self):
        super(TestApplyActions, self).setUp()
        self.actions_json.append({'action': 'example'})

        self.rule = rules.create(conditions_json=self.conditions_json,
                                 actions_json=self.actions_json)
        self.act_mock = mock.Mock(spec=plugins_base.RuleActionPlugin)
        self.ext_mock = mock.Mock(spec=['obj'], obj=self.act_mock)

    def test_apply(self, mock_ext_mgr):
        mock_ext_mgr.return_value.__getitem__.return_value = self.ext_mock

        self.rule.apply_actions(self.node_info)

        self.act_mock.apply.assert_any_call(self.node_info,
                                            {'message': 'boom!'})
        self.act_mock.apply.assert_any_call(self.node_info, {})
        self.assertEqual(len(self.actions_json),
                         self.act_mock.apply.call_count)
        self.assertFalse(self.act_mock.rollback.called)

    def test_rollback(self, mock_ext_mgr):
        mock_ext_mgr.return_value.__getitem__.return_value = self.ext_mock

        self.rule.apply_actions(self.node_info, rollback=True)

        self.act_mock.rollback.assert_any_call(self.node_info,
                                               {'message': 'boom!'})
        self.act_mock.rollback.assert_any_call(self.node_info, {})
        self.assertEqual(len(self.actions_json),
                         self.act_mock.rollback.call_count)
        self.assertFalse(self.act_mock.apply.called)


@mock.patch.object(rules, 'get_all', autospec=True)
class TestApply(BaseTest):
    def setUp(self):
        super(TestApply, self).setUp()
        self.rules = [mock.Mock(spec=rules.IntrospectionRule),
                      mock.Mock(spec=rules.IntrospectionRule)]

    def test_no_rules(self, mock_get_all):
        mock_get_all.return_value = []

        rules.apply(self.node_info, self.data)

    def test_no_actions(self, mock_get_all):
        mock_get_all.return_value = self.rules
        for idx, rule in enumerate(self.rules):
            rule.check_conditions.return_value = not bool(idx)

        rules.apply(self.node_info, self.data)

        for idx, rule in enumerate(self.rules):
            rule.check_conditions.assert_called_once_with(self.node_info,
                                                          self.data)
            rule.apply_actions.assert_called_once_with(
                self.node_info, rollback=bool(idx))

    def test_actions(self, mock_get_all):
        mock_get_all.return_value = self.rules
        for idx, rule in enumerate(self.rules):
            rule.check_conditions.return_value = not bool(idx)

        rules.apply(self.node_info, self.data)

        for idx, rule in enumerate(self.rules):
            rule.check_conditions.assert_called_once_with(self.node_info,
                                                          self.data)
            rule.apply_actions.assert_called_once_with(
                self.node_info, rollback=bool(idx))

    def test_no_rollback(self, mock_get_all):
        mock_get_all.return_value = self.rules
        for rule in self.rules:
            rule.check_conditions.return_value = True

        rules.apply(self.node_info, self.data)

        for rule in self.rules:
            rule.check_conditions.assert_called_once_with(self.node_info,
                                                          self.data)
            rule.apply_actions.assert_called_once_with(
                self.node_info, rollback=False)

    def test_only_rollback(self, mock_get_all):
        mock_get_all.return_value = self.rules
        for rule in self.rules:
            rule.check_conditions.return_value = False

        rules.apply(self.node_info, self.data)

        for rule in self.rules:
            rule.check_conditions.assert_called_once_with(self.node_info,
                                                          self.data)
            rule.apply_actions.assert_called_once_with(
                self.node_info, rollback=True)
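Taken together, these tests cover the whole rule lifecycle: create, check conditions, then apply or roll back actions. A condensed sketch of that flow, using only calls that appear in the tests above (the concrete values are illustrative, not taken from this patch):

# Condensed from the tests above; the data and parameter values are
# illustrative only.
from ironic_inspector import node_cache
from ironic_inspector import rules

conditions_json = [{'op': 'eq', 'field': 'memory_mb', 'value': 1024}]
actions_json = [{'action': 'set-attribute',
                 'path': '/extra/value', 'value': 42}]

rule = rules.create(conditions_json, actions_json)   # store a new rule
node_info = node_cache.NodeInfo(uuid='uuid', started_at=42)
data = {'memory_mb': 1024, 'local_gb': 42}

if rule.check_conditions(node_info, data):
    rule.apply_actions(node_info)                     # conditions hold
else:
    rule.apply_actions(node_info, rollback=True)      # clean up instead

# or let the module iterate over all stored rules at once:
rules.apply(node_info, data)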
requirements.txt
@ -4,6 +4,8 @@
Babel>=1.3
eventlet>=0.17.4
Flask<1.0,>=0.10
jsonpath-rw>=1.2.0,<2.0
jsonschema>=2.0.0,<3.0.0,!=2.5.0
keystonemiddleware>=2.0.0
pbr<2.0,>=1.6
python-ironicclient>=0.8.0
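The new jsonpath-rw requirement backs the JSON-path handling of condition fields exercised above (for example the 'interfaces[*].ip' field and the 'Unable to parse field JSON path' error). A small standalone illustration, independent of this patch:

# Standalone illustration of the jsonpath-rw dependency added above;
# inspector evaluates such paths against the introspection data when
# resolving a condition's 'field'.
from jsonpath_rw import parse

path = parse('interfaces[*].ip')
data = {'interfaces': [{'ip': '1.1.1.1'}, {'ip': '1.2.3.4'}]}
print([match.value for match in path.find(data)])  # ['1.1.1.1', '1.2.3.4']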
11
setup.cfg
@ -32,6 +32,17 @@ ironic_inspector.hooks.processing =
    root_device_hint = ironic_inspector.plugins.raid_device:RaidDeviceDetection
ironic_inspector.hooks.node_not_found =
    example = ironic_inspector.plugins.example:example_not_found_hook
ironic_inspector.rules.conditions =
    eq = ironic_inspector.plugins.rules:EqCondition
    lt = ironic_inspector.plugins.rules:LtCondition
    gt = ironic_inspector.plugins.rules:GtCondition
    le = ironic_inspector.plugins.rules:LeCondition
    ge = ironic_inspector.plugins.rules:GeCondition
    ne = ironic_inspector.plugins.rules:NeCondition
ironic_inspector.rules.actions =
    example = ironic_inspector.plugins.example:ExampleRuleAction
    fail = ironic_inspector.plugins.rules:FailAction
    set-attribute = ironic_inspector.plugins.rules:SetAttributeAction
oslo.config.opts =
    ironic_inspector = ironic_inspector.conf:list_opts
    ironic_inspector.common.swift = ironic_inspector.common.swift:list_opts
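These entry points are what the mocked rule_conditions_manager / rule_actions_manager stand in for in the tests. A rough sketch of resolving one of them by name, assuming plain stevedore loading (the exact manager configuration inside inspector may differ):

# Rough sketch, not taken from this patch: loading the 'eq' condition
# declared above by its entry point name via stevedore.
from stevedore import extension

mgr = extension.ExtensionManager('ironic_inspector.rules.conditions',
                                 invoke_on_load=True)
eq_condition = mgr['eq'].obj                         # an EqCondition instance
print(eq_condition.check(None, 42, {'value': 42}))   # True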