[feat] DECKHAND-28: Document pre-validation logic and API integration
This commit constitutes 1 of 2 monolithic ports from Github. The following
major changes have been made:

- Created schemas for validating different types of documents (control and
  document schemas), including:
  * certificate key
  * certificate
  * data schema
  * document
  * layering policy
  * passphrase
  * validation policy
- Implemented pre-validation logic which validates that each type of document
  conforms to the correct schema specifications
- Implemented views for APIs -- this allows views to change the DB data to
  conform with API specifications
- Implemented relevant unit tests
- Implemented the functional testing foundation

Change-Id: I83582cc26ffef91fbe95d2f5f437f82d6fef6aa9
parent 9bbc767b0a
commit e1446bb9e1
@@ -2,6 +2,6 @@
 test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
     OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
     OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
-    ${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
+    ${PYTHON:-python} -m subunit.run discover -t ./ ${OS_TEST_PATH:-./deckhand/tests} $LISTOPT $IDOPTION
 test_id_option=--load-list $IDFILE
 test_list_option=--list
AUTHORS (1 addition)
@@ -1,4 +1,5 @@
 Alan Meadows <alan.meadows@gmail.com>
 Felipe Monteiro <felipe.monteiro@att.com>
+Felipe Monteiro <fmontei@users.noreply.github.com>
 Mark Burnett <mark.m.burnett@gmail.com>
 Scott Hussey <sh8121@att.com>
ChangeLog (66 additions)
@@ -1,6 +1,53 @@
CHANGES
=======

* Some integration with views/database
* Add validations to document db model
* Fix up _is_abstract in document_validation
* Updated document schema
* Resolved merge conflicts
* Clean up
* Refactor some code
* Fixed failing tests.g
* WIP: more changes, debugging, tests
* Fix unit tests
* Remove deprecated code, update deprecated schemas and add new schemas
* Add schema validation for validation policy
* Changed layers to type string in schemas
* f
* Add layering policy pre-validation schema
* Add layering policy pre-validation schema
* Refactor some code
* Add endpoint/tests for GET /revisions/{revision_id}
* Fix naming conflict error
* Add view abstraction layer for modifying DB data into view data
* Raise exception instead of return
* Updated /GET revisions response body
* Remove old docstring
* Update control README (with current response bodies, even though they're a WIP
* Return YAML response body
* Add endpoint for GET /revisions
* Use built-in oslo_db types for Columns serialized as dicts
* Finish retrieving documents by revision_id, including with filters
* Clean up
* Test and DB API changes
* Add Revision resource
* More tests for revisions-api. Fix minor bugs
* Clarify layering actions start from full parent data
* Add DELETE endpoint
* Skip validation for abstract documents & add unit tests
* Update schema validation to be internal validation
* Update schema/db model/db api to align with design document
* Add basic RBAC details to design document
* Update documents/revisions relationship/tables
* Update revision and document tables and add more unit tests
* temp
* Revisions database and API implementation
* Update API paths for consistency
* Add clarifications based on review
* Use safe_load_all instead of safe_load
* Add unit tests for db documents api
* Remove oslo_versionedobjects
* Change application/yaml to application/x-yaml
* Cleaned up some logic, added exception handling to document creation
* Add currently necessary oslo namespaces to oslo-config-generator conf file
@@ -10,8 +57,11 @@ CHANGES
* Added oslo_context-based context for oslo_db compatibility
* Update database documents schema
* Helper for generating versioned object automatically from dictionary payload
* Add description of substitution
* Update README
* Temporary change - do not commit
* Reference Layering section in layeringDefinition description
* Add overall layering description
* Initial DB API models implementation
* Added control (API) readme
* [WIP] Implement documents API
@@ -25,9 +75,25 @@ CHANGES
* Add additional documentation
* Add jsonschema validation to Deckhand
* Initial engine framework
* fix typo
* Provide a separate rendered-documents endpoint
* Move reporting of validation status
* Add samples for remaining endpoints
* Address some initial review comments
* WIP: Add initial design document
* Fix incorrect comment
* Deckhand initial ORM implementation
* Deckhand initial ORM implementation
* Add kind param to SchemaVersion class
* Change apiVersion references to schemaVersion
* Remove apiVersion attribute from substitutions.src attributes
* Remove apiVersion attribute from substitutions.src attributes
* Update default_schema with our updated schema definition
* Trivial fix to default_schema
* Use regexes for jsonschema pre-validation
* Add additional documentation
* Add jsonschema validation to Deckhand
* Initial engine framework
* Add oslo.log integration
* DECKHAND-10: Add Barbican integration to Deckhand
* Update ChangeLog
@@ -35,4 +35,4 @@ To run locally in a development environment::

     $ . /var/tmp/deckhand/bin/activate
-    $ sudo pip install .
+    $ sudo python setup.py install
-    $ uwsgi --http :9000 -w deckhand.deckhand --callable deckhand_callable --enable-threads -L
+    $ uwsgi --http :9000 -w deckhand.cmd --callable deckhand_callable --enable-threads -L
@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-from .control import api
+from deckhand.control import api


 def start_deckhand():
@@ -161,7 +161,7 @@ Document creation can be tested locally using (from root deckhand directory):

     $ curl -i -X POST localhost:9000/api/v1.0/documents \
          -H "Content-Type: application/x-yaml" \
-         --data-binary "@deckhand/tests/unit/resources/sample.yaml"
+         --data-binary "@deckhand/tests/unit/resources/sample_document.yaml"

     # revision_id copy/pasted from previous response.
     $ curl -i -X GET localhost:9000/api/v1.0/revisions/0e99c8b9-bab4-4fc7-8405-7dbd22c33a30/documents
@@ -50,13 +50,15 @@ class DocumentsResource(api_base.BaseResource):
         # All concrete documents in the payload must successfully pass their
         # JSON schema validations. Otherwise raise an error.
         try:
-            for doc in documents:
-                document_validation.DocumentValidation(doc).pre_validate()
-        except deckhand_errors.InvalidFormat as e:
+            validation_policies = document_validation.DocumentValidation(
+                documents).validate_all()
+        except (deckhand_errors.InvalidDocumentFormat,
+                deckhand_errors.UnknownDocumentFormat) as e:
             return self.return_error(resp, falcon.HTTP_400, message=e)

         try:
-            created_documents = db_api.documents_create(documents)
+            created_documents = db_api.documents_create(
+                documents, validation_policies)
         except db_exc.DBDuplicateEntry as e:
             return self.return_error(resp, falcon.HTTP_409, message=e)
         except Exception as e:
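The handler's flow (validate every document up front, then persist documents and validation policies together under one new revision, as `db_api.documents_create` does later in this commit) can be sketched in plain Python. This is an illustrative reduction only: the names, dict shapes, and the in-memory "create" are assumptions, not Deckhand's real API.

```python
import uuid

def documents_create(documents, validation_policies):
    # Stamp every document and validation policy with the id of a freshly
    # created revision; only the plain documents are returned to the caller.
    all_docs = documents + validation_policies
    revision_id = str(uuid.uuid4())
    for doc in all_docs:
        doc['revision_id'] = revision_id
    return documents

docs = [{'name': 'site-doc'}]
policies = [{'name': 'deckhand-schema-validation'}]
created = documents_create(docs, policies)
# Documents and validation policies share the same revision id.
assert created[0]['revision_id'] == policies[0]['revision_id']
```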
@@ -16,32 +16,53 @@ from deckhand.control import common


 class ViewBuilder(common.ViewBuilder):
     """Model revision API responses as a python dictionary."""

     _collection_name = 'revisions'

     def list(self, revisions):
         resp_body = {
             'count': len(revisions),
-            'next': None,
-            'prev': None,
             'results': []
         }

         for revision in revisions:
             result = {}
             for attr in ('id', 'created_at'):
                 result[common.to_camel_case(attr)] = revision[attr]
             result['count'] = len(revision.pop('documents'))
             resp_body['results'].append(result)

         return resp_body

     def show(self, revision):
+        """Generate view for showing revision details.
+
+        Each revision's documents should only be validation policies.
+        """
+        validation_policies = []
+        success_status = 'success'
+
+        for vp in revision['validation_policies']:
+            validation_policy = {}
+            validation_policy['name'] = vp.get('name')
+            validation_policy['url'] = self._gen_url(vp)
+            try:
+                validation_policy['status'] = vp['data']['validations'][0][
+                    'status']
+            except KeyError:
+                validation_policy['status'] = 'unknown'
+
+            validation_policies.append(validation_policy)
+
+            if validation_policy['status'] != 'success':
+                success_status = 'failed'
+
         return {
             'id': revision.get('id'),
             'createdAt': revision.get('created_at'),
             'url': self._gen_url(revision),
             # TODO: Not yet implemented.
-            'validationPolicies': [],
+            'validationPolicies': validation_policies,
+            'status': success_status
         }
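The `list()` view above maps DB attribute names such as `created_at` onto camelCase API keys via `common.to_camel_case`. A minimal sketch of what such a helper presumably does (the real implementation lives in `deckhand.control.common` and may differ):

```python
def to_camel_case(s):
    # 'created_at' -> 'createdAt'; the first word stays lowercase.
    parts = s.split('_')
    return parts[0] + ''.join(word.title() for word in parts[1:])

assert to_camel_case('created_at') == 'createdAt'
assert to_camel_case('id') == 'id'
```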
@@ -38,6 +38,7 @@ import sqlalchemy.sql as sa_sql

 from deckhand.db.sqlalchemy import models
 from deckhand import errors
+from deckhand import types
 from deckhand import utils

 sa_logger = None
@@ -111,18 +112,31 @@ def drop_db():
     models.unregister_models(get_engine())


-def documents_create(documents, session=None):
-    """Create a set of documents."""
-    created_docs = [document_create(doc, session) for doc in documents]
-    return created_docs
+def documents_create(documents, validation_policies, session=None):
+    session = session or get_session()
+
+    documents_created = _documents_create(documents, session)
+    val_policies_created = _documents_create(validation_policies, session)
+    all_docs_created = documents_created + val_policies_created
+
+    if all_docs_created:
+        revision = revision_create()
+        for doc in all_docs_created:
+            with session.begin():
+                doc['revision_id'] = revision['id']
+                doc.save(session=session)
+
+    return [d.to_dict() for d in documents_created]


-def documents_create(values_list, session=None):
+def _documents_create(values_list, session=None):
     """Create a set of documents and associated schema.

     If no changes are detected, a new revision will not be created. This
     allows services to periodically re-register their schemas without
     creating unnecessary revisions.

     :param values_list: List of documents to be saved.
     """
     values_list = copy.deepcopy(values_list)
     session = session or get_session()
@@ -138,17 +152,24 @@ def documents_create(values_list, session=None):
         return True
     return False

+    def _get_model(schema):
+        if schema == types.LAYERING_POLICY_SCHEMA:
+            return models.LayeringPolicy()
+        elif schema == types.VALIDATION_POLICY_SCHEMA:
+            return models.ValidationPolicy()
+        else:
+            return models.Document()
+
     def _document_create(values):
-        document = models.Document()
+        document = _get_model(values['schema'])
         with session.begin():
             document.update(values)
-            document.save(session=session)
-        return document.to_dict()
+        return document

     for values in values_list:
         values['_metadata'] = values.pop('metadata')
         values['name'] = values['_metadata']['name']

     try:
         existing_document = document_get(
             raw_dict=True,
@@ -164,10 +185,7 @@ def documents_create(values_list, session=None):
         do_create = True

     if do_create:
-        revision = revision_create()
-
         for values in values_list:
-            values['revision_id'] = revision['id']
             doc = _document_create(values)
             documents_created.append(doc)

@@ -198,11 +216,13 @@ def revision_get(revision_id, session=None):
     :raises: RevisionNotFound if the revision was not found.
     """
     session = session or get_session()

     try:
         revision = session.query(models.Revision).filter_by(
             id=revision_id).one().to_dict()
     except sa_orm.exc.NoResultFound:
         raise errors.RevisionNotFound(revision=revision_id)

     return revision
@@ -39,7 +39,7 @@ BASE = declarative.declarative_base()
 class DeckhandBase(models.ModelBase, models.TimestampMixin):
     """Base class for Deckhand Models."""

-    __table_args__ = {'mysql_engine': 'InnoDB', 'mysql_charset': 'utf8'}
+    __table_args__ = {'mysql_engine': 'Postgre', 'mysql_charset': 'utf8'}
     __table_initialized__ = False
     __protected_attributes__ = set([
         "created_at", "updated_at", "deleted_at", "deleted"])
@@ -70,7 +70,12 @@ class DeckhandBase(models.ModelBase, models.TimestampMixin):
     def items(self):
         return self.__dict__.items()

-    def to_dict(self):
+    def to_dict(self, raw_dict=False):
+        """Convert the object into dictionary format.
+
+        :param raw_dict: if True, returns unmodified data; else returns data
+            expected by users.
+        """
         d = self.__dict__.copy()
         # Remove private state instance, as it is not serializable and causes
         # CircularReference.
@@ -83,11 +88,16 @@ class DeckhandBase(models.ModelBase, models.TimestampMixin):
             if k in d and d[k]:
                 d[k] = d[k].isoformat()

+        # NOTE(fmontei): ``metadata`` is reserved by the DB, so ``_metadata``
+        # must be used to store document metadata information in the DB.
+        if not raw_dict and '_metadata' in self.keys():
+            d['metadata'] = d['_metadata']
+
         return d

     @staticmethod
-    def gen_unqiue_contraint(self, *fields):
-        constraint_name = 'ix_' + self.__class__.__name__.lower() + '_'
+    def gen_unqiue_contraint(*fields):
+        constraint_name = 'ix_' + DeckhandBase.__name__.lower() + '_'
         for field in fields:
             constraint_name = constraint_name + '_%s' % field
         return schema.UniqueConstraint(*fields, name=constraint_name)
@@ -98,56 +108,74 @@ class Revision(BASE, DeckhandBase):

     id = Column(String(36), primary_key=True,
                 default=lambda: str(uuid.uuid4()))
     parent_id = Column(Integer, ForeignKey('revisions.id'), nullable=True)
     child_id = Column(Integer, ForeignKey('revisions.id'), nullable=True)
     results = Column(oslo_types.JsonEncodedList(), nullable=True)

     documents = relationship("Document")
+    validation_policies = relationship("ValidationPolicy")

     def to_dict(self):
         d = super(Revision, self).to_dict()
         d['documents'] = [doc.to_dict() for doc in self.documents]
+        d['validation_policies'] = [
+            vp.to_dict() for vp in self.validation_policies]
         return d


-class Document(BASE, DeckhandBase):
+class DocumentMixin(object):
+    """Mixin class for sharing common columns across all document resources
+    such as documents themselves, layering policies and validation policies."""
+
+    name = Column(String(64), nullable=False)
+    schema = Column(String(64), nullable=False)
+    # NOTE: Do not define a maximum length for these JSON data below. However,
+    # this approach is not compatible with all database types.
+    # "metadata" is reserved, so use "_metadata" instead.
+    _metadata = Column(oslo_types.JsonEncodedDict(), nullable=False)
+    data = Column(oslo_types.JsonEncodedDict(), nullable=False)
+
+    @declarative.declared_attr
+    def revision_id(cls):
+        return Column(Integer, ForeignKey('revisions.id'), nullable=False)
+
+
+class Document(BASE, DeckhandBase, DocumentMixin):
     UNIQUE_CONSTRAINTS = ('schema', 'name', 'revision_id')
     __tablename__ = 'documents'
     __table_args__ = (DeckhandBase.gen_unqiue_contraint(*UNIQUE_CONSTRAINTS),)

     id = Column(String(36), primary_key=True,
                 default=lambda: str(uuid.uuid4()))
-    schema = Column(String(64), nullable=False)
-    name = Column(String(64), nullable=False)
-    # NOTE: Do not define a maximum length for these JSON data below. However,
-    # this approach is not compatible with all database types.
-    # "metadata" is reserved, so use "_metadata" instead.
-    _metadata = Column(oslo_types.JsonEncodedDict(), nullable=False)
-    data = Column(oslo_types.JsonEncodedDict(), nullable=False)
-    revision_id = Column(Integer, ForeignKey('revisions.id'), nullable=False)

-    def to_dict(self, raw_dict=False):
-        """Convert the ``Document`` object into a dictionary format.
-
-        :param raw_dict: if True, returns unmodified data; else returns data
-            expected by users.
-        :returns: dictionary format of ``Document`` object.
-        """
-        d = super(Document, self).to_dict()
-        # ``_metadata`` is used in the DB schema as ``metadata`` is reserved.
-        if not raw_dict:
-            d['metadata'] = d.pop('_metadata')
-        return d

+class LayeringPolicy(BASE, DeckhandBase, DocumentMixin):
+
+    # NOTE(fmontei): Only one layering policy can exist per revision, so
+    # enforce this constraint at the DB level.
+    UNIQUE_CONSTRAINTS = ('revision_id',)
+    __tablename__ = 'layering_policies'
+    __table_args__ = (DeckhandBase.gen_unqiue_contraint(*UNIQUE_CONSTRAINTS),)
+
+    id = Column(String(36), primary_key=True,
+                default=lambda: str(uuid.uuid4()))
+
+
+class ValidationPolicy(BASE, DeckhandBase, DocumentMixin):
+
+    UNIQUE_CONSTRAINTS = ('schema', 'name', 'revision_id')
+    __tablename__ = 'validation_policies'
+    __table_args__ = (DeckhandBase.gen_unqiue_contraint(*UNIQUE_CONSTRAINTS),)
+
+    id = Column(String(36), primary_key=True,
+                default=lambda: str(uuid.uuid4()))


 def register_models(engine):
     """Create database tables for all models with the given engine."""
-    models = [Document]
+    models = [Document, Revision, LayeringPolicy, ValidationPolicy]
     for model in models:
         model.metadata.create_all(engine)


 def unregister_models(engine):
     """Drop database tables for all models with the given engine."""
-    models = [Document]
+    models = [Document, Revision, LayeringPolicy, ValidationPolicy]
     for model in models:
         model.metadata.drop_all(engine)
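The `_metadata`/`metadata` renaming performed by `to_dict(raw_dict=...)` above can be sketched with a plain dict. This is illustrative only; the real method walks a SQLAlchemy model's `__dict__` and also serializes timestamps.

```python
def to_dict(row, raw_dict=False):
    d = dict(row)
    # "metadata" is reserved at the DB layer, so rows store "_metadata";
    # the user-facing view renames it back unless raw output is requested.
    if not raw_dict and '_metadata' in d:
        d['metadata'] = d.pop('_metadata')
    return d

row = {'name': 'doc-a', '_metadata': {'name': 'doc-a'}}
assert to_dict(row)['metadata'] == {'name': 'doc-a'}
assert '_metadata' in to_dict(row, raw_dict=True)
```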
@@ -14,11 +14,12 @@

 import jsonschema
 from oslo_log import log as logging
 import six

-from deckhand.engine.schema.v1_0 import default_policy_validation
-from deckhand.engine.schema.v1_0 import default_schema_validation
+from deckhand.engine.schema import base_schema
+from deckhand.engine.schema import v1_0
 from deckhand import errors
+from deckhand import factories
+from deckhand import types

 LOG = logging.getLogger(__name__)

@@ -26,74 +27,146 @@ LOG = logging.getLogger(__name__)
 class DocumentValidation(object):
     """Class for document validation logic for YAML files.

-    This class is responsible for performing built-in validations on Documents.
+    This class is responsible for validating YAML files according to their
+    schema.

-    :param data: YAML data that requires secrets to be validated, merged and
-        consolidated.
+    :param documents: Documents to be validated.
+    :type documents: List of dictionaries or dictionary.
     """

-    def __init__(self, data):
-        self.data = data
+    def __init__(self, documents):
+        if not isinstance(documents, (list, tuple)):
+            documents = [documents]
+
+        self.documents = documents

-    class SchemaVersion(object):
+    class SchemaType(object):
         """Class for retrieving correct schema for pre-validation on YAML.

         Retrieves the schema that corresponds to "apiVersion" in the YAML
         data. This schema is responsible for performing pre-validation on
         YAML data.
-
-        The built-in validation schemas that are always executed include:
-
-        - `deckhand-document-schema-validation`
-        - `deckhand-policy-validation`
         """

-        # TODO: Use the correct validation based on the Document's schema.
-        internal_validations = [
-            {'version': 'v1', 'fqn': 'deckhand-document-schema-validation',
-             'schema': default_schema_validation},
-            {'version': 'v1', 'fqn': 'deckhand-policy-validation',
-             'schema': default_policy_validation}]
+        # TODO(fmontei): Support dynamically registered schemas.
+        schema_versions_info = [
+            {'id': 'deckhand/CertificateKey',
+             'schema': v1_0.certificate_key_schema},
+            {'id': 'deckhand/Certificate',
+             'schema': v1_0.certificate_schema},
+            {'id': 'deckhand/DataSchema',
+             'schema': v1_0.data_schema},
+            # NOTE(fmontei): Fall back to the metadata's schema for validating
+            # generic documents.
+            {'id': 'metadata/Document',
+             'schema': v1_0.document_schema},
+            {'id': 'deckhand/LayeringPolicy',
+             'schema': v1_0.layering_schema},
+            {'id': 'deckhand/Passphrase',
+             'schema': v1_0.passphrase_schema},
+            {'id': 'deckhand/ValidationPolicy',
+             'schema': v1_0.validation_schema}]

-        def __init__(self, schema_version):
-            self.schema_version = schema_version
+        def __init__(self, data):
+            """Constructor for ``SchemaType``.
+
+            Retrieve the relevant schema based on the API version and schema
+            name contained in `document.schema` where `document` constitutes a
+            single document in a YAML payload.
+
+            :param api_version: The API version used for schema validation.
+            :param schema: The schema property in `document.schema`.
+            """
+            self.schema = self.get_schema(data)

-        @property
-        def schema(self):
-            # TODO: return schema based on Document's schema.
-            return [v['schema'] for v in self.internal_validations
-                    if v['version'] == self.schema_version][0].schema
+        def get_schema(self, data):
+            # Fall back to `document.metadata.schema` if the schema cannot be
+            # determined from `data.schema`.
+            for doc_property in [data['schema'], data['metadata']['schema']]:
+                schema = self._get_schema_by_property(doc_property)
+                if schema:
+                    return schema
+            return None
+
+        def _get_schema_by_property(self, doc_property):
+            schema_parts = doc_property.split('/')
+            doc_schema_identifier = '/'.join(schema_parts[:-1])
+
+            for schema in self.schema_versions_info:
+                if doc_schema_identifier == schema['id']:
+                    return schema['schema'].schema
+            return None

-    def pre_validate(self):
-        """Pre-validate that the YAML file is correctly formatted."""
-        self._validate_with_schema()
+    def validate_all(self):
+        """Pre-validate that the YAML file is correctly formatted.
+
+        All concrete documents in the revision successfully pass their JSON
+        schema validations. The result of the validation is stored under
+        the "deckhand-document-schema-validation" validation namespace for
+        a document revision.
+
+        Validation is broken up into 2 stages:
+
+        1) Validate that each document contains the basic building blocks
+           needed: "schema", "metadata" and "data" using a "base" schema.
+        2) Validate each specific document type (e.g. validation policy)
+           using a more detailed schema.
+
+        :returns: Dictionary mapping with keys being the unique name for each
+            document and values being the validations executed for that
+            document, including failed and succeeded validations.
+        """
+        internal_validation_docs = []
+        validation_policy_factory = factories.ValidationPolicyFactory()
+
+        for document in self.documents:
+            document_validations = self._validate_one(document)
+
+            deckhand_schema_validation = validation_policy_factory.gen(
+                types.DECKHAND_SCHEMA_VALIDATION, status='success')
+            internal_validation_docs.append(deckhand_schema_validation)
+
+        return internal_validation_docs

-    def _validate_with_schema(self):
-        # Validate the document using the document's ``schema``. Only validate
-        # concrete documents.
-        try:
-            abstract = self.data['metadata']['layeringDefinition'][
-                'abstract']
-            is_abstract = six.text_type(abstract).lower() == 'true'
-        except KeyError as e:
-            raise errors.InvalidFormat(
-                "Could not find 'abstract' property from document.")
-
-        # TODO: This should be done inside a different module.
-        if is_abstract:
-            LOG.info(
-                "Skipping validation for the document because it is abstract")
-            return
-
-        try:
-            schema_version = self.data['schema'].split('/')[-1]
-            doc_schema_version = self.SchemaVersion(schema_version)
-        except (AttributeError, IndexError, KeyError) as e:
-            raise errors.InvalidFormat(
-                'The provided schema is invalid or missing. Exception: '
-                '%s.' % e)
-        try:
-            jsonschema.validate(self.data, doc_schema_version.schema)
-        except jsonschema.exceptions.ValidationError as e:
-            raise errors.InvalidFormat('The provided YAML file is invalid. '
-                                       'Exception: %s.' % e.message)
+    def _validate_one(self, document):
+        # Subject every document to basic validation to verify that each
+        # main section is present (schema, metadata, data).
+        try:
+            jsonschema.validate(document, base_schema.schema)
+        except jsonschema.exceptions.ValidationError as e:
+            raise errors.InvalidDocumentFormat(
+                detail=e.message, schema=e.schema)
+
+        doc_schema_type = self.SchemaType(document)
+        if doc_schema_type.schema is None:
+            raise errors.UnknownDocumentFormat(
+                document_type=document['schema'])
+
+        # Perform more detailed validation on each document depending on
+        # its schema. If the document is abstract, validation errors are
+        # ignored.
+        try:
+            jsonschema.validate(document, doc_schema_type.schema)
+        except jsonschema.exceptions.ValidationError as e:
+            # TODO(fmontei): Use the `Document` object wrapper instead
+            # once other PR is merged.
+            if not self._is_abstract(document):
+                raise errors.InvalidDocumentFormat(
+                    detail=e.message, schema=e.schema,
+                    document_type=document['schema'])
+            else:
+                LOG.info('Skipping schema validation for abstract '
+                         'document: %s.' % document)
+
+    def _is_abstract(self, document):
+        try:
+            is_abstract = document['metadata']['layeringDefinition'][
+                'abstract'] == True
+            return is_abstract
+        # NOTE(fmontei): If the document is of ``document_schema`` type and
+        # no "layeringDefinition" or "abstract" property is found, then treat
+        # this as a validation error.
+        except KeyError:
+            doc_schema_type = self.SchemaType(document)
+            return doc_schema_type is v1_0.document_schema
+        return False
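The two-stage flow in `validate_all`/`_validate_one` above can be sketched with stdlib-only structural checks. The schema identifiers mirror the `schema_versions_info` registry; the error types and function name here are stand-ins, not Deckhand's real classes.

```python
def validate_one(document):
    # Stage 1: base validation -- every document needs the three top-level
    # building blocks: "schema", "metadata" and "data".
    for key in ('schema', 'metadata', 'data'):
        if key not in document:
            raise ValueError('InvalidDocumentFormat: missing %r' % key)
    # Stage 2: schema-specific validation, selected by stripping the version
    # suffix (e.g. "/v1.0") from document['schema'].
    registry = {'deckhand/CertificateKey', 'deckhand/Certificate',
                'deckhand/DataSchema', 'metadata/Document',
                'deckhand/LayeringPolicy', 'deckhand/Passphrase',
                'deckhand/ValidationPolicy'}
    identifier = '/'.join(document['schema'].split('/')[:-1])
    if identifier not in registry:
        raise LookupError('UnknownDocumentFormat: %s' % document['schema'])
    return identifier

doc = {'schema': 'deckhand/Certificate/v1.0',
       'metadata': {'schema': 'metadata/Document/v1.0', 'name': 'cert'},
       'data': 'certificate-blob'}
assert validate_one(doc) == 'deckhand/Certificate'
```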
deckhand/engine/schema/base_schema.py (new file, 36 lines)
@@ -0,0 +1,36 @@
+# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+schema = {
+    'type': 'object',
+    'properties': {
+        'schema': {
+            'type': 'string',
+            # Currently supported versions include v1 only.
+            'pattern': '^([A-Za-z]+\/[A-Za-z]+\/v[1]{1}\.[0]{1})$'
+        },
+        'metadata': {
+            'type': 'object',
+            'properties': {
+                'schema': {'type': 'string'},
+                'name': {'type': 'string'}
+            },
+            'additionalProperties': True,
+            'required': ['schema', 'name']
+        },
+        'data': {'type': ['string', 'object']}
+    },
+    'additionalProperties': False,
+    'required': ['schema', 'metadata', 'data']
+}
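The top-level `pattern` in the base schema above is a plain regular expression restricting the `schema` key to v1.0 identifiers. A quick stdlib check of what it accepts (the `\/` escapes in the schema are equivalent to `/` in Python regex syntax):

```python
import re

# Same pattern as in base_schema.schema, written as a raw string.
pattern = re.compile(r'^([A-Za-z]+/[A-Za-z]+/v[1]{1}\.[0]{1})$')

assert pattern.match('deckhand/Certificate/v1.0')
assert pattern.match('metadata/Document/v1.0')
assert not pattern.match('deckhand/Certificate/v2.0')  # only v1.0 is allowed
assert not pattern.match('no-slashes-v1.0')
```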
@@ -0,0 +1,25 @@
+# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from deckhand.engine.schema.v1_0 import certificate_key_schema
+from deckhand.engine.schema.v1_0 import certificate_schema
+from deckhand.engine.schema.v1_0 import data_schema
+from deckhand.engine.schema.v1_0 import document_schema
+from deckhand.engine.schema.v1_0 import layering_schema
+from deckhand.engine.schema.v1_0 import passphrase_schema
+from deckhand.engine.schema.v1_0 import validation_schema
+
+__all__ = ['certificate_key_schema', 'certificate_schema', 'data_schema',
+           'document_schema', 'layering_schema', 'passphrase_schema',
+           'validation_schema']
42
deckhand/engine/schema/v1_0/certificate_key_schema.py
Normal file
42
deckhand/engine/schema/v1_0/certificate_key_schema.py
Normal file
@@ -0,0 +1,42 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

schema = {
    'type': 'object',
    'properties': {
        'schema': {
            'type': 'string',
            'pattern': '^(deckhand/CertificateKey/v[1]{1}\.[0]{1})$'
        },
        'metadata': {
            'type': 'object',
            'properties': {
                'schema': {
                    'type': 'string',
                    'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$',
                },
                'name': {'type': 'string'},
                'storagePolicy': {
                    'type': 'string',
                    'pattern': '^(encrypted)$'
                }
            },
            'additionalProperties': False,
            'required': ['schema', 'name', 'storagePolicy']
        },
        'data': {'type': 'string'}
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
}
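The `schema` and `metadata.schema` fields above are gated by regular expressions rather than by JSON Schema type checks alone. A minimal stdlib-only sketch of what the CertificateKey pattern accepts (the helper name below is illustrative, not part of Deckhand):

```python
import re

# The 'schema' pattern copied from the certificate_key_schema above.
PATTERN = r'^(deckhand/CertificateKey/v[1]{1}\.[0]{1})$'

def is_certificate_key_schema(name):
    """Return True if ``name`` names the CertificateKey v1.0 schema."""
    return re.match(PATTERN, name) is not None

print(is_certificate_key_schema('deckhand/CertificateKey/v1.0'))  # True
print(is_certificate_key_schema('deckhand/CertificateKey/v2.0'))  # False
```

Note that `[1]{1}\.[0]{1}` pins the version to exactly `v1.0`, so bumping the schema version requires touching the pattern.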
42 deckhand/engine/schema/v1_0/certificate_schema.py Normal file
@@ -0,0 +1,42 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

schema = {
    'type': 'object',
    'properties': {
        'schema': {
            'type': 'string',
            'pattern': '^(deckhand/Certificate/v[1]{1}\.[0]{1})$'
        },
        'metadata': {
            'type': 'object',
            'properties': {
                'schema': {
                    'type': 'string',
                    'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$',
                },
                'name': {'type': 'string'},
                'storagePolicy': {
                    'type': 'string',
                    'pattern': '^(cleartext)$'
                }
            },
            'additionalProperties': False,
            'required': ['schema', 'name', 'storagePolicy']
        },
        'data': {'type': 'string'}
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
}
54 deckhand/engine/schema/v1_0/data_schema.py Normal file
@@ -0,0 +1,54 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# This specifies the official JSON schema meta-schema. DataSchema documents
# are used by various services to register new schemas that Deckhand can use
# for validation.
schema = {
    'type': 'object',
    'properties': {
        'schema': {
            'type': 'string',
            'pattern': '^(deckhand/DataSchema/v[1]{1}\.[0]{1})$'
        },
        'metadata': {
            'type': 'object',
            'properties': {
                'schema': {
                    'type': 'string',
                    'pattern': '^(metadata/Control/v[1]{1}\.[0]{1})$'
                },
                'name': {'type': 'string'},
                # Labels are optional.
                'labels': {
                    'type': 'object'
                }
            },
            'additionalProperties': False,
            'required': ['schema', 'name']
        },
        'data': {
            'type': 'object',
            'properties': {
                '$schema': {
                    'type': 'string'
                }
            },
            'additionalProperties': False,
            'required': ['$schema']
        }
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
}
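The pre-validation step ultimately checks candidate documents against structures like the one above. A stdlib-only sketch of just the top-level required-key check (the sample document and helper below are illustrative; the real validation runs the full JSON schema, not this shortcut):

```python
# A candidate DataSchema document shaped like the schema above requires.
sample = {
    'schema': 'deckhand/DataSchema/v1.0',
    'metadata': {
        'schema': 'metadata/Control/v1.0',
        'name': 'example/Node/v1.0',  # illustrative name
    },
    'data': {'$schema': 'http://json-schema.org/schema#'},
}

def missing_required(document):
    """Return the top-level keys required by the schema but absent."""
    return [key for key in ('schema', 'metadata', 'data')
            if key not in document]

print(missing_required(sample))        # []
print(missing_required({'data': {}}))  # ['schema', 'metadata']
```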
@@ -18,10 +18,10 @@ substitution_schema = {
         'dest': {
             'type': 'object',
             'properties': {
-                'path': {'type': 'string'}
+                'path': {'type': 'string'},
+                'pattern': {'type': 'string'}
             },
             'additionalProperties': False,
+            # 'pattern' is not required.
             'required': ['path']
         },
         'src': {
@@ -44,35 +44,32 @@ schema = {
     'properties': {
         'schema': {
             'type': 'string',
-            'pattern': '^(.*\/v[0-9]{1})$'
+            'pattern': '^([A-Za-z]+/[A-Za-z]+/v[1]{1}\.[0]{1})$'
         },
         'metadata': {
             'type': 'object',
             'properties': {
                 'schema': {
                     'type': 'string',
-                    'pattern': '^(.*/v[0-9]{1})$'
+                    'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$'
                 },
                 'name': {'type': 'string'},
                 'storagePolicy': {'type': 'string'},
-                'labels': {
-                    'type': 'object'
-                },
+                'labels': {'type': 'object'},
                 'layeringDefinition': {
                     'type': 'object',
                     'properties': {
                         'layer': {'type': 'string'},
                         'abstract': {'type': 'boolean'},
-                        'parentSelector': {
-                            'type': 'object'
-                        },
+                        # "parentSelector" is optional.
+                        'parentSelector': {'type': 'object'},
+                        # "actions" is optional.
                         'actions': {
                             'type': 'array',
                             'items': {
                                 'type': 'object',
                                 'properties': {
-                                    'method': {'enum': ['merge', 'delete',
-                                                        'replace']},
+                                    'method': {'enum': ['replace', 'delete',
+                                                        'merge']},
                                     'path': {'type': 'string'}
                                 },
                                 'additionalProperties': False,
@@ -81,16 +78,16 @@ schema = {
                             }
                         }
                     },
                     'additionalProperties': False,
-                    'required': ['layer', 'abstract', 'parentSelector']
+                    'required': ['layer', 'abstract']
                 },
+                # "substitutions" is optional.
                 'substitutions': {
                     'type': 'array',
                     'items': substitution_schema
                 }
             },
             'additionalProperties': False,
-            'required': ['schema', 'name', 'storagePolicy', 'labels',
-                         'layeringDefinition', 'substitutions']
+            'required': ['schema', 'name', 'layeringDefinition']
         },
         'data': {
             'type': 'object'
48 deckhand/engine/schema/v1_0/layering_schema.py Normal file
@@ -0,0 +1,48 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

schema = {
    'type': 'object',
    'properties': {
        'schema': {
            'type': 'string',
            'pattern': '^(deckhand/LayeringPolicy/v[1]{1}\.[0]{1})$'
        },
        'metadata': {
            'type': 'object',
            'properties': {
                'schema': {
                    'type': 'string',
                    'pattern': '^(metadata/Control/v[1]{1}\.[0]{1})$'
                },
                'name': {'type': 'string'}
            },
            'additionalProperties': False,
            'required': ['schema', 'name']
        },
        'data': {
            'type': 'object',
            'properties': {
                'layerOrder': {
                    'type': 'array',
                    'items': {'type': 'string'}
                }
            },
            'additionalProperties': True,
            'required': ['layerOrder']
        }
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
}
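The `layerOrder` list is the heart of a LayeringPolicy: it names the layers documents may belong to. As a sketch of how such a list can encode precedence, assuming (as an assumption, not something this commit implements) that entries run from most general to most specific, with later entries overriding earlier ones; the helper below is purely illustrative:

```python
# Illustrative layerOrder, most general layer first.
layer_order = ['global', 'region', 'site']

def higher_precedence(a, b):
    """True if layer ``a`` overrides layer ``b`` (later entries win)."""
    return layer_order.index(a) > layer_order.index(b)

print(higher_precedence('site', 'global'))  # True
print(higher_precedence('global', 'region'))  # False
```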
42 deckhand/engine/schema/v1_0/passphrase_schema.py Normal file
@@ -0,0 +1,42 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

schema = {
    'type': 'object',
    'properties': {
        'schema': {
            'type': 'string',
            'pattern': '^(deckhand/Passphrase/v[1]{1}\.[0]{1})$'
        },
        'metadata': {
            'type': 'object',
            'properties': {
                'schema': {
                    'type': 'string',
                    'pattern': '^(metadata/Document/v[1]{1}\.[0]{1})$',
                },
                'name': {'type': 'string'},
                'storagePolicy': {
                    'type': 'string',
                    'pattern': '^(encrypted)$'
                }
            },
            'additionalProperties': False,
            'required': ['schema', 'name', 'storagePolicy']
        },
        'data': {'type': 'string'}
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
}
60 deckhand/engine/schema/v1_0/validation_schema.py Normal file
@@ -0,0 +1,60 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

schema = {
    'type': 'object',
    'properties': {
        'schema': {
            'type': 'string',
            'pattern': '^(deckhand/ValidationPolicy/v[1]{1}\.[0]{1})$'
        },
        'metadata': {
            'type': 'object',
            'properties': {
                'schema': {
                    'type': 'string',
                    'pattern': '^(metadata/Control/v[1]{1}\.[0]{1})$'
                },
                'name': {'type': 'string'}
            },
            'additionalProperties': False,
            'required': ['schema', 'name']
        },
        'data': {
            'type': 'object',
            'properties': {
                'validations': {
                    'type': 'array',
                    'items': {
                        'type': 'object',
                        'properties': {
                            'name': {
                                'type': 'string',
                                'pattern': '^.*-(validation|verification)$'
                            },
                            # 'expiresAfter' is optional.
                            'expiresAfter': {'type': 'string'}
                        },
                        'additionalProperties': False,
                        'required': ['name']
                    }
                }
            },
            'additionalProperties': True,
            'required': ['validations']
        }
    },
    'additionalProperties': False,
    'required': ['schema', 'metadata', 'data']
}
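Each validation entry's `name` must end in `-validation` or `-verification`, enforced by the pattern above. A quick stdlib sketch of that constraint (helper name illustrative):

```python
import re

# The 'name' pattern copied from the validation_schema above.
NAME_PATTERN = r'^.*-(validation|verification)$'

def valid_validation_name(name):
    """Return True if ``name`` is an acceptable validation name."""
    return re.match(NAME_PATTERN, name) is not None

print(valid_validation_name('deckhand-schema-validation'))  # True
print(valid_validation_name('deckhand-schema'))             # False
```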
@@ -45,17 +45,23 @@ class DeckhandException(Exception):
         return self.args[0]


-class ApiError(Exception):
-    pass
+class InvalidDocumentFormat(DeckhandException):
+    msg_fmt = ("The provided YAML failed schema validation. Details: "
+               "%(detail)s. Schema: %(schema)s.")
+    alt_msg_fmt = ("The provided %(document_type)s YAML failed schema "
+                   "validation. Details: %(detail)s. Schema: %(schema)s.")
+
+    def __init__(self, document_type=None, **kwargs):
+        if document_type:
+            self.msg_fmt = self.alt_msg_fmt
+            kwargs.update({'document_type': document_type})
+        super(InvalidDocumentFormat, self).__init__(**kwargs)


-class InvalidFormat(ApiError):
-    """The YAML file is incorrectly formatted and cannot be read."""
-
-
-class DocumentExists(DeckhandException):
-    msg_fmt = ("Document with kind %(kind)s and schemaVersion "
-               "%(schema_version)s already exists.")
+class UnknownDocumentFormat(DeckhandException):
+    msg_fmt = ("Could not determine the validation schema to validate the "
+               "document type: %(document_type)s.")
+    code = 400


 class RevisionNotFound(DeckhandException):
115 deckhand/factories.py Normal file
@@ -0,0 +1,115 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import abc
import copy

from oslo_log import log as logging

from deckhand.tests import test_utils
from deckhand import types

LOG = logging.getLogger(__name__)


class DeckhandFactory(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def gen(self, *args):
        pass

    @abc.abstractmethod
    def gen_test(self, *args, **kwargs):
        pass


class ValidationPolicyFactory(DeckhandFactory):
    """Class for auto-generating validation policy templates for testing."""

    VALIDATION_POLICY_TEMPLATE = {
        "data": {
            "validations": []
        },
        "metadata": {
            "schema": "metadata/Control/v1",
            "name": ""
        },
        "schema": types.VALIDATION_POLICY_SCHEMA
    }

    def __init__(self):
        """Constructor for ``ValidationPolicyFactory``.

        Returns a template whose YAML representation is of the form::

            ---
            schema: deckhand/ValidationPolicy/v1
            metadata:
              schema: metadata/Control/v1
              name: site-deploy-ready
            data:
              validations:
                - name: deckhand-schema-validation
                - name: drydock-site-validation
                  expiresAfter: P1W
                - name: promenade-site-validation
                  expiresAfter: P1W
                - name: armada-deployability-validation
            ...
        """
        pass

    def gen(self, validation_type, status):
        if validation_type not in types.DECKHAND_VALIDATION_TYPES:
            raise ValueError("The validation type must be in %s."
                             % types.DECKHAND_VALIDATION_TYPES)

        validation_policy_template = copy.deepcopy(
            self.VALIDATION_POLICY_TEMPLATE)

        validation_policy_template['metadata'][
            'name'] = validation_type
        validation_policy_template['data']['validations'] = [
            {'name': validation_type, 'status': status}
        ]

        return validation_policy_template

    def gen_test(self, name=None, num_validations=None):
        """Generate the test document template.

        Generate the document template based on the arguments passed to
        the constructor and to this function.
        """
        if not name:
            name = test_utils.rand_name('validation-policy')
        if num_validations is None:
            num_validations = 3
        if not (isinstance(num_validations, int) and num_validations > 0):
            raise ValueError('The "num_validations" argument must be an '
                             'integer > 0.')

        validations = [
            test_utils.rand_name('validation-name')
            for _ in range(num_validations)]

        validation_policy_template = copy.deepcopy(
            self.VALIDATION_POLICY_TEMPLATE)
        validation_policy_template['metadata']['name'] = name
        validation_policy_template['data']['validations'] = validations

        return validation_policy_template
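`gen()` builds a policy by deep-copying the class-level template, so the shared constant is never mutated between tests. A trimmed, standalone sketch of that flow (the template below is reduced for illustration; the real one lives on `ValidationPolicyFactory`):

```python
import copy

# Trimmed stand-in for VALIDATION_POLICY_TEMPLATE (illustrative values).
TEMPLATE = {
    'schema': 'deckhand/ValidationPolicy/v1.0',
    'metadata': {'schema': 'metadata/Control/v1', 'name': ''},
    'data': {'validations': []},
}

def gen(validation_type, status):
    """Fill a deep copy of the template, leaving TEMPLATE untouched."""
    policy = copy.deepcopy(TEMPLATE)
    policy['metadata']['name'] = validation_type
    policy['data']['validations'] = [{'name': validation_type,
                                      'status': status}]
    return policy

policy = gen('deckhand-schema-validation', 'success')
```

Without the `deepcopy`, the second call would see the first call's mutations, which is exactly the class of bug the factory avoids.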
0 deckhand/tests/functional/__init__.py Normal file
37 deckhand/tests/functional/base.py Normal file
@@ -0,0 +1,37 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import mock

import falcon
from falcon import testing as falcon_testing

from deckhand.control import api
from deckhand import factories
from deckhand.tests.unit import base as test_base


class TestFunctionalBase(test_base.DeckhandWithDBTestCase,
                         falcon_testing.TestCase):
    """Base class for functional testing."""

    def setUp(self):
        super(TestFunctionalBase, self).setUp()
        self.app = falcon_testing.TestClient(api.start_api())
        self.validation_policy_factory = factories.ValidationPolicyFactory()

    @classmethod
    def setUpClass(cls):
        super(TestFunctionalBase, cls).setUpClass()
        mock.patch.object(api, '__setup_logging').start()
50 deckhand/tests/functional/test_documents.py Normal file
@@ -0,0 +1,50 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import yaml

import falcon

from deckhand.control import api
from deckhand.tests.functional import base as test_base
from deckhand import types


class TestDocumentsApi(test_base.TestFunctionalBase):

    def _read_test_resource(self, file_name):
        dir_path = os.path.dirname(os.path.realpath(__file__))
        test_yaml_path = os.path.abspath(os.path.join(
            dir_path, os.pardir, 'unit', 'resources', file_name + '.yaml'))

        with open(test_yaml_path, 'r') as yaml_file:
            yaml_data = yaml_file.read()
        return yaml_data

    def test_create_document(self):
        yaml_data = self._read_test_resource('sample_document')
        result = self.app.simulate_post('/api/v1.0/documents', body=yaml_data)
        self.assertEqual(falcon.HTTP_201, result.status)

        expected_documents = [yaml.safe_load(yaml_data)]
        expected_validation_policy = self.validation_policy_factory.gen(
            types.DECKHAND_SCHEMA_VALIDATION, status='success')

        # Validate that the correct number of documents were created: one
        # document corresponding to ``yaml_data``.
        resp_documents = [d for d in yaml.safe_load_all(result.text)]
        self.assertIsInstance(resp_documents, list)
        self.assertEqual(1, len(resp_documents))
        self.assertIn('revision_id', resp_documents[0])
@@ -14,21 +14,20 @@
 import mock

-import testtools
-
 from deckhand.control import api
 from deckhand.control import base as api_base
 from deckhand.control import documents
 from deckhand.control import revision_documents
 from deckhand.control import revisions
 from deckhand.control import secrets
+from deckhand.tests.unit import base as test_base


-class TestApi(testtools.TestCase):
+class TestApi(test_base.DeckhandTestCase):

     def setUp(self):
         super(TestApi, self).setUp()
-        for resource in (documents, revisions, revision_documents, secrets):
+        for resource in (documents, revision_documents, revisions, secrets):
             resource_name = resource.__name__.split('.')[-1]
             resource_obj = mock.patch.object(
                 resource, '%sResource' % resource_name.title().replace(
@@ -14,12 +14,11 @@
 import mock

-import testtools
-
 from deckhand.control import base as api_base
+from deckhand.tests.unit import base as test_base


-class TestBaseResource(testtools.TestCase):
+class TestBaseResource(test_base.DeckhandTestCase):

     def setUp(self):
         super(TestBaseResource, self).setUp()
@@ -23,7 +23,7 @@ BASE_EXPECTED_FIELDS = ("created_at", "updated_at", "deleted_at", "deleted")
 DOCUMENT_EXPECTED_FIELDS = BASE_EXPECTED_FIELDS + (
     "id", "schema", "name", "metadata", "data", "revision_id")
 REVISION_EXPECTED_FIELDS = BASE_EXPECTED_FIELDS + (
-    "id", "child_id", "parent_id", "documents")
+    "id", "documents", "validation_policies")


 class DocumentFixture(object):
@@ -48,19 +48,24 @@ class DocumentFixture(object):

     @staticmethod
     def get_minimal_multi_fixture(count=2, **kwargs):
         return [DocumentFixture.get_minimal_fixture(**kwargs)
                 for _ in range(count)]


 class TestDbBase(base.DeckhandWithDBTestCase):

-    def _create_documents(self, payload):
-        if not isinstance(payload, list):
-            payload = [payload]
-
-        docs = db_api.documents_create(payload)
+    def _create_documents(self, documents, validation_policies=None):
+        if not validation_policies:
+            validation_policies = []
+
+        if not isinstance(documents, list):
+            documents = [documents]
+        if not isinstance(validation_policies, list):
+            validation_policies = [validation_policies]
+
+        docs = db_api.documents_create(documents, validation_policies)
         for idx, doc in enumerate(docs):
-            self._validate_document(expected=payload[idx], actual=doc)
+            self._validate_document(expected=documents[idx], actual=doc)
         return docs

     def _get_document(self, **fields):
@@ -24,9 +24,8 @@ class TestDocuments(base.TestDbBase):
         self.assertIsInstance(documents, list)
         self.assertEqual(1, len(documents))

-        for document in documents:
-            retrieved_document = self._get_document(id=document['id'])
-            self.assertEqual(document, retrieved_document)
+        retrieved_document = self._get_document(id=documents[0]['id'])
+        self.assertEqual(documents[0], retrieved_document)

     def test_create_document_again_with_no_changes(self):
         payload = base.DocumentFixture.get_minimal_fixture()
@@ -13,16 +13,32 @@
 # limitations under the License.

 from deckhand.tests.unit.db import base
+from deckhand import factories
+from deckhand import types


-class TestRevisionViews(base.TestDbBase):
+class TestRevisions(base.TestDbBase):

     def test_list(self):
-        payload = [base.DocumentFixture.get_minimal_fixture()
-                   for _ in range(4)]
-        self._create_documents(payload)
+        documents = [base.DocumentFixture.get_minimal_fixture()
+                     for _ in range(4)]
+        self._create_documents(documents)

         revisions = self._list_revisions()
         self.assertIsInstance(revisions, list)
         self.assertEqual(1, len(revisions))
         self.assertEqual(4, len(revisions[0]['documents']))
+
+    def test_list_with_validation_policies(self):
+        documents = [base.DocumentFixture.get_minimal_fixture()
+                     for _ in range(4)]
+        vp_factory = factories.ValidationPolicyFactory()
+        validation_policy = vp_factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
+                                           'success')
+        self._create_documents(documents, [validation_policy])
+
+        revisions = self._list_revisions()
+        self.assertIsInstance(revisions, list)
+        self.assertEqual(1, len(revisions))
+        self.assertEqual(4, len(revisions[0]['documents']))
+        self.assertEqual(1, len(revisions[0]['validation_policies']))
89 deckhand/tests/unit/engine/base.py Normal file
@@ -0,0 +1,89 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import copy
import os
import yaml

import mock
import six

from deckhand.engine import document_validation
from deckhand import errors
from deckhand.tests.unit import base as test_base


class TestDocumentValidationBase(test_base.DeckhandTestCase):

    def _read_data(self, file_name):
        dir_path = os.path.dirname(os.path.realpath(__file__))
        test_yaml_path = os.path.abspath(os.path.join(
            dir_path, os.pardir, 'resources', file_name + '.yaml'))

        with open(test_yaml_path, 'r') as yaml_file:
            yaml_data = yaml_file.read()
        self.data = yaml.safe_load(yaml_data)

    def _corrupt_data(self, key, value=None, data=None, op='delete'):
        """Corrupt test data to check that pre-validation works.

        Corrupt data by removing a key from the document (if ``op`` is
        'delete') or by replacing the value corresponding to the key with
        ``value`` (if ``op`` is 'replace').

        :param key: The document key to be removed. The key can have the
            following formats:
                * 'data' => document.pop('data')
                * 'metadata.name' => document['metadata'].pop('name')
                * 'metadata.substitutions.0.dest' =>
                  document['metadata']['substitutions'][0].pop('dest')
        :type key: string
        :param value: The new value that corresponds to the (nested) document
            key (only used if ``op`` is 'replace').
        :type value: string
        :param data: The data to "corrupt".
        :type data: dict
        :param op: Controls whether data is deleted (if 'delete') or is
            replaced with ``value`` (if 'replace').
        :type op: string
        :returns: Corrupted data.
        """
        if data is None:
            data = self.data
        if op not in ('delete', 'replace'):
            raise ValueError("The ``op`` argument must either be 'delete' or "
                             "'replace'.")
        corrupted_data = copy.deepcopy(data)

        if '.' in key:
            _corrupted_data = corrupted_data
            nested_keys = key.split('.')
            for nested_key in nested_keys:
                if nested_key == nested_keys[-1]:
                    break
                if nested_key.isdigit():
                    _corrupted_data = _corrupted_data[int(nested_key)]
                else:
                    _corrupted_data = _corrupted_data[nested_key]
            if op == 'delete':
                _corrupted_data.pop(nested_keys[-1])
            elif op == 'replace':
                _corrupted_data[nested_keys[-1]] = value
        else:
            if op == 'delete':
                corrupted_data.pop(key)
            elif op == 'replace':
                corrupted_data[key] = value

        return corrupted_data
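The dotted-key traversal in `_corrupt_data` walks all segments except the last, indexing lists for numeric segments, then deletes or replaces at the final segment of a deep copy. The same idea as a standalone sketch (helper name and sample document are illustrative):

```python
import copy

def corrupt(data, key, value=None, op='delete'):
    """Delete or replace a dotted-path key on a deep copy of ``data``."""
    result = copy.deepcopy(data)  # never mutate the caller's document
    node = result
    parts = key.split('.')
    for part in parts[:-1]:
        # Numeric segments index into lists, e.g. 'substitutions.0'.
        node = node[int(part)] if part.isdigit() else node[part]
    if op == 'delete':
        node.pop(parts[-1])
    else:
        node[parts[-1]] = value
    return result

doc = {'metadata': {'substitutions': [{'dest': {'path': '.'}}],
                    'name': 'example'}}
bad = corrupt(doc, 'metadata.substitutions.0.dest')
```

Working on a deep copy lets one base document be corrupted many different ways across test cases without cross-contamination.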
@ -12,112 +12,54 @@
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import copy
|
||||
import os
|
||||
import testtools
|
||||
import yaml
|
||||
|
||||
import mock
|
||||
import six
|
||||
|
||||
from deckhand.engine import document_validation
|
||||
 from deckhand import errors
+from deckhand.tests.unit.engine import base as engine_test_base


-class TestDocumentValidation(testtools.TestCase):
+class TestDocumentValidation(engine_test_base.TestDocumentValidationBase):

     def setUp(self):
         super(TestDocumentValidation, self).setUp()
-        dir_path = os.path.dirname(os.path.realpath(__file__))
-        test_yaml_path = os.path.abspath(os.path.join(
-            dir_path, os.pardir, 'resources', 'sample.yaml'))
-
-        with open(test_yaml_path, 'r') as yaml_file:
-            yaml_data = yaml_file.read()
-        self.data = yaml.safe_load(yaml_data)
-
-    def _corrupt_data(self, key, data=None):
-        """Corrupt test data to check that pre-validation works.
-
-        Corrupt data by removing a key from the document. Each key must
-        correspond to a value that is a dictionary.
-
-        :param key: The document key to be removed. The key can have the
-            following formats:
-                * 'data' => document.pop('data')
-                * 'metadata.name' => document['metadata'].pop('name')
-                * 'metadata.substitutions.0.dest' =>
-                  document['metadata']['substitutions'][0].pop('dest')
-        :returns: Corrupted data.
-        """
-        if data is None:
-            data = self.data
-        corrupted_data = copy.deepcopy(data)
-
-        if '.' in key:
-            _corrupted_data = corrupted_data
-            nested_keys = key.split('.')
-            for nested_key in nested_keys:
-                if nested_key == nested_keys[-1]:
-                    break
-                if nested_key.isdigit():
-                    _corrupted_data = _corrupted_data[int(nested_key)]
-                else:
-                    _corrupted_data = _corrupted_data[nested_key]
-            _corrupted_data.pop(nested_keys[-1])
-        else:
-            corrupted_data.pop(key)
-
-        return corrupted_data
+    def test_init_document_validation(self):
+        self._read_data('sample_document')
+        doc_validation = document_validation.DocumentValidation(
+            self.data)
+        self.assertIsInstance(doc_validation,
+                              document_validation.DocumentValidation)
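The removed helper's nested-key traversal (since moved into the shared test base class) can be sketched as a standalone function. This is an illustrative `corrupt_data`, not part of Deckhand itself:

```python
import copy


def corrupt_data(document, key):
    """Remove a possibly nested key from a copy of ``document``.

    'metadata.substitutions.0.dest' removes
    document['metadata']['substitutions'][0]['dest'].
    """
    corrupted = copy.deepcopy(document)
    parts = key.split('.')
    parents, leaf = parts[:-1], parts[-1]
    target = corrupted
    for part in parents:
        # Numeric segments index into lists; all others into dicts.
        target = target[int(part)] if part.isdigit() else target[part]
    target.pop(leaf)
    return corrupted


doc = {'metadata': {'substitutions': [{'dest': {'path': '.x'}, 'src': {}}]}}
corrupted = corrupt_data(doc, 'metadata.substitutions.0.dest')
assert 'dest' not in corrupted['metadata']['substitutions'][0]
assert 'dest' in doc['metadata']['substitutions'][0]  # original untouched
```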

-    def test_initialization(self):
-        doc_validation = document_validation.DocumentValidation(self.data)
-        doc_validation.pre_validate()  # Should not raise any errors.
-
-    def test_initialization_missing_sections(self):
-        expected_err = ("The provided YAML file is invalid. Exception: '%s' "
-                        "is a required property.")
-        invalid_data = [
-            (self._corrupt_data('data'), 'data'),
-            (self._corrupt_data('metadata.schema'), 'schema'),
-            (self._corrupt_data('metadata.name'), 'name'),
-            (self._corrupt_data('metadata.substitutions'), 'substitutions'),
-            (self._corrupt_data('metadata.substitutions.0.dest'), 'dest'),
-            (self._corrupt_data('metadata.substitutions.0.src'), 'src')
-        ]
-
-        for invalid_entry, missing_key in invalid_data:
-            with six.assertRaisesRegex(self, errors.InvalidFormat,
-                                       expected_err % missing_key):
-                doc_validation = document_validation.DocumentValidation(
-                    invalid_entry)
-                doc_validation.pre_validate()
+    def test_data_schema_missing_optional_sections(self):
+        self._read_data('sample_data_schema')
+        optional_missing_data = [
+            self._corrupt_data('metadata.labels'),
+        ]
+
+        for missing_data in optional_missing_data:
+            document_validation.DocumentValidation(missing_data).validate_all()

-    def test_initialization_missing_abstract_section(self):
-        expected_err = ("Could not find 'abstract' property from document.")
-        invalid_data = [
-            self._corrupt_data('metadata'),
-            self._corrupt_data('metadata.layeringDefinition'),
-            self._corrupt_data('metadata.layeringDefinition.abstract'),
-        ]
-
-        for invalid_entry in invalid_data:
-            with six.assertRaisesRegex(self, errors.InvalidFormat,
-                                       expected_err):
-                doc_validation = document_validation.DocumentValidation(
-                    invalid_entry)
-                doc_validation.pre_validate()
+    def test_document_missing_optional_sections(self):
+        self._read_data('sample_document')
+        properties_to_remove = (
+            'metadata.layeringDefinition.actions',
+            'metadata.layeringDefinition.parentSelector',
+            'metadata.substitutions',
+            'metadata.substitutions.2.dest.pattern')
+
+        for property_to_remove in properties_to_remove:
+            optional_data_removed = self._corrupt_data(property_to_remove)
+            document_validation.DocumentValidation(
+                optional_data_removed).validate_all()

     @mock.patch.object(document_validation, 'LOG', autospec=True)
-    def test_initialization_with_abstract_document(self, mock_log):
-        abstract_data = copy.deepcopy(self.data)
-
-        for true_val in (True, 'true', 'True'):
-            abstract_data['metadata']['layeringDefinition']['abstract'] = True
-
-            doc_validation = document_validation.DocumentValidation(
-                abstract_data)
-            doc_validation.pre_validate()
-            mock_log.info.assert_called_once_with(
-                "Skipping validation for the document because it is abstract")
-            mock_log.info.reset_mock()
+    def test_abstract_document_not_validated(self, mock_log):
+        self._read_data('sample_document')
+        # Set the document to abstract.
+        updated_data = self._corrupt_data(
+            'metadata.layeringDefinition.abstract', True, op='replace')
+        # Guarantee that a validation error is thrown by removing a required
+        # property.
+        del updated_data['metadata']['layeringDefinition']['layer']
+
+        document_validation.DocumentValidation(updated_data).validate_all()
+        self.assertTrue(mock_log.info.called)
+        self.assertIn("Skipping schema validation for abstract document",
+                      mock_log.info.mock_calls[0][1][0])
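The abstract-document skip exercised above hinges on reading `metadata.layeringDefinition.abstract`; a minimal sketch of that check (assumed semantics only — the real `_is_abstract` lives in `document_validation` and may differ):

```python
def is_abstract(document):
    """Return True if the document is marked abstract.

    The earlier tests exercised True, 'true' and 'True' alike, so this
    sketch accepts all three spellings.
    """
    abstract = document.get('metadata', {}).get(
        'layeringDefinition', {}).get('abstract', False)
    return abstract is True or str(abstract).lower() == 'true'


assert is_abstract({'metadata': {'layeringDefinition': {'abstract': True}}})
assert is_abstract({'metadata': {'layeringDefinition': {'abstract': 'True'}}})
assert not is_abstract({'metadata': {'layeringDefinition': {'abstract': False}}})
assert not is_abstract({'metadata': {}})
```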

deckhand/tests/unit/engine/test_document_validation_negative.py (new file, 115 lines)
@@ -0,0 +1,115 @@
# Copyright 2017 AT&T Intellectual Property. All other rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from deckhand.engine import document_validation
from deckhand import errors
from deckhand.tests.unit.engine import base as engine_test_base


class TestDocumentValidationNegative(
        engine_test_base.TestDocumentValidationBase):
    """Negative testing suite for document validation."""

    BASIC_ATTRS = (
        'schema', 'metadata', 'data', 'metadata.schema', 'metadata.name')
    SCHEMA_ERR = ("The provided YAML failed schema validation. "
                  "Details: '%s' is a required property.")
    SCHEMA_ERR_ALT = ("The provided %s YAML failed schema validation. "
                      "Details: '%s' is a required property.")

    def _test_missing_required_sections(self, properties_to_remove):
        for idx, property_to_remove in enumerate(properties_to_remove):
            missing_prop = property_to_remove.split('.')[-1]
            invalid_data = self._corrupt_data(property_to_remove)

            if property_to_remove in self.BASIC_ATTRS:
                expected_err = self.SCHEMA_ERR % missing_prop
            else:
                expected_err = self.SCHEMA_ERR_ALT % (
                    self.data['schema'], missing_prop)

            # NOTE(fmontei): '$' must be escaped for regex to pass.
            expected_err = expected_err.replace('$', '\$')

            with self.assertRaisesRegex(errors.InvalidDocumentFormat,
                                        expected_err):
                document_validation.DocumentValidation(
                    invalid_data).validate_all()

    def test_certificate_key_missing_required_sections(self):
        self._read_data('sample_certificate_key')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_certificate_missing_required_sections(self):
        self._read_data('sample_certificate')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_data_schema_missing_required_sections(self):
        self._read_data('sample_data_schema')
        properties_to_remove = self.BASIC_ATTRS + ('data.$schema',)
        self._test_missing_required_sections(properties_to_remove)

    def test_document_missing_required_sections(self):
        self._read_data('sample_document')
        properties_to_remove = self.BASIC_ATTRS + (
            'metadata.layeringDefinition',
            'metadata.layeringDefinition.abstract',
            'metadata.layeringDefinition.layer',
            'metadata.layeringDefinition.actions.0.method',
            'metadata.layeringDefinition.actions.0.path',
            'metadata.substitutions.0.dest',
            'metadata.substitutions.0.dest.path',
            'metadata.substitutions.0.src',
            'metadata.substitutions.0.src.schema',
            'metadata.substitutions.0.src.name',
            'metadata.substitutions.0.src.path')
        self._test_missing_required_sections(properties_to_remove)

    def test_document_invalid_layering_definition_action(self):
        self._read_data('sample_document')
        updated_data = self._corrupt_data(
            'metadata.layeringDefinition.actions.0.action', 'invalid',
            op='replace')
        self._test_missing_required_sections(updated_data)

    def test_layering_policy_missing_required_sections(self):
        self._read_data('sample_layering_policy')
        properties_to_remove = self.BASIC_ATTRS + ('data.layerOrder',)
        self._test_missing_required_sections(properties_to_remove)

    def test_passphrase_missing_required_sections(self):
        self._read_data('sample_passphrase')
        properties_to_remove = self.BASIC_ATTRS + ('metadata.storagePolicy',)
        self._test_missing_required_sections(properties_to_remove)

    def test_passphrase_with_incorrect_storage_policy(self):
        self._read_data('sample_passphrase')
        expected_err = (
            "The provided deckhand/Passphrase/v1.0 YAML failed schema "
            "validation. Details: 'cleartext' does not match '^(encrypted)$'")
        wrong_data = self._corrupt_data('metadata.storagePolicy', 'cleartext',
                                        op='replace')

        doc_validation = document_validation.DocumentValidation(wrong_data)
        e = self.assertRaises(errors.InvalidDocumentFormat,
                              doc_validation.validate_all)
        self.assertIn(expected_err, str(e))

    def test_validation_policy_missing_required_sections(self):
        self._read_data('sample_validation_policy')
        properties_to_remove = self.BASIC_ATTRS + (
            'data.validations', 'data.validations.0.name')
        self._test_missing_required_sections(properties_to_remove)
@@ -1,38 +0,0 @@
---
schema: some-service/ResourceType/v1
metadata:
  schema: metadata/Document/v1
  name: unique-name-given-schema
  storagePolicy: cleartext
  labels:
    genesis: enabled
    master: enabled
  layeringDefinition:
    abstract: false
    layer: region
    parentSelector:
      required_key_a: required_label_a
      required_key_b: required_label_b
    actions:
      - method: merge
        path: .path.to.merge.into.parent
      - method: delete
        path: .path.to.delete
  substitutions:
    - dest:
        path: .substitution.target
      src:
        schema: another-service/SourceType/v1
        name: name-of-source-document
        path: .source.path
data:
  path:
    to:
      merge:
        into:
          parent:
            foo: bar
  ignored:  # Will not be part of the resultant document after layering.
    data: here
  substitution:
    target: null  # Paths do not need to exist to be specified as substitution destinations.
deckhand/tests/unit/resources/sample_certificate.yaml (new file, 13 lines)
@@ -0,0 +1,13 @@
---
schema: deckhand/Certificate/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: application-api
  storagePolicy: cleartext
data: |-
  -----BEGIN CERTIFICATE-----
  MIIDYDCCAkigAwIBAgIUKG41PW4VtiphzASAMY4/3hL8OtAwDQYJKoZIhvcNAQEL
  ...snip...
  P3WT9CfFARnsw2nKjnglQcwKkKLYip0WY2wh3FE7nrQZP6xKNaSRlh6p2pCGwwwH
  HkvVwA==
  -----END CERTIFICATE-----
deckhand/tests/unit/resources/sample_certificate_key.yaml (new file, 12 lines)
@@ -0,0 +1,12 @@
---
schema: deckhand/CertificateKey/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: application-api
  storagePolicy: encrypted
data: |-
  -----BEGIN RSA PRIVATE KEY-----
  MIIEpQIBAAKCAQEAx+m1+ao7uTVEs+I/Sie9YsXL0B9mOXFlzEdHX8P8x4nx78/T
  ...snip...
  Zf3ykIG8l71pIs4TGsPlnyeO6LzCWP5WRSh+BHnyXXjzx/uxMOpQ/6I=
  -----END RSA PRIVATE KEY-----
deckhand/tests/unit/resources/sample_data_schema.yaml (new file, 9 lines)
@@ -0,0 +1,9 @@
---
schema: deckhand/DataSchema/v1.0  # This specifies the official JSON schema meta-schema.
metadata:
  schema: metadata/Control/v1.0
  name: promenade/Node/v1.0  # Specifies the documents to be used for validation.
  labels:
    application: promenade
data:  # Valid JSON Schema is expected here.
  $schema: http://blah
deckhand/tests/unit/resources/sample_document.yaml (new file, 46 lines)
@@ -0,0 +1,46 @@
# Sample YAML file for testing forward replacement.
---
schema: promenade/ResourceType/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: a-unique-config-name-12345
  labels:
    component: apiserver
    hostname: server0
  layeringDefinition:
    layer: global
    abstract: False
    parentSelector:
      required_key_a: required_label_a
      required_key_b: required_label_b
    actions:
      - method: merge
        path: .path.to.merge.into.parent
      - method: delete
        path: .path.to.delete
  substitutions:
    - dest:
        path: .chart.values.tls.certificate
      src:
        schema: deckhand/Certificate/v1.0
        name: example-cert
        path: .
    - dest:
        path: .chart.values.tls.key
      src:
        schema: deckhand/CertificateKey/v1.0
        name: example-key
        path: .
    - dest:
        path: .chart.values.some_url
        pattern: INSERT_[A-Z]+_HERE
      src:
        schema: deckhand/Passphrase/v1.0
        name: example-password
        path: .
data:
  chart:
    details:
      data: here
    values:
      some_url: http://admin:INSERT_PASSWORD_HERE@service-name:8080/v1
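The third substitution above pairs a `dest.pattern` with a Passphrase source; a plausible reading is that only the text matching the pattern is replaced inside the destination value. A sketch under that assumption, using a stand-in secret:

```python
import re

url = 'http://admin:INSERT_PASSWORD_HERE@service-name:8080/v1'
# Stand-in for the data of the 'example-password' Passphrase document.
secret = 'some-password'

# Replace only the portion of the destination value matching the pattern.
resolved = re.sub(r'INSERT_[A-Z]+_HERE', secret, url)
assert resolved == 'http://admin:some-password@service-name:8080/v1'
```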
deckhand/tests/unit/resources/sample_layering_policy.yaml (new file, 13 lines)
@@ -0,0 +1,13 @@
# Sample layering policy.
---
schema: deckhand/LayeringPolicy/v1.0
metadata:
  schema: metadata/Control/v1
  name: a-unique-config-name-12345
data:
  layerOrder:
    - global
    - global-network
    - global-storage
    - region
    - site
deckhand/tests/unit/resources/sample_passphrase.yaml (new file, 7 lines)
@@ -0,0 +1,7 @@
---
schema: deckhand/Passphrase/v1.0
metadata:
  schema: metadata/Document/v1.0
  name: application-admin-password
  storagePolicy: encrypted
data: some-password
deckhand/tests/unit/resources/sample_validation_policy.yaml (new file, 14 lines)
@@ -0,0 +1,14 @@
# Sample post-validation policy document.
---
schema: deckhand/ValidationPolicy/v1.0
metadata:
  schema: metadata/Control/v1.0
  name: later-validation
data:
  validations:
    - name: deckhand-schema-validation
    - name: drydock-site-validation
      expiresAfter: P1W
    - name: promenade-site-validation
      expiresAfter: P1W
    - name: armada-deployability-validation

@@ -13,8 +13,10 @@
 # limitations under the License.

 from deckhand.control.views import revision
+from deckhand import factories
 from deckhand.tests.unit.db import base
 from deckhand.tests import test_utils
+from deckhand import types


 class TestRevisionViews(base.TestDbBase):
@@ -22,15 +24,16 @@ class TestRevisionViews(base.TestDbBase):
     def setUp(self):
         super(TestRevisionViews, self).setUp()
         self.view_builder = revision.ViewBuilder()
+        self.factory = factories.ValidationPolicyFactory()

-    def test_list_revisions(self):
+    def test_list_revisions_with_multiple_documents(self):
         payload = [base.DocumentFixture.get_minimal_fixture()
                    for _ in range(4)]
         self._create_documents(payload)
         revisions = self._list_revisions()
         revisions_view = self.view_builder.list(revisions)

-        expected_attrs = ('next', 'prev', 'results', 'count')
+        expected_attrs = ('results', 'count')
         for attr in expected_attrs:
             self.assertIn(attr, revisions_view)
         # Validate that only 1 revision was returned.
@@ -40,7 +43,7 @@ class TestRevisionViews(base.TestDbBase):
         self.assertIn('count', revisions_view['results'][0])
         self.assertEqual(4, revisions_view['results'][0]['count'])

-    def test_list_many_revisions(self):
+    def test_list_multiple_revisions(self):
         docs_count = []
         for _ in range(3):
             doc_count = test_utils.rand_int(3, 9)
@@ -52,7 +55,7 @@ class TestRevisionViews(base.TestDbBase):
         revisions = self._list_revisions()
         revisions_view = self.view_builder.list(revisions)

-        expected_attrs = ('next', 'prev', 'results', 'count')
+        expected_attrs = ('results', 'count')
         for attr in expected_attrs:
             self.assertIn(attr, revisions_view)
         # Validate that only 1 revision was returned.
@@ -69,10 +72,79 @@ class TestRevisionViews(base.TestDbBase):
         payload = [base.DocumentFixture.get_minimal_fixture()
                    for _ in range(4)]
         documents = self._create_documents(payload)

+        # Validate that each document points to the same revision.
+        revision_ids = set([d['revision_id'] for d in documents])
+        self.assertEqual(1, len(revision_ids))
+
         revision = self._get_revision(documents[0]['revision_id'])
         revision_view = self.view_builder.show(revision)

-        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies')
+        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies',
+                          'status')
         for attr in expected_attrs:
             self.assertIn(attr, revision_view)

+        self.assertIsInstance(revision_view['validationPolicies'], list)
+        self.assertEqual(revision_view['validationPolicies'], [])
+
+    def test_show_revision_successful_validation_policy(self):
+        # Simulate 4 document payload with an internally generated validation
+        # policy which executes 'deckhand-schema-validation'.
+        payload = [base.DocumentFixture.get_minimal_fixture()
+                   for _ in range(4)]
+        validation_policy = self.factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
+                                             status='success')
+        payload.append(validation_policy)
+        documents = self._create_documents(payload)
+
+        revision = self._get_revision(documents[0]['revision_id'])
+        revision_view = self.view_builder.show(revision)
+
+        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies',
+                          'status')
+        expected_validation_policies = [
+            {'name': 'deckhand-schema-validation'}, 'status'
+        ]
+
+        for attr in expected_attrs:
+            self.assertIn(attr, revision_view)
+
+        self.assertEqual('success', revision_view['status'])
+        self.assertIsInstance(revision_view['validationPolicies'], list)
+        self.assertEqual(1, len(revision_view['validationPolicies']))
+        self.assertEqual(revision_view['validationPolicies'][0]['name'],
+                         'deckhand-schema-validation')
+        self.assertEqual(revision_view['validationPolicies'][0]['status'],
+                         'success')
+
+    def test_show_revision_failed_validation_policy(self):
+        # Simulate 4 document payload with an internally generated validation
+        # policy which executes 'deckhand-schema-validation'.
+        payload = [base.DocumentFixture.get_minimal_fixture()
+                   for _ in range(4)]
+        validation_policy = self.factory.gen(types.DECKHAND_SCHEMA_VALIDATION,
+                                             status='failed')
+        payload.append(validation_policy)
+        documents = self._create_documents(payload)
+
+        revision = self._get_revision(documents[0]['revision_id'])
+        revision_view = self.view_builder.show(revision)
+
+        expected_attrs = ('id', 'url', 'createdAt', 'validationPolicies',
+                          'status')
+        expected_validation_policies = [
+            {'name': 'deckhand-schema-validation'}, 'status'
+        ]
+
+        for attr in expected_attrs:
+            self.assertIn(attr, revision_view)
+
+        self.assertEqual('failed', revision_view['status'])
+        self.assertIsInstance(revision_view['validationPolicies'], list)
+        self.assertEqual(1, len(revision_view['validationPolicies']))
+        self.assertEqual(revision_view['validationPolicies'][0]['name'],
+                         'deckhand-schema-validation')
+        self.assertEqual(revision_view['validationPolicies'][0]['status'],
+                         'failed')
@@ -11,3 +11,17 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+
+DOCUMENT_SCHEMA_TYPES = (
+    LAYERING_POLICY_SCHEMA,
+    VALIDATION_POLICY_SCHEMA,
+) = (
+    'deckhand/LayeringPolicy/v1',
+    'deckhand/ValidationPolicy/v1',
+)
+
+DECKHAND_VALIDATION_TYPES = (
+    DECKHAND_SCHEMA_VALIDATION,
+) = (
+    'deckhand-schema-validation',
+)
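The constants above use Python's chained-assignment idiom: the inner tuple names each value individually while the outer name keeps them grouped for membership checks. A compact illustration with made-up names:

```python
# Both FRUITS and the (APPLE, PEAR) tuple are bound to the same values;
# FRUITS groups them, APPLE and PEAR name them individually.
FRUITS = (APPLE, PEAR) = ('apple', 'pear')

assert APPLE == 'apple'
assert PEAR in FRUITS
assert FRUITS == ('apple', 'pear')
```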

tox.ini (9 lines added)
@@ -29,6 +29,15 @@ commands =
     {[testenv]commands}
     ostestr '{posargs}'

+[testenv:functional]
+usedevelop = True
+setenv = VIRTUAL_ENV={envdir}
+         OS_TEST_PATH=./deckhand/tests/functional
+         LANGUAGE=en_US
+commands =
+  find . -type f -name "*.pyc" -delete
+  ostestr '{posargs}'
+
 [testenv:genconfig]
 commands = oslo-config-generator --config-file=etc/deckhand/config-generator.conf