Make hacking a flake8 plugin.

parent db1d1959af
commit 4e90f08824
30 .gitignore (vendored) Normal file
@@ -0,0 +1,30 @@
# Compiled files
*.py[co]
*.a
*.o
*.so

# Sphinx
_build

# Packages/installer info
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg

# Other
.testrepository
.tox
.*.swp
.coverage
cover
AUTHORS
ChangeLog
4 .gitreview Normal file
@@ -0,0 +1,4 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack-dev/hacking.git
4 .testr.conf Normal file
@@ -0,0 +1,4 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=1 OS_STDERR_CAPTURE=1 OS_TEST_TIMEOUT=60 ${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
17 CONTRIBUTING.rst Normal file
@@ -0,0 +1,17 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps in the "If you're a developer, start here"
section of this page:

   http://wiki.openstack.org/HowToContribute

Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:

   http://wiki.openstack.org/GerritWorkflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/hacking
176 LICENSE Normal file
@@ -0,0 +1,176 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.
7 MANIFEST.in Normal file
@@ -0,0 +1,7 @@
include AUTHORS
include ChangeLog
include README.rst
exclude .gitignore
exclude .gitreview

global-exclude *.pyc
304 README.rst Normal file
@@ -0,0 +1,304 @@
Introduction
============

hacking is a set of flake8 plugins that test and enforce:

OpenStack Style Commandments
============================

- Step 1: Read http://www.python.org/dev/peps/pep-0008/
- Step 2: Read http://www.python.org/dev/peps/pep-0008/ again
- Step 3: Read on

General
-------
- Put two newlines between top-level code (funcs, classes, etc.)
- Use only UNIX style newlines ("\n"), not Windows style ("\r\n")
- Put one newline between methods in classes and anywhere else
- Long lines should be wrapped in parentheses
  in preference to using a backslash for line continuation.
- Do not write "except:", use "except Exception:" at the very least
- Include your name with TODOs as in "#TODO(termie)"
- Do not shadow a built-in or reserved word. Example::

    def list():
        return [1, 2, 3]

    mylist = list()  # BAD, shadows `list` built-in

    class Foo(object):
        def list(self):
            return [1, 2, 3]

    mylist = Foo().list()  # OKAY, does not shadow built-in

- Use the "is not" operator when testing for unequal identities. Example::

    if not X is Y:  # BAD, intended behavior is ambiguous
        pass

    if X is not Y:  # OKAY, intuitive
        pass

- Use the "not in" operator for evaluating membership in a collection. Example::

    if not X in Y:  # BAD, intended behavior is ambiguous
        pass

    if X not in Y:  # OKAY, intuitive
        pass

    if not (X in Y or X in Z):  # OKAY, still better than all those 'not's
        pass

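The parentheses-over-backslash rule for long lines can be sketched with a minimal, hypothetical example (the variable names are invented for illustration):

```python
# Preferred: wrap long lines with implicit continuation inside parentheses
total = (1 + 2 + 3 +
         4 + 5)

# Discouraged: a trailing backslash also continues the line, but any
# whitespace after the backslash is a SyntaxError, which makes it fragile
total_backslash = 1 + 2 + 3 + \
    4 + 5

print(total, total_backslash)
```

Both forms compute the same value; the parenthesized form survives reindenting and editor whitespace cleanup.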
Imports
-------
- Do not import objects, only modules (*)
- Do not import more than one module per line (*)
- Do not use wildcard ``*`` import (*)
- Do not make relative imports
- Do not make new nova.db imports in nova/virt/*
- Order your imports by the full module path
- Organize your imports according to the following template

(*) exceptions are:

- imports from ``migrate`` package
- imports from ``sqlalchemy`` package
- imports from ``nova.db.sqlalchemy.session`` module
- imports from ``nova.db.sqlalchemy.migration.versioning_api`` package

Example::

  # vim: tabstop=4 shiftwidth=4 softtabstop=4
  {{stdlib imports in human alphabetical order}}
  \n
  {{third-party lib imports in human alphabetical order}}
  \n
  {{nova imports in human alphabetical order}}
  \n
  \n
  {{begin your code}}

Human Alphabetical Order Examples
---------------------------------
Example::

  import httplib
  import logging
  import random
  import StringIO
  import time
  import unittest

  import eventlet
  import webob.exc

  import nova.api.ec2
  from nova.api import openstack
  from nova.auth import users
  from nova.endpoint import cloud
  import nova.flags
  from nova import test

Docstrings
----------
Example::

  """A one line docstring looks like this and ends in a period."""


  """A multi line docstring has a one-line summary, less than 80 characters.

  Then a new paragraph after a newline that explains in more detail any
  general information about the function, class or method. Example usages
  are also great to have here if it is a complex class or function.

  When writing the docstring for a class, an extra line should be placed
  after the closing quotations. For more in-depth explanations for these
  decisions see http://www.python.org/dev/peps/pep-0257/

  If you are going to describe parameters and return values, use Sphinx; the
  appropriate syntax is as follows.

  :param foo: the foo parameter
  :param bar: the bar parameter
  :returns: return_type -- description of the return value
  :returns: description of the return value
  :raises: AttributeError, KeyError
  """

Dictionaries/Lists
------------------
If a dictionary (dict) or list object is longer than 80 characters, its items
should be split with newlines. Embedded iterables should have their items
indented. Additionally, the last item in the dictionary should have a trailing
comma. This increases readability and simplifies future diffs.

Example::

  my_dictionary = {
      "image": {
          "name": "Just a Snapshot",
          "size": 2749573,
          "properties": {
              "user_id": 12,
              "arch": "x86_64",
          },
          "things": [
              "thing_one",
              "thing_two",
          ],
          "status": "ACTIVE",
      },
  }

Calling Methods
---------------
Calls to methods 80 characters or longer should format each argument with
newlines. This is not a requirement, but a guideline::

    unnecessarily_long_function_name('string one',
                                     'string two',
                                     kwarg1=constants.ACTIVE,
                                     kwarg2=['a', 'b', 'c'])


Rather than constructing parameters inline, it is better to break things up::

    list_of_strings = [
        'what_a_long_string',
        'not as long',
    ]

    dict_of_numbers = {
        'one': 1,
        'two': 2,
        'twenty four': 24,
    }

    object_one.call_a_method('string three',
                             'string four',
                             kwarg1=list_of_strings,
                             kwarg2=dict_of_numbers)

Internationalization (i18n) Strings
-----------------------------------
In order to support multiple languages, we have a mechanism to support
automatic translations of exception and log strings.

Example::

    msg = _("An error occurred")
    raise HTTPBadRequest(explanation=msg)

If you have a variable to place within the string, first internationalize the
template string then do the replacement.

Example::

    msg = _("Missing parameter: %s") % ("flavor",)
    LOG.error(msg)

If you have multiple variables to place in the string, use keyword parameters.
This helps our translators reorder parameters when needed.

Example::

    msg = _("The server with id %(s_id)s has no key %(m_key)s")
    LOG.error(msg % {"s_id": "1234", "m_key": "imageId"})

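The keyword-parameter form can be exercised without the real translation machinery; in the sketch below, ``_`` is stubbed out as the identity function purely so the example runs standalone (in OpenStack it is installed globally by the gettext setup):

```python
# `_` is normally installed by the i18n/gettext machinery; stubbed here
# as the identity function so this example is self-contained.
def _(s):
    return s

msg = _("The server with id %(s_id)s has no key %(m_key)s")
formatted = msg % {"s_id": "1234", "m_key": "imageId"}
print(formatted)

# Named placeholders let a translator reorder them in a translated
# template without any change to the calling code.
reordered_template = _("Key %(m_key)s is missing on server %(s_id)s")
print(reordered_template % {"s_id": "1234", "m_key": "imageId"})
```

With positional ``%s`` placeholders, the second (reordered) template would be impossible to express without changing the code that does the substitution.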
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.

For more information on creating unit tests and utilizing the testing
infrastructure in OpenStack Nova, please read nova/tests/README.rst.

Running Tests
-------------
The testing system is based on a combination of tox and testr. The canonical
approach to running tests is to simply run the command `tox`. This will
create virtual environments, populate them with dependencies and run all of
the tests that OpenStack CI systems run. Behind the scenes, tox is running
`testr run --parallel`, but is set up such that you can supply any additional
testr arguments that are needed to tox. For example, you can run
`tox -- --analyze-isolation` to cause tox to tell testr to add
--analyze-isolation to its argument list.

It is also possible to run the tests inside of a virtual environment
you have created, or it is possible that you have all of the dependencies
installed locally already. In this case, you can interact with the testr
command directly. Running `testr run` will run the entire test suite. `testr
run --parallel` will run it in parallel (this is the default incantation tox
uses). More information about testr can be found at:
http://wiki.openstack.org/testr

openstack-common
----------------

A number of modules from openstack-common are imported into the project.

These modules are "incubating" in openstack-common and are kept in sync
with the help of openstack-common's update.py script. See:

  http://wiki.openstack.org/CommonLibrary#Incubation

The copy of the code should never be directly modified here. Please
always update openstack-common first and then run the script to copy
the changes across.

OpenStack Trademark
-------------------

OpenStack is a registered trademark of the OpenStack Foundation, and uses the
following capitalization:

  OpenStack

Commit Messages
---------------
Using a common format for commit messages will help keep our git history
readable. Follow these guidelines:

First, provide a brief summary of 50 characters or less. Summaries
of greater than 72 characters will be rejected by the gate.

The first line of the commit message should provide an accurate
description of the change, not just a reference to a bug or
blueprint. It must be followed by a single blank line.

If the change relates to a specific driver (libvirt, xenapi, qpid, etc...),
begin the first line of the commit message with the driver name, lowercased,
followed by a colon.

Following your brief summary, provide a more detailed description of
the patch, manually wrapping the text at 72 characters. This
description should provide enough detail that one does not have to
refer to external resources to determine its high-level functionality.

Once you use 'git review', two lines will be appended to the commit
message: a blank line followed by a 'Change-Id'. This is important
to correlate this commit with a specific review in Gerrit, and it
should not be modified.

For further information on constructing high quality commit messages,
and how to split up commits into a series of changes, consult the
project wiki:

  http://wiki.openstack.org/GitCommitMessages
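Pulling those guidelines together, a hypothetical commit message might look like the sketch below (the driver name, bug number, and Change-Id hash are all invented for illustration, not taken from a real change):

```
libvirt: fix disk detach for live domains

Previously, detaching a volume from a running instance left a stale
device entry in the guest definition. Rebuild the device list after a
detach so the persisted domain XML matches the set of attached
volumes.

Closes-Bug: #1234567

Change-Id: I0123456789abcdef0123456789abcdef01234567
```

Note the lowercased driver prefix, the sub-50-character summary, the blank line before the 72-column-wrapped body, and the Gerrit-appended Change-Id as the final line.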
25 clean-vlans (deleted)
@@ -1,25 +0,0 @@
#!/usr/bin/env bash
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

export LC_ALL=C

sudo ifconfig -a | grep br | grep -v bridge | cut -f1 -d" " | xargs -n1 -ifoo ifconfig foo down
sudo ifconfig -a | grep br | grep -v bridge | cut -f1 -d" " | xargs -n1 -ifoo brctl delbr foo
sudo ifconfig -a | grep vlan | cut -f1 -d" " | xargs -n1 -ifoo ifconfig foo down
sudo ifconfig -a | grep vlan | cut -f1 -d" " | xargs -n1 -ifoo ip link del foo
conf/README
20
conf/README
@ -1,20 +0,0 @@
This generate_sample.sh tool is used to generate etc/nova/nova.conf.sample

Run it from the top-level working directory i.e.

  $> ./tools/conf/generate_sample.sh

Watch out for warnings about modules like libvirt, qpid and zmq not
being found - these warnings are significant because they result
in options not appearing in the generated config file.


The analyze_opts.py tool is used to find options which appear in
/etc/nova/nova.conf but not in etc/nova/nova.conf.sample.
This helps identify options in the nova.conf file which are not used by nova.
The tool also identifies any options which are set to the default value.

Run it from the top-level working directory i.e.

  $> ./tools/conf/analyze_opts.py
@@ -1,80 +0,0 @@
#!/usr/bin/env python
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright (c) 2012, Cloudscaling
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
'''
find_unused_options.py

Compare the nova.conf file with the nova.conf.sample file to find any unused
options or default values in nova.conf
'''
import argparse
import os
import sys

sys.path.append(os.getcwd())
from oslo.config import iniparser


class PropertyCollecter(iniparser.BaseParser):
    def __init__(self):
        super(PropertyCollecter, self).__init__()
        self.key_value_pairs = {}

    def assignment(self, key, value):
        self.key_value_pairs[key] = value

    def new_section(self, section):
        pass

    @classmethod
    def collect_properties(cls, lineiter, sample_format=False):
        def clean_sample(f):
            for line in f:
                if line.startswith("# ") and line != '# nova.conf sample #\n':
                    line = line[2:]
                yield line
        pc = cls()
        if sample_format:
            lineiter = clean_sample(lineiter)
        pc.parse(lineiter)
        return pc.key_value_pairs


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='''Compare the nova.conf
        file with the nova.conf.sample file to find any unused options or
        default values in nova.conf''')

    parser.add_argument('-c', action='store',
                        default='/etc/nova/nova.conf',
                        help='path to nova.conf\
                        (defaults to /etc/nova/nova.conf)')
    parser.add_argument('-s', default='./etc/nova/nova.conf.sample',
                        help='path to nova.conf.sample\
                        (defaults to ./etc/nova/nova.conf.sample')
    options = parser.parse_args()

    conf_file_options = PropertyCollecter.collect_properties(open(options.c))
    sample_conf_file_options = PropertyCollecter.collect_properties(
        open(options.s), sample_format=True)

    for k, v in sorted(conf_file_options.items()):
        if k not in sample_conf_file_options:
            print "Unused:", k
    for k, v in sorted(conf_file_options.items()):
        if k in sample_conf_file_options and v == sample_conf_file_options[k]:
            print "Default valued:", k
@@ -1,269 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2012 SINA Corporation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# @author: Zhongyue Luo, SINA Corporation.
#

"""Extracts OpenStack config option info from module(s)."""

import imp
import os
import re
import socket
import sys
import textwrap

from oslo.config import cfg

from nova.openstack.common import importutils


STROPT = "StrOpt"
BOOLOPT = "BoolOpt"
INTOPT = "IntOpt"
FLOATOPT = "FloatOpt"
LISTOPT = "ListOpt"
MULTISTROPT = "MultiStrOpt"

OPT_TYPES = {
    STROPT: 'string value',
    BOOLOPT: 'boolean value',
    INTOPT: 'integer value',
    FLOATOPT: 'floating point value',
    LISTOPT: 'list value',
    MULTISTROPT: 'multi valued',
}

OPTION_COUNT = 0
OPTION_REGEX = re.compile(r"(%s)" % "|".join([STROPT, BOOLOPT, INTOPT,
                                              FLOATOPT, LISTOPT,
                                              MULTISTROPT]))

PY_EXT = ".py"
BASEDIR = os.path.abspath(os.path.join(os.path.dirname(__file__), "../../"))
WORDWRAP_WIDTH = 60


def main(srcfiles):
    mods_by_pkg = dict()
    for filepath in srcfiles:
        pkg_name = filepath.split(os.sep)[1]
        mod_str = '.'.join(['.'.join(filepath.split(os.sep)[:-1]),
                            os.path.basename(filepath).split('.')[0]])
        mods_by_pkg.setdefault(pkg_name, list()).append(mod_str)
    # NOTE(lzyeval): place top level modules before packages
    pkg_names = filter(lambda x: x.endswith(PY_EXT), mods_by_pkg.keys())
    pkg_names.sort()
    ext_names = filter(lambda x: x not in pkg_names, mods_by_pkg.keys())
    ext_names.sort()
    pkg_names.extend(ext_names)

    # opts_by_group is a mapping of group name to an options list
    # The options list is a list of (module, options) tuples
    opts_by_group = {'DEFAULT': []}

    for pkg_name in pkg_names:
        mods = mods_by_pkg.get(pkg_name)
        mods.sort()
        for mod_str in mods:
            if mod_str.endswith('.__init__'):
                mod_str = mod_str[:mod_str.rfind(".")]

            mod_obj = _import_module(mod_str)
            if not mod_obj:
                continue

            for group, opts in _list_opts(mod_obj):
                opts_by_group.setdefault(group, []).append((mod_str, opts))

    print_group_opts('DEFAULT', opts_by_group.pop('DEFAULT', []))
    for group, opts in opts_by_group.items():
        print_group_opts(group, opts)

    print "# Total option count: %d" % OPTION_COUNT


def _import_module(mod_str):
    try:
        if mod_str.startswith('bin.'):
            imp.load_source(mod_str[4:], os.path.join('bin', mod_str[4:]))
            return sys.modules[mod_str[4:]]
        else:
            return importutils.import_module(mod_str)
    except (ValueError, AttributeError), err:
        return None
    except ImportError, ie:
        sys.stderr.write("%s\n" % str(ie))
        return None
    except Exception, e:
        return None


def _guess_groups(opt, mod_obj):
    groups = []

    # is it in the DEFAULT group?
    if (opt.dest in cfg.CONF and
            not isinstance(cfg.CONF[opt.dest], cfg.CONF.GroupAttr)):
        groups.append('DEFAULT')

    # what other groups is it in?
    for key, value in cfg.CONF.items():
        if not isinstance(value, cfg.CONF.GroupAttr):
            continue
        if opt.dest not in value:
            continue
        groups.append(key)

    if len(groups) == 1:
        return groups[0]

    group = None
    for g in groups:
        if g in mod_obj.__name__:
            group = g
            break

    if group is None and 'DEFAULT' in groups:
        sys.stderr.write("Guessing that " + opt.dest +
                         " in " + mod_obj.__name__ +
                         " is in DEFAULT group out of " +
                         ','.join(groups) + "\n")
        return 'DEFAULT'

    if group is None:
        sys.stderr.write("Unable to guess what group " + opt.dest +
                         " in " + mod_obj.__name__ +
                         " is in out of " + ','.join(groups) + "\n")
        sys.exit(1)

    sys.stderr.write("Guessing that " + opt.dest +
                     " in " + mod_obj.__name__ +
                     " is in the " + group +
                     " group out of " + ','.join(groups) + "\n")
    return group


def _list_opts(obj):
    def is_opt(o):
        return (isinstance(o, cfg.Opt) and
                not isinstance(o, cfg.SubCommandOpt))

    opts = list()
    for attr_str in dir(obj):
        attr_obj = getattr(obj, attr_str)
        if is_opt(attr_obj):
            opts.append(attr_obj)
        elif (isinstance(attr_obj, list) and
              all(map(lambda x: is_opt(x), attr_obj))):
            opts.extend(attr_obj)

    ret = {}
    for opt in opts:
        ret.setdefault(_guess_groups(opt, obj), []).append(opt)
    return ret.items()


def print_group_opts(group, opts_by_module):
    print "[%s]" % group
    print
    global OPTION_COUNT
    for mod, opts in opts_by_module:
        OPTION_COUNT += len(opts)
        print '#'
        print '# Options defined in %s' % mod
        print '#'
        print
        for opt in opts:
            _print_opt(opt)
        print


def _get_my_ip():
    try:
        csock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        csock.connect(('8.8.8.8', 80))
        (addr, port) = csock.getsockname()
        csock.close()
        return addr
    except socket.error:
        return None


def _sanitize_default(s):
    """Set up a reasonably sensible default for pybasedir, my_ip and host."""
    if s.startswith(BASEDIR):
        return s.replace(BASEDIR, '/usr/lib/python/site-packages')
    elif BASEDIR in s:
        return s.replace(BASEDIR, '')
    elif s == _get_my_ip():
        return '10.0.0.1'
    elif s == socket.getfqdn():
        return 'nova'
    elif s.strip() != s:
        return '"%s"' % s
    return s


def _print_opt(opt):
    opt_name, opt_default, opt_help = opt.dest, opt.default, opt.help
    if not opt_help:
        sys.stderr.write('WARNING: "%s" is missing help string.\n' % opt_name)
    opt_type = None
    try:
        opt_type = OPTION_REGEX.search(str(type(opt))).group(0)
    except (ValueError, AttributeError), err:
        sys.stderr.write("%s\n" % str(err))
        sys.exit(1)
    opt_help += ' (' + OPT_TYPES[opt_type] + ')'
    print '#', "\n# ".join(textwrap.wrap(opt_help, WORDWRAP_WIDTH))
    try:
        if opt_default is None:
            print '#%s=<None>' % opt_name
        elif opt_type == STROPT:
            assert(isinstance(opt_default, basestring))
            print '#%s=%s' % (opt_name, _sanitize_default(opt_default))
        elif opt_type == BOOLOPT:
            assert(isinstance(opt_default, bool))
            print '#%s=%s' % (opt_name, str(opt_default).lower())
        elif opt_type == INTOPT:
            assert(isinstance(opt_default, int) and
                   not isinstance(opt_default, bool))
            print '#%s=%s' % (opt_name, opt_default)
        elif opt_type == FLOATOPT:
            assert(isinstance(opt_default, float))
            print '#%s=%s' % (opt_name, opt_default)
        elif opt_type == LISTOPT:
            assert(isinstance(opt_default, list))
            print '#%s=%s' % (opt_name, ','.join(opt_default))
        elif opt_type == MULTISTROPT:
            assert(isinstance(opt_default, list))
            if not opt_default:
                opt_default = ['']
            for default in opt_default:
                print '#%s=%s' % (opt_name, default)
        print
    except Exception:
        sys.stderr.write('Error in option "%s"\n' % opt_name)
        sys.exit(1)


if __name__ == '__main__':
    if len(sys.argv) < 2:
        print "usage: python %s [srcfile]...\n" % sys.argv[0]
        sys.exit(0)
    main(sys.argv[1:])
@@ -1,31 +0,0 @@
#!/usr/bin/env bash
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2012 SINA Corporation
# All Rights Reserved.
# Author: Zhongyue Luo <lzyeval@gmail.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

FILES=$(find nova -type f -name "*.py" ! -path "nova/tests/*" -exec \
    grep -l "Opt(" {} \; | sort -u)
BINS=$(echo bin/nova-*)

PYTHONPATH=./:${PYTHONPATH} \
    python $(dirname "$0")/extract_opts.py ${FILES} ${BINS} > \
    etc/nova/nova.conf.sample

# Remove compiled files created by imp.import_source()
for bin in ${BINS}; do
    [ -f ${bin}c ] && rm ${bin}c
done
@@ -1,270 +0,0 @@
#!/usr/bin/env python

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2012 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Utility for diff'ing two versions of the DB schema.

Each release cycle the plan is to compact all of the migrations from that
release into a single file. This is a manual and, unfortunately, error-prone
process. To ensure that the schema doesn't change, this tool can be used to
diff the compacted DB schema to the original, uncompacted form.

The schema versions are specified by providing a git ref (a branch name or
commit hash) and a SQLAlchemy-Migrate version number.

Run like:

    ./tools/db/schema_diff.py mysql master:latest my_branch:82
"""
import datetime
import glob
import os
import subprocess
import sys


### Dump


def dump_db(db_driver, db_name, migration_version, dump_filename):
    db_driver.create(db_name)
    try:
        migrate(db_driver, db_name, migration_version)
        db_driver.dump(db_name, dump_filename)
    finally:
        db_driver.drop(db_name)


### Diff


def diff_files(filename1, filename2):
    pipeline = ['diff -U 3 %(filename1)s %(filename2)s' % locals()]

    # Use colordiff if available
    if subprocess.call(['which', 'colordiff']) == 0:
        pipeline.append('colordiff')

    pipeline.append('less -R')

    cmd = ' | '.join(pipeline)
    subprocess.check_call(cmd, shell=True)


### Database


class MySQL(object):
    def create(self, name):
        subprocess.check_call(['mysqladmin', '-u', 'root', 'create', name])

    def drop(self, name):
        subprocess.check_call(['mysqladmin', '-f', '-u', 'root', 'drop', name])

    def dump(self, name, dump_filename):
        subprocess.check_call(
            'mysqldump -u root %(name)s > %(dump_filename)s' % locals(),
            shell=True)

    def url(self, name):
        return 'mysql://root@localhost/%s' % name


class Postgres(object):
    def create(self, name):
        subprocess.check_call(['createdb', name])

    def drop(self, name):
        subprocess.check_call(['dropdb', name])

    def dump(self, name, dump_filename):
        subprocess.check_call(
            'pg_dump %(name)s > %(dump_filename)s' % locals(),
            shell=True)

    def url(self, name):
        return 'postgres://localhost/%s' % name


def _get_db_driver_class(db_type):
    if db_type == "mysql":
        return MySQL
    elif db_type == "postgres":
        return Postgres
    else:
        raise Exception(_("database %s not supported") % db_type)


### Migrate


MIGRATE_REPO = os.path.join(os.getcwd(), "nova/db/sqlalchemy/migrate_repo")


def migrate(db_driver, db_name, migration_version):
    earliest_version = _migrate_get_earliest_version()

    # NOTE(sirp): sqlalchemy-migrate currently cannot handle the skipping of
    # migration numbers.
    _migrate_cmd(
        db_driver, db_name, 'version_control', str(earliest_version - 1))

    upgrade_cmd = ['upgrade']
    if migration_version != 'latest':
        upgrade_cmd.append(str(migration_version))

    _migrate_cmd(db_driver, db_name, *upgrade_cmd)


def _migrate_cmd(db_driver, db_name, *cmd):
    manage_py = os.path.join(MIGRATE_REPO, 'manage.py')

    args = ['python', manage_py]
    args += cmd
    args += ['--repository=%s' % MIGRATE_REPO,
             '--url=%s' % db_driver.url(db_name)]

    subprocess.check_call(args)


def _migrate_get_earliest_version():
    versions_glob = os.path.join(MIGRATE_REPO, 'versions', '???_*.py')

    versions = []
    for path in glob.iglob(versions_glob):
        filename = os.path.basename(path)
        prefix = filename.split('_', 1)[0]
        try:
            version = int(prefix)
        except ValueError:
            continue
        versions.append(version)

    versions.sort()
    return versions[0]


### Git


def git_current_branch_name():
    ref_name = git_symbolic_ref('HEAD', quiet=True)
    current_branch_name = ref_name.replace('refs/heads/', '')
    return current_branch_name


def git_symbolic_ref(ref, quiet=False):
    args = ['git', 'symbolic-ref', ref]
    if quiet:
        args.append('-q')
    proc = subprocess.Popen(args, stdout=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    return stdout.strip()


def git_checkout(branch_name):
    subprocess.check_call(['git', 'checkout', branch_name])


def git_has_uncommited_changes():
    return subprocess.call(['git', 'diff', '--quiet', '--exit-code']) == 1


### Command


def die(msg):
    print >> sys.stderr, "ERROR: %s" % msg
    sys.exit(1)


def usage(msg=None):
    if msg:
        print >> sys.stderr, "ERROR: %s" % msg

    prog = "schema_diff.py"
    args = ["<mysql|postgres>", "<orig-branch:orig-version>",
            "<new-branch:new-version>"]

    print >> sys.stderr, "usage: %s %s" % (prog, ' '.join(args))
    sys.exit(1)


def parse_options():
    try:
        db_type = sys.argv[1]
    except IndexError:
        usage("must specify DB type")

    try:
        orig_branch, orig_version = sys.argv[2].split(':')
    except IndexError:
        usage('original branch and version required (e.g. master:82)')

    try:
        new_branch, new_version = sys.argv[3].split(':')
    except IndexError:
        usage('new branch and version required (e.g. master:82)')

    return db_type, orig_branch, orig_version, new_branch, new_version


def main():
    timestamp = datetime.datetime.utcnow().strftime("%Y%m%d_%H%M%S")

    ORIG_DB = 'orig_db_%s' % timestamp
    NEW_DB = 'new_db_%s' % timestamp

    ORIG_DUMP = ORIG_DB + ".dump"
    NEW_DUMP = NEW_DB + ".dump"

    options = parse_options()
    db_type, orig_branch, orig_version, new_branch, new_version = options

    # Since we're going to be switching branches, ensure user doesn't have any
    # uncommited changes
    if git_has_uncommited_changes():
        die("You have uncommited changes. Please commit them before running "
            "this command.")

    db_driver = _get_db_driver_class(db_type)()

    users_branch = git_current_branch_name()
    git_checkout(orig_branch)

    try:
        # Dump Original Schema
        dump_db(db_driver, ORIG_DB, orig_version, ORIG_DUMP)

        # Dump New Schema
        git_checkout(new_branch)
        dump_db(db_driver, NEW_DB, new_version, NEW_DUMP)

        diff_files(ORIG_DUMP, NEW_DUMP)
    finally:
        git_checkout(users_branch)

        if os.path.exists(ORIG_DUMP):
            os.unlink(ORIG_DUMP)

        if os.path.exists(NEW_DUMP):
            os.unlink(NEW_DUMP)


if __name__ == "__main__":
    main()
0
doc/source/_templates/.placeholder
Normal file
83
doc/source/_theme/layout.html
Normal file
@@ -0,0 +1,83 @@
{% extends "basic/layout.html" %}
{% set css_files = css_files + ['_static/tweaks.css'] %}
{% set script_files = script_files + ['_static/jquery.tweet.js'] %}

{%- macro sidebar() %}
    {%- if not embedded %}{% if not theme_nosidebar|tobool %}
    <div class="sphinxsidebar">
      <div class="sphinxsidebarwrapper">
        {%- block sidebarlogo %}
        {%- if logo %}
        <p class="logo"><a href="{{ pathto(master_doc) }}">
          <img class="logo" src="{{ pathto('_static/' + logo, 1) }}" alt="Logo"/>
        </a></p>
        {%- endif %}
        {%- endblock %}
        {%- block sidebartoc %}
        {%- if display_toc %}
        <h3><a href="{{ pathto(master_doc) }}">{{ _('Table Of Contents') }}</a></h3>
        {{ toc }}
        {%- endif %}
        {%- endblock %}
        {%- block sidebarrel %}
        {%- if prev %}
        <h4>{{ _('Previous topic') }}</h4>
        <p class="topless"><a href="{{ prev.link|e }}"
                              title="{{ _('previous chapter') }}">{{ prev.title }}</a></p>
        {%- endif %}
        {%- if next %}
        <h4>{{ _('Next topic') }}</h4>
        <p class="topless"><a href="{{ next.link|e }}"
                              title="{{ _('next chapter') }}">{{ next.title }}</a></p>
        {%- endif %}
        {%- endblock %}
        {%- block sidebarsourcelink %}
        {%- if show_source and has_source and sourcename %}
        <h3>{{ _('This Page') }}</h3>
        <ul class="this-page-menu">
          <li><a href="{{ pathto('_sources/' + sourcename, true)|e }}"
                 rel="nofollow">{{ _('Show Source') }}</a></li>
        </ul>
        {%- endif %}
        {%- endblock %}
        {%- if customsidebar %}
        {% include customsidebar %}
        {%- endif %}
        {%- block sidebarsearch %}
        {%- if pagename != "search" %}
        <div id="searchbox" style="display: none">
          <h3>{{ _('Quick search') }}</h3>
          <form class="search" action="{{ pathto('search') }}" method="get">
            <input type="text" name="q" size="18" />
            <input type="submit" value="{{ _('Go') }}" />
            <input type="hidden" name="check_keywords" value="yes" />
            <input type="hidden" name="area" value="default" />
          </form>
          <p class="searchtip" style="font-size: 90%">
          {{ _('Enter search terms or a module, class or function name.') }}
          </p>
        </div>
        <script type="text/javascript">$('#searchbox').show(0);</script>
        {%- endif %}
        {%- endblock %}
      </div>
    </div>
    {%- endif %}{% endif %}
{%- endmacro %}

{% block relbar1 %}{% endblock relbar1 %}

{% block header %}
<div id="header">
  <h1 id="logo"><a href="http://www.openstack.org/">OpenStack</a></h1>
  <ul id="navigation">
    <li><a href="http://www.openstack.org/" title="Go to the Home page" class="link">Home</a></li>
    <li><a href="http://www.openstack.org/projects/" title="Go to the OpenStack Projects page">Projects</a></li>
    <li><a href="http://www.openstack.org/user-stories/" title="Go to the User Stories page" class="link">User Stories</a></li>
    <li><a href="http://www.openstack.org/community/" title="Go to the Community page" class="link">Community</a></li>
    <li><a href="http://www.openstack.org/blog/" title="Go to the OpenStack Blog">Blog</a></li>
    <li><a href="http://wiki.openstack.org/" title="Go to the OpenStack Wiki">Wiki</a></li>
    <li><a href="http://docs.openstack.org/" title="Go to OpenStack Documentation" class="current">Documentation</a></li>
  </ul>
</div>
{% endblock %}
4
doc/source/_theme/theme.conf
Normal file
@@ -0,0 +1,4 @@
[theme]
inherit = basic
stylesheet = nature.css
pygments_style = tango
62
doc/source/conf.py
Normal file
@@ -0,0 +1,62 @@
# -*- coding: utf-8 -*-

import os
import sys

sys.path.insert(0, os.path.abspath('../..'))
# -- General configuration ----------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx']

# autodoc generation is a bit aggressive and a nuisance when doing heavy
# text edit cycles.
# execute "export SPHINX_DEBUG=1" in your terminal to disable

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix of source filenames.
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'hacking'
copyright = u'2013, OpenStack Foundation'

# If true, '()' will be appended to :func: etc. cross-reference text.
add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
add_module_names = True

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# -- Options for HTML output --------------------------------------------------

# The theme to use for HTML and HTML Help pages. Major themes that come with
# Sphinx are currently 'default' and 'sphinxdoc'.
html_theme_path = ["."]
html_theme = '_theme'
html_static_path = ['static']

# Output file base name for HTML help builder.
htmlhelp_basename = '%sdoc' % project

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass
# [howto/manual]).
latex_documents = [
    ('index',
     '%s.tex' % project,
     u'%s Documentation' % project,
     u'OpenStack LLC', 'manual'),
]

# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'http://docs.python.org/': None}
21
doc/source/index.rst
Normal file
@@ -0,0 +1,21 @@
hacking
=======

hacking is a set of flake8 plugins to test or enforce the more stringent
style guidelines that the OpenStack project operates under.

Contents
--------

.. toctree::
   :maxdepth: 1

   api/autoindex

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
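The index.rst above describes hacking as a set of flake8 plugins. As a rough illustration of what that means (all names here are hypothetical, not taken from this commit), a flake8 check is just a function that takes a logical line and yields `(offset, message)` pairs, wired up through a `flake8.extension` entry point:

```python
# Minimal sketch of a flake8 logical-line check in the style hacking
# uses. The check name, code H999, and entry-point stanza below are
# illustrative assumptions, not part of this commit.

def hacking_no_todo(logical_line):
    """Sketch check: flag TODO markers left in source lines."""
    pos = logical_line.find('TODO')
    if pos >= 0:
        # flake8 reports (column offset, "CODE message") pairs
        yield pos, 'H999: TODO found'


# flake8 would discover such a check via a setuptools entry point, e.g.:
#   [flake8.extension]
#   H999 = mypackage.checks:hacking_no_todo
```

Calling `list(hacking_no_todo('# TODO fix this'))` returns one `(offset, message)` tuple, while a clean line yields nothing; flake8 drives this loop over every logical line in a file.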
416
doc/source/static/basic.css
Normal file
@@ -0,0 +1,416 @@
/**
 * Sphinx stylesheet -- basic theme
 * ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 */

/* -- main layout ----------------------------------------------------------- */

div.clearer {
    clear: both;
}

/* -- relbar ---------------------------------------------------------------- */

div.related {
    width: 100%;
    font-size: 90%;
}

div.related h3 {
    display: none;
}

div.related ul {
    margin: 0;
    padding: 0 0 0 10px;
    list-style: none;
}

div.related li {
    display: inline;
}

div.related li.right {
    float: right;
    margin-right: 5px;
}

/* -- sidebar --------------------------------------------------------------- */

div.sphinxsidebarwrapper {
    padding: 10px 5px 0 10px;
}

div.sphinxsidebar {
    float: left;
    width: 230px;
    margin-left: -100%;
    font-size: 90%;
}

div.sphinxsidebar ul {
    list-style: none;
}

div.sphinxsidebar ul ul,
div.sphinxsidebar ul.want-points {
    margin-left: 20px;
    list-style: square;
}

div.sphinxsidebar ul ul {
    margin-top: 0;
    margin-bottom: 0;
}

div.sphinxsidebar form {
    margin-top: 10px;
}

div.sphinxsidebar input {
    border: 1px solid #98dbcc;
    font-family: sans-serif;
    font-size: 1em;
}

img {
    border: 0;
}

/* -- search page ----------------------------------------------------------- */

ul.search {
    margin: 10px 0 0 20px;
    padding: 0;
}

ul.search li {
    padding: 5px 0 5px 20px;
    background-image: url(file.png);
    background-repeat: no-repeat;
    background-position: 0 7px;
}

ul.search li a {
    font-weight: bold;
}

ul.search li div.context {
    color: #888;
    margin: 2px 0 0 30px;
    text-align: left;
}

ul.keywordmatches li.goodmatch a {
    font-weight: bold;
}

/* -- index page ------------------------------------------------------------ */

table.contentstable {
    width: 90%;
}

table.contentstable p.biglink {
    line-height: 150%;
}

a.biglink {
    font-size: 1.3em;
}

span.linkdescr {
    font-style: italic;
    padding-top: 5px;
    font-size: 90%;
}

/* -- general index --------------------------------------------------------- */

table.indextable td {
    text-align: left;
    vertical-align: top;
}

table.indextable dl, table.indextable dd {
    margin-top: 0;
    margin-bottom: 0;
}

table.indextable tr.pcap {
    height: 10px;
}

table.indextable tr.cap {
    margin-top: 10px;
    background-color: #f2f2f2;
}

img.toggler {
    margin-right: 3px;
    margin-top: 3px;
    cursor: pointer;
}

/* -- general body styles --------------------------------------------------- */

a.headerlink {
    visibility: hidden;
}

h1:hover > a.headerlink,
h2:hover > a.headerlink,
h3:hover > a.headerlink,
h4:hover > a.headerlink,
h5:hover > a.headerlink,
h6:hover > a.headerlink,
dt:hover > a.headerlink {
    visibility: visible;
}

div.body p.caption {
    text-align: inherit;
}

div.body td {
    text-align: left;
}

.field-list ul {
    padding-left: 1em;
}

.first {
}

p.rubric {
    margin-top: 30px;
    font-weight: bold;
}

/* -- sidebars -------------------------------------------------------------- */

div.sidebar {
    margin: 0 0 0.5em 1em;
    border: 1px solid #ddb;
    padding: 7px 7px 0 7px;
    background-color: #ffe;
    width: 40%;
    float: right;
}

p.sidebar-title {
    font-weight: bold;
}

/* -- topics ---------------------------------------------------------------- */

div.topic {
    border: 1px solid #ccc;
    padding: 7px 7px 0 7px;
    margin: 10px 0 10px 0;
}

p.topic-title {
    font-size: 1.1em;
    font-weight: bold;
    margin-top: 10px;
}

/* -- admonitions ----------------------------------------------------------- */

div.admonition {
    margin-top: 10px;
    margin-bottom: 10px;
    padding: 7px;
}

div.admonition dt {
    font-weight: bold;
}

div.admonition dl {
    margin-bottom: 0;
}

p.admonition-title {
    margin: 0px 10px 5px 0px;
    font-weight: bold;
}

div.body p.centered {
    text-align: center;
    margin-top: 25px;
}

/* -- tables ---------------------------------------------------------------- */

table.docutils {
    border: 0;
    border-collapse: collapse;
}

table.docutils td, table.docutils th {
    padding: 1px 8px 1px 0;
    border-top: 0;
    border-left: 0;
    border-right: 0;
    border-bottom: 1px solid #aaa;
}

table.field-list td, table.field-list th {
|
||||||
|
border: 0 !important;
|
||||||
|
}
|
||||||
|
|
||||||
|
table.footnote td, table.footnote th {
|
||||||
|
border: 0 !important;
|
||||||
|
}
|
||||||
|
|
||||||
|
th {
|
||||||
|
text-align: left;
|
||||||
|
padding-right: 5px;
|
||||||
|
}
|
||||||
|
|
||||||
|
/* -- other body styles ----------------------------------------------------- */
|
||||||
|
|
||||||
|
dl {
|
||||||
|
margin-bottom: 15px;
|
||||||
|
}
|
||||||
|
|
||||||
|
dd p {
|
||||||
|
margin-top: 0px;
|
||||||
|
}
|
||||||
|
|
||||||
|
dd ul, dd table {
|
||||||
|
margin-bottom: 10px;
|
||||||
|
}
|
||||||
|
|
||||||
|
dd {
|
||||||
|
margin-top: 3px;
|
||||||
|
margin-bottom: 10px;
|
||||||
|
margin-left: 30px;
|
||||||
|
}
|
||||||
|
|
||||||
|
dt:target, .highlight {
|
||||||
|
background-color: #fbe54e;
|
||||||
|
}
|
||||||
|
|
||||||
|
dl.glossary dt {
|
||||||
|
font-weight: bold;
|
||||||
|
font-size: 1.1em;
|
||||||
|
}
|
||||||
|
|
||||||
|
.field-list ul {
|
||||||
|
margin: 0;
|
||||||
|
padding-left: 1em;
|
||||||
|
}
|
||||||
|
|
||||||
|
.field-list p {
|
||||||
|
margin: 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
.refcount {
|
||||||
|
color: #060;
|
||||||
|
}
|
||||||
|
|
||||||
|
.optional {
|
||||||
|
font-size: 1.3em;
|
||||||
|
}
|
||||||
|
|
||||||
|
.versionmodified {
|
||||||
|
font-style: italic;
|
||||||
|
}
|
||||||
|
|
||||||
|
.system-message {
|
||||||
|
background-color: #fda;
|
||||||
|
padding: 5px;
|
||||||
|
border: 3px solid red;
|
||||||
|
}
|
||||||
|
|
||||||
|
.footnote:target {
|
||||||
|
background-color: #ffa
|
||||||
|
}
|
||||||
|
|
||||||
|
.line-block {
|
||||||
|
display: block;
|
||||||
|
margin-top: 1em;
|
||||||
|
margin-bottom: 1em;
|
||||||
|
}
|
||||||
|
|
||||||
|
.line-block .line-block {
|
||||||
|
margin-top: 0;
|
||||||
|
margin-bottom: 0;
|
||||||
|
margin-left: 1.5em;
|
||||||
|
}
|
||||||
|
|
||||||
|
/* -- code displays --------------------------------------------------------- */
|
||||||
|
|
||||||
|
pre {
|
||||||
|
overflow: auto;
|
||||||
|
}
|
||||||
|
|
||||||
|
td.linenos pre {
|
||||||
|
padding: 5px 0px;
|
||||||
|
border: 0;
|
||||||
|
background-color: transparent;
|
||||||
|
color: #aaa;
|
||||||
|
}
|
||||||
|
|
||||||
|
table.highlighttable {
|
||||||
|
margin-left: 0.5em;
|
||||||
|
}
|
||||||
|
|
||||||
|
table.highlighttable td {
|
||||||
|
padding: 0 0.5em 0 0.5em;
|
||||||
|
}
|
||||||
|
|
||||||
|
tt.descname {
|
||||||
|
background-color: transparent;
|
||||||
|
font-weight: bold;
|
||||||
|
font-size: 1.2em;
|
||||||
|
}
|
||||||
|
|
||||||
|
tt.descclassname {
|
||||||
|
background-color: transparent;
|
||||||
|
}
|
||||||
|
|
||||||
|
tt.xref, a tt {
|
||||||
|
background-color: transparent;
|
||||||
|
font-weight: bold;
|
||||||
|
}
|
||||||
|
|
||||||
|
h1 tt, h2 tt, h3 tt, h4 tt, h5 tt, h6 tt {
|
||||||
|
background-color: transparent;
|
||||||
|
}
|
||||||
|
|
||||||
|
/* -- math display ---------------------------------------------------------- */
|
||||||
|
|
||||||
|
img.math {
|
||||||
|
vertical-align: middle;
|
||||||
|
}
|
||||||
|
|
||||||
|
div.body div.math p {
|
||||||
|
text-align: center;
|
||||||
|
}
|
||||||
|
|
||||||
|
span.eqno {
|
||||||
|
float: right;
|
||||||
|
}
|
||||||
|
|
||||||
|
/* -- printout stylesheet --------------------------------------------------- */
|
||||||
|
|
||||||
|
@media print {
|
||||||
|
div.document,
|
||||||
|
div.documentwrapper,
|
||||||
|
div.bodywrapper {
|
||||||
|
margin: 0 !important;
|
||||||
|
width: 100%;
|
||||||
|
}
|
||||||
|
|
||||||
|
div.sphinxsidebar,
|
||||||
|
div.related,
|
||||||
|
div.footer,
|
||||||
|
#top-link {
|
||||||
|
display: none;
|
||||||
|
}
|
||||||
|
}
|
230
doc/source/static/default.css
Normal file
@ -0,0 +1,230 @@
/**
 * Sphinx stylesheet -- default theme
 * ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 */

@import url("basic.css");

/* -- page layout ----------------------------------------------------------- */

body {
    font-family: sans-serif;
    font-size: 100%;
    background-color: #11303d;
    color: #000;
    margin: 0;
    padding: 0;
}

div.document {
    background-color: #1c4e63;
}

div.documentwrapper {
    float: left;
    width: 100%;
}

div.bodywrapper {
    margin: 0 0 0 230px;
}

div.body {
    background-color: #ffffff;
    color: #000000;
    padding: 0 20px 30px 20px;
}

div.footer {
    color: #ffffff;
    width: 100%;
    padding: 9px 0 9px 0;
    text-align: center;
    font-size: 75%;
}

div.footer a {
    color: #ffffff;
    text-decoration: underline;
}

div.related {
    background-color: #133f52;
    line-height: 30px;
    color: #ffffff;
}

div.related a {
    color: #ffffff;
}

div.sphinxsidebar {
}

div.sphinxsidebar h3 {
    font-family: 'Trebuchet MS', sans-serif;
    color: #ffffff;
    font-size: 1.4em;
    font-weight: normal;
    margin: 0;
    padding: 0;
}

div.sphinxsidebar h3 a {
    color: #ffffff;
}

div.sphinxsidebar h4 {
    font-family: 'Trebuchet MS', sans-serif;
    color: #ffffff;
    font-size: 1.3em;
    font-weight: normal;
    margin: 5px 0 0 0;
    padding: 0;
}

div.sphinxsidebar p {
    color: #ffffff;
}

div.sphinxsidebar p.topless {
    margin: 5px 10px 10px 10px;
}

div.sphinxsidebar ul {
    margin: 10px;
    padding: 0;
    color: #ffffff;
}

div.sphinxsidebar a {
    color: #98dbcc;
}

div.sphinxsidebar input {
    border: 1px solid #98dbcc;
    font-family: sans-serif;
    font-size: 1em;
}

/* -- body styles ----------------------------------------------------------- */

a {
    color: #355f7c;
    text-decoration: none;
}

a:hover {
    text-decoration: underline;
}

div.body p, div.body dd, div.body li {
    text-align: left;
    line-height: 130%;
}

div.body h1,
div.body h2,
div.body h3,
div.body h4,
div.body h5,
div.body h6 {
    font-family: 'Trebuchet MS', sans-serif;
    background-color: #f2f2f2;
    font-weight: normal;
    color: #20435c;
    border-bottom: 1px solid #ccc;
    margin: 20px -20px 10px -20px;
    padding: 3px 0 3px 10px;
}

div.body h1 { margin-top: 0; font-size: 200%; }
div.body h2 { font-size: 160%; }
div.body h3 { font-size: 140%; }
div.body h4 { font-size: 120%; }
div.body h5 { font-size: 110%; }
div.body h6 { font-size: 100%; }

a.headerlink {
    color: #c60f0f;
    font-size: 0.8em;
    padding: 0 4px 0 4px;
    text-decoration: none;
}

a.headerlink:hover {
    background-color: #c60f0f;
    color: white;
}

div.body p, div.body dd, div.body li {
    text-align: left;
    line-height: 130%;
}

div.admonition p.admonition-title + p {
    display: inline;
}

div.admonition p {
    margin-bottom: 5px;
}

div.admonition pre {
    margin-bottom: 5px;
}

div.admonition ul, div.admonition ol {
    margin-bottom: 5px;
}

div.note {
    background-color: #eee;
    border: 1px solid #ccc;
}

div.seealso {
    background-color: #ffc;
    border: 1px solid #ff6;
}

div.topic {
    background-color: #eee;
}

div.warning {
    background-color: #ffe4e4;
    border: 1px solid #f66;
}

p.admonition-title {
    display: inline;
}

p.admonition-title:after {
    content: ":";
}

pre {
    padding: 5px;
    background-color: #eeffcc;
    color: #333333;
    line-height: 120%;
    border: 1px solid #ac9;
    border-left: none;
    border-right: none;
}

tt {
    background-color: #ecf0f3;
    padding: 0 1px 0 1px;
    font-size: 0.95em;
}

.warning tt {
    background: #efc2c2;
}

.note tt {
    background: #d6d6d6;
}
BIN
doc/source/static/header-line.gif
Normal file
Binary file not shown.
After Width: | Height: | Size: 48 B |
BIN
doc/source/static/header_bg.jpg
Normal file
Binary file not shown.
After Width: | Height: | Size: 3.7 KiB |
154
doc/source/static/jquery.tweet.js
Normal file
@ -0,0 +1,154 @@
(function($) {

  $.fn.tweet = function(o){
    var s = {
      username: ["seaofclouds"],              // [string]  required, unless you want to display our tweets. :) it can be an array, just do ["username1","username2","etc"]
      list: null,                             // [string]  optional name of list belonging to username
      avatar_size: null,                      // [integer] height and width of avatar if displayed (48px max)
      count: 3,                               // [integer] how many tweets to display?
      intro_text: null,                       // [string]  do you want text BEFORE your your tweets?
      outro_text: null,                       // [string]  do you want text AFTER your tweets?
      join_text: null,                        // [string]  optional text in between date and tweet, try setting to "auto"
      auto_join_text_default: "i said,",      // [string]  auto text for non verb: "i said" bullocks
      auto_join_text_ed: "i",                 // [string]  auto text for past tense: "i" surfed
      auto_join_text_ing: "i am",             // [string]  auto tense for present tense: "i was" surfing
      auto_join_text_reply: "i replied to",   // [string]  auto tense for replies: "i replied to" @someone "with"
      auto_join_text_url: "i was looking at", // [string]  auto tense for urls: "i was looking at" http:...
      loading_text: null,                     // [string]  optional loading text, displayed while tweets load
      query: null                             // [string]  optional search query
    };

    if(o) $.extend(s, o);

    $.fn.extend({
      linkUrl: function() {
        var returning = [];
        var regexp = /((ftp|http|https):\/\/(\w+:{0,1}\w*@)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%@!\-\/]))?)/gi;
        this.each(function() {
          returning.push(this.replace(regexp,"<a href=\"$1\">$1</a>"));
        });
        return $(returning);
      },
      linkUser: function() {
        var returning = [];
        var regexp = /[\@]+([A-Za-z0-9-_]+)/gi;
        this.each(function() {
          returning.push(this.replace(regexp,"<a href=\"http://twitter.com/$1\">@$1</a>"));
        });
        return $(returning);
      },
      linkHash: function() {
        var returning = [];
        var regexp = / [\#]+([A-Za-z0-9-_]+)/gi;
        this.each(function() {
          returning.push(this.replace(regexp, ' <a href="http://search.twitter.com/search?q=&tag=$1&lang=all&from='+s.username.join("%2BOR%2B")+'">#$1</a>'));
        });
        return $(returning);
      },
      capAwesome: function() {
        var returning = [];
        this.each(function() {
          returning.push(this.replace(/\b(awesome)\b/gi, '<span class="awesome">$1</span>'));
        });
        return $(returning);
      },
      capEpic: function() {
        var returning = [];
        this.each(function() {
          returning.push(this.replace(/\b(epic)\b/gi, '<span class="epic">$1</span>'));
        });
        return $(returning);
      },
      makeHeart: function() {
        var returning = [];
        this.each(function() {
          returning.push(this.replace(/(<)+[3]/gi, "<tt class='heart'>&#x2665;</tt>"));
        });
        return $(returning);
      }
    });

    function relative_time(time_value) {
      var parsed_date = Date.parse(time_value);
      var relative_to = (arguments.length > 1) ? arguments[1] : new Date();
      var delta = parseInt((relative_to.getTime() - parsed_date) / 1000);
      var pluralize = function (singular, n) {
        return '' + n + ' ' + singular + (n == 1 ? '' : 's');
      };
      if(delta < 60) {
        return 'less than a minute ago';
      } else if(delta < (45*60)) {
        return 'about ' + pluralize("minute", parseInt(delta / 60)) + ' ago';
      } else if(delta < (24*60*60)) {
        return 'about ' + pluralize("hour", parseInt(delta / 3600)) + ' ago';
      } else {
        return 'about ' + pluralize("day", parseInt(delta / 86400)) + ' ago';
      }
    }

    function build_url() {
      var proto = ('https:' == document.location.protocol ? 'https:' : 'http:');
      if (s.list) {
        return proto+"//api.twitter.com/1/"+s.username[0]+"/lists/"+s.list+"/statuses.json?per_page="+s.count+"&callback=?";
      } else if (s.query == null && s.username.length == 1) {
        return proto+'//twitter.com/status/user_timeline/'+s.username[0]+'.json?count='+s.count+'&callback=?';
      } else {
        var query = (s.query || 'from:'+s.username.join('%20OR%20from:'));
        return proto+'//search.twitter.com/search.json?&q='+query+'&rpp='+s.count+'&callback=?';
      }
    }

    return this.each(function(){
      var list = $('<ul class="tweet_list">').appendTo(this);
      var intro = '<p class="tweet_intro">'+s.intro_text+'</p>';
      var outro = '<p class="tweet_outro">'+s.outro_text+'</p>';
      var loading = $('<p class="loading">'+s.loading_text+'</p>');

      if(typeof(s.username) == "string"){
        s.username = [s.username];
      }

      if (s.loading_text) $(this).append(loading);
      $.getJSON(build_url(), function(data){
        if (s.loading_text) loading.remove();
        if (s.intro_text) list.before(intro);
        $.each((data.results || data), function(i,item){
          // auto join text based on verb tense and content
          if (s.join_text == "auto") {
            if (item.text.match(/^(@([A-Za-z0-9-_]+)) .*/i)) {
              var join_text = s.auto_join_text_reply;
            } else if (item.text.match(/(^\w+:\/\/[A-Za-z0-9-_]+\.[A-Za-z0-9-_:%&\?\/.=]+) .*/i)) {
              var join_text = s.auto_join_text_url;
            } else if (item.text.match(/^((\w+ed)|just) .*/im)) {
              var join_text = s.auto_join_text_ed;
            } else if (item.text.match(/^(\w*ing) .*/i)) {
              var join_text = s.auto_join_text_ing;
            } else {
              var join_text = s.auto_join_text_default;
            }
          } else {
            var join_text = s.join_text;
          };

          var from_user = item.from_user || item.user.screen_name;
          var profile_image_url = item.profile_image_url || item.user.profile_image_url;
          var join_template = '<span class="tweet_join"> '+join_text+' </span>';
          var join = ((s.join_text) ? join_template : ' ');
          var avatar_template = '<a class="tweet_avatar" href="http://twitter.com/'+from_user+'"><img src="'+profile_image_url+'" height="'+s.avatar_size+'" width="'+s.avatar_size+'" alt="'+from_user+'\'s avatar" title="'+from_user+'\'s avatar" border="0"/></a>';
          var avatar = (s.avatar_size ? avatar_template : '');
          var date = '<a href="http://twitter.com/'+from_user+'/statuses/'+item.id+'" title="view tweet on twitter">'+relative_time(item.created_at)+'</a>';
          var text = '<span class="tweet_text">' +$([item.text]).linkUrl().linkUser().linkHash().makeHeart().capAwesome().capEpic()[0]+ '</span>';

          // until we create a template option, arrange the items below to alter a tweet's display.
          list.append('<li>' + avatar + date + join + text + '</li>');

          list.children('li:first').addClass('tweet_first');
          list.children('li:odd').addClass('tweet_even');
          list.children('li:even').addClass('tweet_odd');
        });
        if (s.outro_text) list.after(outro);
      });

    });
  };
})(jQuery);
245
doc/source/static/nature.css
Normal file
@ -0,0 +1,245 @@
/*
 * nature.css_t
 * ~~~~~~~~~~~~
 *
 * Sphinx stylesheet -- nature theme.
 *
 * :copyright: Copyright 2007-2011 by the Sphinx team, see AUTHORS.
 * :license: BSD, see LICENSE for details.
 *
 */

@import url("basic.css");

/* -- page layout ----------------------------------------------------------- */

body {
    font-family: Arial, sans-serif;
    font-size: 100%;
    background-color: #111;
    color: #555;
    margin: 0;
    padding: 0;
}

div.documentwrapper {
    float: left;
    width: 100%;
}

div.bodywrapper {
    margin: 0 0 0 {{ theme_sidebarwidth|toint }}px;
}

hr {
    border: 1px solid #B1B4B6;
}

div.document {
    background-color: #eee;
}

div.body {
    background-color: #ffffff;
    color: #3E4349;
    padding: 0 30px 30px 30px;
    font-size: 0.9em;
}

div.footer {
    color: #555;
    width: 100%;
    padding: 13px 0;
    text-align: center;
    font-size: 75%;
}

div.footer a {
    color: #444;
    text-decoration: underline;
}

div.related {
    background-color: #6BA81E;
    line-height: 32px;
    color: #fff;
    text-shadow: 0px 1px 0 #444;
    font-size: 0.9em;
}

div.related a {
    color: #E2F3CC;
}

div.sphinxsidebar {
    font-size: 0.75em;
    line-height: 1.5em;
}

div.sphinxsidebarwrapper{
    padding: 20px 0;
}

div.sphinxsidebar h3,
div.sphinxsidebar h4 {
    font-family: Arial, sans-serif;
    color: #222;
    font-size: 1.2em;
    font-weight: normal;
    margin: 0;
    padding: 5px 10px;
    background-color: #ddd;
    text-shadow: 1px 1px 0 white
}

div.sphinxsidebar h4{
    font-size: 1.1em;
}

div.sphinxsidebar h3 a {
    color: #444;
}

div.sphinxsidebar p {
    color: #888;
    padding: 5px 20px;
}

div.sphinxsidebar p.topless {
}

div.sphinxsidebar ul {
    margin: 10px 20px;
    padding: 0;
    color: #000;
}

div.sphinxsidebar a {
    color: #444;
}

div.sphinxsidebar input {
    border: 1px solid #ccc;
    font-family: sans-serif;
    font-size: 1em;
}

div.sphinxsidebar input[type=text]{
    margin-left: 20px;
}

/* -- body styles ----------------------------------------------------------- */

a {
    color: #005B81;
    text-decoration: none;
}

a:hover {
    color: #E32E00;
    text-decoration: underline;
}

div.body h1,
div.body h2,
div.body h3,
div.body h4,
div.body h5,
div.body h6 {
    font-family: Arial, sans-serif;
    background-color: #BED4EB;
    font-weight: normal;
    color: #212224;
    margin: 30px 0px 10px 0px;
    padding: 5px 0 5px 10px;
    text-shadow: 0px 1px 0 white
}

div.body h1 { border-top: 20px solid white; margin-top: 0; font-size: 200%; }
div.body h2 { font-size: 150%; background-color: #C8D5E3; }
div.body h3 { font-size: 120%; background-color: #D8DEE3; }
div.body h4 { font-size: 110%; background-color: #D8DEE3; }
div.body h5 { font-size: 100%; background-color: #D8DEE3; }
div.body h6 { font-size: 100%; background-color: #D8DEE3; }

a.headerlink {
    color: #c60f0f;
    font-size: 0.8em;
    padding: 0 4px 0 4px;
    text-decoration: none;
}

a.headerlink:hover {
    background-color: #c60f0f;
    color: white;
}

div.body p, div.body dd, div.body li {
    line-height: 1.5em;
}

div.admonition p.admonition-title + p {
    display: inline;
}

div.highlight{
    background-color: white;
}

div.note {
    background-color: #eee;
    border: 1px solid #ccc;
}

div.seealso {
    background-color: #ffc;
    border: 1px solid #ff6;
}

div.topic {
    background-color: #eee;
}

div.warning {
    background-color: #ffe4e4;
    border: 1px solid #f66;
}

p.admonition-title {
    display: inline;
}

p.admonition-title:after {
    content: ":";
}

pre {
    padding: 10px;
    background-color: White;
    color: #222;
    line-height: 1.2em;
    border: 1px solid #C6C9CB;
    font-size: 1.1em;
    margin: 1.5em 0 1.5em 0;
    -webkit-box-shadow: 1px 1px 1px #d8d8d8;
    -moz-box-shadow: 1px 1px 1px #d8d8d8;
}

tt {
    background-color: #ecf0f3;
    color: #222;
    /* padding: 1px 2px; */
    font-size: 1.1em;
    font-family: monospace;
}

.viewcode-back {
    font-family: Arial, sans-serif;
}

div.viewcode-block:target {
    background-color: #f4debf;
    border-top: 1px solid #ac9;
    border-bottom: 1px solid #ac9;
}
BIN
doc/source/static/openstack_logo.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 3.6 KiB |
94
doc/source/static/tweaks.css
Normal file
@ -0,0 +1,94 @@
body {
    background: #fff url(../_static/header_bg.jpg) top left no-repeat;
}

#header {
    width: 950px;
    margin: 0 auto;
    height: 102px;
}

#header h1#logo {
    background: url(../_static/openstack_logo.png) top left no-repeat;
    display: block;
    float: left;
    text-indent: -9999px;
    width: 175px;
    height: 55px;
}

#navigation {
    background: url(../_static/header-line.gif) repeat-x 0 bottom;
    display: block;
    float: left;
    margin: 27px 0 0 25px;
    padding: 0;
}

#navigation li{
    float: left;
    display: block;
    margin-right: 25px;
}

#navigation li a {
    display: block;
    font-weight: normal;
    text-decoration: none;
    background-position: 50% 0;
    padding: 20px 0 5px;
    color: #353535;
    font-size: 14px;
}

#navigation li a.current, #navigation li a.section {
    border-bottom: 3px solid #cf2f19;
    color: #cf2f19;
}

div.related {
    background-color: #cde2f8;
    border: 1px solid #b0d3f8;
}

div.related a {
    color: #4078ba;
    text-shadow: none;
}

div.sphinxsidebarwrapper {
    padding-top: 0;
}

pre {
    color: #555;
}

div.documentwrapper h1, div.documentwrapper h2, div.documentwrapper h3, div.documentwrapper h4, div.documentwrapper h5, div.documentwrapper h6 {
    font-family: 'PT Sans', sans-serif !important;
    color: #264D69;
    border-bottom: 1px dotted #C5E2EA;
    padding: 0;
    background: none;
    padding-bottom: 5px;
}

div.documentwrapper h3 {
    color: #CF2F19;
}

a.headerlink {
    color: #fff !important;
    margin-left: 5px;
    background: #CF2F19 !important;
}

div.body {
    margin-top: -25px;
    margin-left: 230px;
}

div.document {
    width: 960px;
    margin: 0 auto;
}
@ -1,42 +0,0 @@
#!/bin/sh

# Copyright 2011 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

PRE_COMMIT_SCRIPT=.git/hooks/pre-commit

make_hook() {
    echo "exec ./run_tests.sh -N -p" >> $PRE_COMMIT_SCRIPT
    chmod +x $PRE_COMMIT_SCRIPT

    if [ -w $PRE_COMMIT_SCRIPT -a -x $PRE_COMMIT_SCRIPT ]; then
        echo "pre-commit hook was created successfully"
    else
        echo "unable to create pre-commit hook"
    fi
}

# NOTE(jk0): Make sure we are in nova's root directory before adding the hook.
if [ ! -d ".git" ]; then
    echo "unable to find .git; moving up a directory"
    cd ..
    if [ -d ".git" ]; then
        make_hook
    else
        echo "still unable to find .git; hook not created"
    fi
else
    make_hook
fi
@ -1,403 +0,0 @@
|
|||||||
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright (c) 2011 Citrix Systems, Inc.
# Copyright 2011 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Guest tools for ESX to set up networking in the guest.

On Windows we require pyWin32 to be installed with Python.
"""

import array
import gettext
import logging
import os
import platform
import socket
import struct
import subprocess
import sys
import time

gettext.install('nova', unicode=1)

PLATFORM_WIN = 'win32'
PLATFORM_LINUX = 'linux2'
ARCH_32_BIT = '32bit'
ARCH_64_BIT = '64bit'
NO_MACHINE_ID = 'No machine id'

# Logging
FORMAT = "%(asctime)s - %(levelname)s - %(message)s"
if sys.platform == PLATFORM_WIN:
    LOG_DIR = os.path.join(os.environ.get('ALLUSERSPROFILE'), 'openstack')
elif sys.platform == PLATFORM_LINUX:
    LOG_DIR = '/var/log/openstack'
else:
    LOG_DIR = 'logs'
if not os.path.exists(LOG_DIR):
    os.mkdir(LOG_DIR)
LOG_FILENAME = os.path.join(LOG_DIR, 'openstack-guest-tools.log')
logging.basicConfig(filename=LOG_FILENAME, format=FORMAT)

if sys.hexversion < 0x3000000:
    _byte = ord  # 2.x chr to integer
else:
    _byte = int  # 3.x byte to integer


class ProcessExecutionError(Exception):
    """Process Execution Error Class."""

    def __init__(self, exit_code, stdout, stderr, cmd):
        self.exit_code = exit_code
        self.stdout = stdout
        self.stderr = stderr
        self.cmd = cmd

    def __str__(self):
        return str(self.exit_code)


def _bytes2int(bytes):
    """Convert a byte sequence to an integer."""
    intgr = 0
    for byt in bytes:
        intgr = (intgr << 8) + _byte(byt)
    return intgr
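`_bytes2int` folds a byte sequence into a big-endian unsigned integer (the `_byte` shim handles Python 2 strings vs. Python 3 bytes). A minimal Python 3 sketch of the same accumulation:

```python
def bytes2int(data):
    """Fold bytes into one big-endian integer, as _bytes2int does."""
    value = 0
    for b in data:  # iterating over bytes in Python 3 yields ints
        value = (value << 8) + b
    return value

# A MAC address arrives as 6 raw bytes; '%012x' later formats the int.
assert bytes2int(b'\x00\x01') == 1
assert bytes2int(b'\xff\xff') == 65535
```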


def _parse_network_details(machine_id):
    """
    Parse the machine_id to get MAC, IP, Netmask and Gateway fields per NIC.

    machine_id is of the form ('NIC_record#NIC_record#', '').
    Each NIC has a record NIC_record of the form
    'MAC;IP;Netmask;Gateway;Broadcast;DNS' where ';' is the field separator.
    Records are separated from each other by '#'.
    """
    logging.debug(_("Received machine_id from vmtools : %s") % machine_id[0])
    network_details = []
    if machine_id[1].strip() == "1":
        pass
    else:
        for machine_id_str in machine_id[0].split('#'):
            network_info_list = machine_id_str.split(';')
            if len(network_info_list) % 6 != 0:
                break
            no_grps = len(network_info_list) / 6
            i = 0
            while i < no_grps:
                k = i * 6
                network_details.append((
                    network_info_list[k].strip().lower(),
                    network_info_list[k + 1].strip(),
                    network_info_list[k + 2].strip(),
                    network_info_list[k + 3].strip(),
                    network_info_list[k + 4].strip(),
                    network_info_list[k + 5].strip().split(',')))
                i += 1
    logging.debug(_("NIC information from vmtools : %s") % network_details)
    return network_details
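To make the record format above concrete, here is a standalone Python 3 sketch of the same split logic; the sample MAC, addresses, and DNS servers are invented for illustration:

```python
def parse_network_details(machine_id):
    """Split 'MAC;IP;Netmask;Gateway;Broadcast;DNS#...' NIC records."""
    details = []
    for record in machine_id[0].split('#'):
        fields = record.split(';')
        if len(fields) % 6 != 0:  # malformed record: stop, as the original does
            break
        for k in range(0, len(fields), 6):
            mac, ip, mask, gw, bcast, dns = (f.strip() for f in fields[k:k + 6])
            details.append((mac.lower(), ip, mask, gw, bcast, dns.split(',')))
    return details

# One NIC record with two DNS servers; the trailing '#' leaves an empty
# record that terminates the loop.
sample = ('AA:BB:CC:DD:EE:FF;10.0.0.2;255.255.255.0;10.0.0.1;'
          '10.0.0.255;8.8.8.8,8.8.4.4#', '')
parsed = parse_network_details(sample)
```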


def _get_windows_network_adapters():
    """Get the list of Windows network adapters."""
    import win32com.client
    wbem_locator = win32com.client.Dispatch('WbemScripting.SWbemLocator')
    wbem_service = wbem_locator.ConnectServer('.', r'root\cimv2')
    wbem_network_adapters = wbem_service.InstancesOf('Win32_NetworkAdapter')
    network_adapters = []
    for adapter in wbem_network_adapters:
        if (adapter.NetConnectionStatus == 2 or
                adapter.NetConnectionStatus == 7):
            adapter_name = adapter.NetConnectionID
            mac_address = adapter.MacAddress.lower()
            config = adapter.associators_(
                'Win32_NetworkAdapterSetting',
                'Win32_NetworkAdapterConfiguration')[0]
            ip_address = ''
            subnet_mask = ''
            if config.IPEnabled:
                ip_address = config.IPAddress[0]
                subnet_mask = config.IPSubnet[0]
                #config.DefaultIPGateway[0]
            network_adapters.append({'name': adapter_name,
                                     'mac-address': mac_address,
                                     'ip-address': ip_address,
                                     'subnet-mask': subnet_mask})
    return network_adapters


def _get_linux_network_adapters():
    """Get the list of Linux network adapters."""
    import fcntl
    max_bytes = 8096
    arch = platform.architecture()[0]
    if arch == ARCH_32_BIT:
        offset1 = 32
        offset2 = 32
    elif arch == ARCH_64_BIT:
        offset1 = 16
        offset2 = 40
    else:
        raise OSError(_("Unknown architecture: %s") % arch)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    names = array.array('B', '\0' * max_bytes)
    outbytes = struct.unpack('iL', fcntl.ioctl(
        sock.fileno(),
        0x8912,  # SIOCGIFCONF
        struct.pack('iL', max_bytes, names.buffer_info()[0])))[0]
    adapter_names = [names.tostring()[n_cnt:n_cnt + offset1].split('\0', 1)[0]
                     for n_cnt in xrange(0, outbytes, offset2)]
    network_adapters = []
    for adapter_name in adapter_names:
        ip_address = socket.inet_ntoa(fcntl.ioctl(
            sock.fileno(),
            0x8915,  # SIOCGIFADDR
            struct.pack('256s', adapter_name))[20:24])
        subnet_mask = socket.inet_ntoa(fcntl.ioctl(
            sock.fileno(),
            0x891b,  # SIOCGIFNETMASK
            struct.pack('256s', adapter_name))[20:24])
        raw_mac_address = '%012x' % _bytes2int(fcntl.ioctl(
            sock.fileno(),
            0x8927,  # SIOCGIFHWADDR
            struct.pack('256s', adapter_name))[18:24])
        mac_address = ":".join(
            [raw_mac_address[m_counter:m_counter + 2]
             for m_counter in range(0, len(raw_mac_address), 2)]).lower()
        network_adapters.append({'name': adapter_name,
                                 'mac-address': mac_address,
                                 'ip-address': ip_address,
                                 'subnet-mask': subnet_mask})
    return network_adapters


def _get_adapter_name_and_ip_address(network_adapters, mac_address):
    """Get the adapter name and IP address based on the MAC address."""
    adapter_name = None
    ip_address = None
    for network_adapter in network_adapters:
        if network_adapter['mac-address'] == mac_address.lower():
            adapter_name = network_adapter['name']
            ip_address = network_adapter['ip-address']
            break
    return adapter_name, ip_address


def _get_win_adapter_name_and_ip_address(mac_address):
    """Get the Windows network adapter name and IP address."""
    network_adapters = _get_windows_network_adapters()
    return _get_adapter_name_and_ip_address(network_adapters, mac_address)


def _get_linux_adapter_name_and_ip_address(mac_address):
    """Get the Linux network adapter name and IP address."""
    network_adapters = _get_linux_network_adapters()
    return _get_adapter_name_and_ip_address(network_adapters, mac_address)


def _execute(cmd_list, process_input=None, check_exit_code=True):
    """Execute the command with the list of arguments specified."""
    cmd = ' '.join(cmd_list)
    logging.debug(_("Executing command: '%s'") % cmd)
    env = os.environ.copy()
    obj = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                           env=env)
    result = None
    if process_input is not None:
        result = obj.communicate(process_input)
    else:
        result = obj.communicate()
    obj.stdin.close()
    if obj.returncode:
        logging.debug(_("Result was %s") % obj.returncode)
        if check_exit_code and obj.returncode != 0:
            (stdout, stderr) = result
            raise ProcessExecutionError(exit_code=obj.returncode,
                                        stdout=stdout,
                                        stderr=stderr,
                                        cmd=cmd)
    time.sleep(0.1)
    return result


def _windows_set_networking():
    """Set IP address for the Windows VM."""
    program_files = os.environ.get('PROGRAMFILES')
    program_files_x86 = os.environ.get('PROGRAMFILES(X86)')
    vmware_tools_bin = None
    if os.path.exists(os.path.join(program_files, 'VMware', 'VMware Tools',
                                   'vmtoolsd.exe')):
        vmware_tools_bin = os.path.join(program_files, 'VMware',
                                        'VMware Tools', 'vmtoolsd.exe')
    elif os.path.exists(os.path.join(program_files, 'VMware', 'VMware Tools',
                                     'VMwareService.exe')):
        vmware_tools_bin = os.path.join(program_files, 'VMware',
                                        'VMware Tools', 'VMwareService.exe')
    elif program_files_x86 and os.path.exists(os.path.join(
            program_files_x86, 'VMware', 'VMware Tools',
            'VMwareService.exe')):
        vmware_tools_bin = os.path.join(program_files_x86, 'VMware',
                                        'VMware Tools', 'VMwareService.exe')
    if vmware_tools_bin:
        cmd = ['"' + vmware_tools_bin + '"', '--cmd', 'machine.id.get']
        for network_detail in _parse_network_details(
                _execute(cmd, check_exit_code=False)):
            (mac_address, ip_address, subnet_mask, gateway, broadcast,
             dns_servers) = network_detail
            name_and_ip = _get_win_adapter_name_and_ip_address(mac_address)
            adapter_name, current_ip_address = name_and_ip
            if adapter_name and not ip_address == current_ip_address:
                cmd = ['netsh', 'interface', 'ip', 'set', 'address',
                       'name="%s"' % adapter_name, 'source=static',
                       ip_address, subnet_mask, gateway, '1']
                _execute(cmd)
                # Windows doesn't let you manually set the broadcast address.
                for dns_server in dns_servers:
                    if dns_server:
                        cmd = ['netsh', 'interface', 'ip', 'add', 'dns',
                               'name="%s"' % adapter_name, dns_server]
                        _execute(cmd)
    else:
        logging.warn(_("VMware Tools is not installed"))


def _filter_duplicates(all_entries):
    final_list = []
    for entry in all_entries:
        if entry and entry not in final_list:
            final_list.append(entry)
    return final_list


def _set_rhel_networking(network_details=None):
    """Set IPv4 network settings for RHEL distros."""
    network_details = network_details or []
    all_dns_servers = []
    for network_detail in network_details:
        (mac_address, ip_address, subnet_mask, gateway, broadcast,
         dns_servers) = network_detail
        all_dns_servers.extend(dns_servers)
        name_and_ip = _get_linux_adapter_name_and_ip_address(mac_address)
        adapter_name, current_ip_address = name_and_ip
        if adapter_name and not ip_address == current_ip_address:
            interface_file_name = (
                '/etc/sysconfig/network-scripts/ifcfg-%s' % adapter_name)
            # Remove and re-create the interface file.
            os.remove(interface_file_name)
            _execute(['touch', interface_file_name])
            interface_file = open(interface_file_name, 'w')
            interface_file.write('\nDEVICE=%s' % adapter_name)
            interface_file.write('\nUSERCTL=yes')
            interface_file.write('\nONBOOT=yes')
            interface_file.write('\nBOOTPROTO=static')
            interface_file.write('\nBROADCAST=%s' % broadcast)
            interface_file.write('\nNETWORK=')
            interface_file.write('\nGATEWAY=%s' % gateway)
            interface_file.write('\nNETMASK=%s' % subnet_mask)
            interface_file.write('\nIPADDR=%s' % ip_address)
            interface_file.write('\nMACADDR=%s' % mac_address)
            interface_file.close()
    if all_dns_servers:
        dns_file_name = "/etc/resolv.conf"
        os.remove(dns_file_name)
        _execute(['touch', dns_file_name])
        dns_file = open(dns_file_name, 'w')
        dns_file.write("; generated by OpenStack guest tools")
        unique_entries = _filter_duplicates(all_dns_servers)
        for dns_server in unique_entries:
            dns_file.write("\nnameserver %s" % dns_server)
        dns_file.close()
    _execute(['/sbin/service', 'network', 'restart'])


def _set_ubuntu_networking(network_details=None):
    """Set IPv4 network settings for Ubuntu."""
    network_details = network_details or []
    all_dns_servers = []
    interface_file_name = '/etc/network/interfaces'
    # Remove and re-create the interfaces file.
    os.remove(interface_file_name)
    _execute(['touch', interface_file_name])
    interface_file = open(interface_file_name, 'w')
    for device, network_detail in enumerate(network_details):
        (mac_address, ip_address, subnet_mask, gateway, broadcast,
         dns_servers) = network_detail
        all_dns_servers.extend(dns_servers)
        name_and_ip = _get_linux_adapter_name_and_ip_address(mac_address)
        adapter_name, current_ip_address = name_and_ip

        if adapter_name:
            interface_file.write('\nauto %s' % adapter_name)
            interface_file.write('\niface %s inet static' % adapter_name)
            interface_file.write('\nbroadcast %s' % broadcast)
            interface_file.write('\ngateway %s' % gateway)
            interface_file.write('\nnetmask %s' % subnet_mask)
            interface_file.write('\naddress %s\n' % ip_address)
        logging.debug(_("Successfully configured NIC %(device)d with "
                        "NIC info %(detail)s"), {'device': device,
                                                 'detail': network_detail})
    interface_file.close()

    if all_dns_servers:
        dns_file_name = "/etc/resolv.conf"
        os.remove(dns_file_name)
        _execute(['touch', dns_file_name])
        dns_file = open(dns_file_name, 'w')
        dns_file.write("; generated by OpenStack guest tools")
        unique_entries = _filter_duplicates(all_dns_servers)
        for dns_server in unique_entries:
            dns_file.write("\nnameserver %s" % dns_server)
        dns_file.close()

    logging.debug(_("Restarting networking....\n"))
    _execute(['/etc/init.d/networking', 'restart'])


def _linux_set_networking():
    """Set IP address for the Linux VM."""
    vmware_tools_bin = None
    if os.path.exists('/usr/sbin/vmtoolsd'):
        vmware_tools_bin = '/usr/sbin/vmtoolsd'
    elif os.path.exists('/usr/bin/vmtoolsd'):
        vmware_tools_bin = '/usr/bin/vmtoolsd'
    elif os.path.exists('/usr/sbin/vmware-guestd'):
        vmware_tools_bin = '/usr/sbin/vmware-guestd'
    elif os.path.exists('/usr/bin/vmware-guestd'):
        vmware_tools_bin = '/usr/bin/vmware-guestd'
    if vmware_tools_bin:
        cmd = [vmware_tools_bin, '--cmd', 'machine.id.get']
        network_details = _parse_network_details(
            _execute(cmd, check_exit_code=False))
        # TODO(sateesh): Support other distros: SUSE, Debian, BSD, etc.
        if platform.dist()[0] == 'Ubuntu':
            _set_ubuntu_networking(network_details)
        elif platform.dist()[0] == 'redhat':
            _set_rhel_networking(network_details)
        else:
            logging.warn(_("Distro '%s' not supported") % platform.dist()[0])
    else:
        logging.warn(_("VMware Tools is not installed"))


if __name__ == '__main__':
    pltfrm = sys.platform
    if pltfrm == PLATFORM_WIN:
        _windows_set_networking()
    elif pltfrm == PLATFORM_LINUX:
        _linux_set_networking()
    else:
        raise NotImplementedError(_("Platform not implemented: '%s'") % pltfrm)
15  flakes.py
@@ -1,15 +0,0 @@
"""
wrapper for pyflakes to ignore gettext based warning:
    "undefined name '_'"

Synced in from openstack-common
"""
import sys

import pyflakes.checker
from pyflakes.scripts import pyflakes

if __name__ == "__main__":
    orig_builtins = set(pyflakes.checker._MAGIC_GLOBALS)
    pyflakes.checker._MAGIC_GLOBALS = orig_builtins | set(['_'])
    sys.exit(pyflakes.main())
@@ -1,74 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Copyright 2010 OpenStack Foundation
# Copyright 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os
import sys

import install_venv_common as install_venv


def print_help(venv, root):
    help = """
Nova development environment setup is complete.

Nova development uses virtualenv to track and manage Python dependencies
while in development and testing.

To activate the Nova virtualenv for the extent of your current shell
session you can run:

$ source %s/bin/activate

Or, if you prefer, you can run commands in the virtualenv on a case by case
basis by running:

$ %s/tools/with_venv.sh <your command>

Also, make test will automatically use the virtualenv.
"""
    print help % (venv, root)


def main(argv):
    root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))

    if os.environ.get('tools_path'):
        root = os.environ['tools_path']
    venv = os.path.join(root, '.venv')
    if os.environ.get('venv'):
        venv = os.environ['venv']

    pip_requires = os.path.join(root, 'tools', 'pip-requires')
    test_requires = os.path.join(root, 'tools', 'test-requires')
    py_version = "python%s.%s" % (sys.version_info[0], sys.version_info[1])
    project = 'Nova'
    install = install_venv.InstallVenv(root, venv, pip_requires, test_requires,
                                       py_version, project)
    options = install.parse_args(argv)
    install.check_python_version()
    install.check_dependencies()
    install.create_virtualenv(no_site_packages=options.no_site_packages)
    install.install_dependencies()
    install.post_process()
    print_help(venv, root)

if __name__ == '__main__':
    main(sys.argv)
@@ -1,220 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 OpenStack Foundation
# Copyright 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Provides methods needed by installation script for OpenStack development
virtual environments.

Synced in from openstack-common
"""

import argparse
import os
import subprocess
import sys


class InstallVenv(object):

    def __init__(self, root, venv, pip_requires, test_requires, py_version,
                 project):
        self.root = root
        self.venv = venv
        self.pip_requires = pip_requires
        self.test_requires = test_requires
        self.py_version = py_version
        self.project = project

    def die(self, message, *args):
        print >> sys.stderr, message % args
        sys.exit(1)

    def check_python_version(self):
        if sys.version_info < (2, 6):
            self.die("Need Python Version >= 2.6")

    def run_command_with_code(self, cmd, redirect_output=True,
                              check_exit_code=True):
        """Runs a command in an out-of-process shell.

        Returns the output of that command. Working directory is self.root.
        """
        if redirect_output:
            stdout = subprocess.PIPE
        else:
            stdout = None

        proc = subprocess.Popen(cmd, cwd=self.root, stdout=stdout)
        output = proc.communicate()[0]
        if check_exit_code and proc.returncode != 0:
            self.die('Command "%s" failed.\n%s', ' '.join(cmd), output)
        return (output, proc.returncode)

    def run_command(self, cmd, redirect_output=True, check_exit_code=True):
        return self.run_command_with_code(cmd, redirect_output,
                                          check_exit_code)[0]

    def get_distro(self):
        if (os.path.exists('/etc/fedora-release') or
                os.path.exists('/etc/redhat-release')):
            return Fedora(self.root, self.venv, self.pip_requires,
                          self.test_requires, self.py_version, self.project)
        else:
            return Distro(self.root, self.venv, self.pip_requires,
                          self.test_requires, self.py_version, self.project)

    def check_dependencies(self):
        self.get_distro().install_virtualenv()

    def create_virtualenv(self, no_site_packages=True):
        """Creates the virtual environment and installs PIP.

        Creates the virtual environment and installs PIP only into the
        virtual environment.
        """
        if not os.path.isdir(self.venv):
            print 'Creating venv...',
            if no_site_packages:
                self.run_command(['virtualenv', '-q', '--no-site-packages',
                                  self.venv])
            else:
                self.run_command(['virtualenv', '-q', self.venv])
            print 'done.'
            print 'Installing pip in venv...',
            if not self.run_command(['tools/with_venv.sh', 'easy_install',
                                     'pip>1.0']).strip():
                self.die("Failed to install pip.")
            print 'done.'
        else:
            print "venv already exists..."
            pass

    def pip_install(self, *args):
        self.run_command(['tools/with_venv.sh',
                          'pip', 'install', '--upgrade'] + list(args),
                         redirect_output=False)

    def install_dependencies(self):
        print 'Installing dependencies with pip (this can take a while)...'

        # First things first, make sure our venv has the latest pip and
        # distribute.
        # NOTE: we keep pip at version 1.1 since the most recent version
        # causes the .venv creation to fail. See:
        # https://bugs.launchpad.net/nova/+bug/1047120
        self.pip_install('pip==1.1')
        self.pip_install('distribute')

        # Install greenlet by hand - just listing it in the requires file
        # does not get it installed in the right order.
        self.pip_install('greenlet')

        self.pip_install('-r', self.pip_requires)
        self.pip_install('-r', self.test_requires)

    def post_process(self):
        self.get_distro().post_process()

    def parse_args(self, argv):
        """Parses command-line arguments."""
        parser = argparse.ArgumentParser()
        parser.add_argument('-n', '--no-site-packages',
                            action='store_true',
                            help="Do not inherit packages from global Python "
                                 "install")
        return parser.parse_args(argv[1:])


class Distro(InstallVenv):

    def check_cmd(self, cmd):
        return bool(self.run_command(['which', cmd],
                                     check_exit_code=False).strip())

    def install_virtualenv(self):
        if self.check_cmd('virtualenv'):
            return

        if self.check_cmd('easy_install'):
            print 'Installing virtualenv via easy_install...',
            if self.run_command(['easy_install', 'virtualenv']):
                print 'Succeeded'
                return
            else:
                print 'Failed'

        self.die('ERROR: virtualenv not found.\n\n%s development'
                 ' requires virtualenv, please install it using your'
                 ' favorite package management tool' % self.project)

    def post_process(self):
        """Any distribution-specific post-processing gets done here.

        In particular, this is useful for applying patches to code inside
        the venv.
        """
        pass


class Fedora(Distro):
    """This covers all Fedora-based distributions.

    Includes: Fedora, RHEL, CentOS, Scientific Linux
    """

    def check_pkg(self, pkg):
        return self.run_command_with_code(['rpm', '-q', pkg],
                                          check_exit_code=False)[1] == 0

    def yum_install(self, pkg, **kwargs):
        print "Attempting to install '%s' via yum" % pkg
        self.run_command(['sudo', 'yum', 'install', '-y', pkg], **kwargs)

    def apply_patch(self, originalfile, patchfile):
        self.run_command(['patch', '-N', originalfile, patchfile],
                         check_exit_code=False)

    def install_virtualenv(self):
        if self.check_cmd('virtualenv'):
            return

        if not self.check_pkg('python-virtualenv'):
            self.yum_install('python-virtualenv', check_exit_code=False)

        super(Fedora, self).install_virtualenv()

    def post_process(self):
        """Workaround for a bug in eventlet.

        This currently affects RHEL6.1, but the fix can safely be
        applied to all RHEL and Fedora distributions.

        This can be removed when the fix is applied upstream.

        Nova: https://bugs.launchpad.net/nova/+bug/884915
        Upstream: https://bitbucket.org/which_linden/eventlet/issue/89
        """

        # Install "patch" program if it's not there
        if not self.check_pkg('patch'):
            self.yum_install('patch')

        # Apply the eventlet patch
        self.apply_patch(os.path.join(self.venv, 'lib', self.py_version,
                                      'site-packages',
                                      'eventlet/green/subprocess.py'),
                         'contrib/redhat-eventlet.patch')
199  lintstack.py
@@ -1,199 +0,0 @@
#!/usr/bin/env python
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright (c) 2012, AT&T Labs, Yun Mao <yunmao@gmail.com>
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""pylint error checking."""

import cStringIO as StringIO
import json
import re
import sys

from pylint import lint
from pylint.reporters import text

# Note(maoy): E1103 is error code related to partial type inference
ignore_codes = ["E1103"]
# Note(maoy): the error message is the pattern of E0202. It should be ignored
# for nova.tests modules
ignore_messages = ["An attribute affected in nova.tests"]
# Note(maoy): we ignore all errors in openstack.common because it should be
# checked elsewhere. We also ignore nova.tests for now due to high false
# positive rate.
ignore_modules = ["nova/openstack/common/", "nova/tests/"]

KNOWN_PYLINT_EXCEPTIONS_FILE = "tools/pylint_exceptions"


class LintOutput(object):

    _cached_filename = None
    _cached_content = None

    def __init__(self, filename, lineno, line_content, code, message,
                 lintoutput):
        self.filename = filename
        self.lineno = lineno
        self.line_content = line_content
        self.code = code
        self.message = message
        self.lintoutput = lintoutput

    @classmethod
    def from_line(cls, line):
        m = re.search(r"(\S+):(\d+): \[(\S+)(, \S+)?] (.*)", line)
        matched = m.groups()
        filename, lineno, code, message = (matched[0], int(matched[1]),
                                           matched[2], matched[-1])
        if cls._cached_filename != filename:
            with open(filename) as f:
                cls._cached_content = list(f.readlines())
                cls._cached_filename = filename
        line_content = cls._cached_content[lineno - 1].rstrip()
        return cls(filename, lineno, line_content, code, message,
                   line.rstrip())
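`from_line` expects pylint's parseable output format, `path:line: [CODE, symbol] message`. A quick sketch of the same regex against a sample line (the file path, code, and message here are invented for illustration):

```python
import re

# Same pattern as LintOutput.from_line uses on pylint parseable output.
pattern = r"(\S+):(\d+): \[(\S+)(, \S+)?] (.*)"
line = "nova/compute/manager.py:42: [E1101, Foo.bar] Instance has no member"

m = re.search(pattern, line)
filename, lineno, code = m.group(1), int(m.group(2)), m.group(3)
message = m.groups()[-1]  # last group is the free-text message
```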

    @classmethod
    def from_msg_to_dict(cls, msg):
        """From the output of pylint msg, to a dict, where each key
        is a unique error identifier, value is a list of LintOutput
        """
        result = {}
        for line in msg.splitlines():
            obj = cls.from_line(line)
            if obj.is_ignored():
                continue
            key = obj.key()
            if key not in result:
                result[key] = []
            result[key].append(obj)
        return result

    def is_ignored(self):
        if self.code in ignore_codes:
            return True
        if any(self.filename.startswith(name) for name in ignore_modules):
            return True
        if any(msg in self.message for msg in ignore_messages):
            return True
        return False

    def key(self):
        if self.code in ["E1101", "E1103"]:
            # These two types of errors are like "Foo class has no member
            # bar". We discard the source code so that the error will be
            # ignored next time another Foo.bar is encountered.
            return self.message, ""
        return self.message, self.line_content.strip()

    def json(self):
        return json.dumps(self.__dict__)

    def review_str(self):
        return ("File %(filename)s\nLine %(lineno)d:%(line_content)s\n"
                "%(code)s: %(message)s" % self.__dict__)


class ErrorKeys(object):

    @classmethod
    def print_json(cls, errors, output=sys.stdout):
        print >>output, "# automatically generated by tools/lintstack.py"
        for i in sorted(errors.keys()):
            print >>output, json.dumps(i)

    @classmethod
    def from_file(cls, filename):
        keys = set()
        for line in open(filename):
            if line and line[0] != "#":
                d = json.loads(line)
                keys.add(tuple(d))
        return keys


def run_pylint():
    buff = StringIO.StringIO()
    reporter = text.ParseableTextReporter(output=buff)
    args = ["--include-ids=y", "-E", "nova"]
    lint.Run(args, reporter=reporter, exit=False)
    val = buff.getvalue()
    buff.close()
    return val


def generate_error_keys(msg=None):
    print "Generating", KNOWN_PYLINT_EXCEPTIONS_FILE
    if msg is None:
        msg = run_pylint()
    errors = LintOutput.from_msg_to_dict(msg)
    with open(KNOWN_PYLINT_EXCEPTIONS_FILE, "w") as f:
        ErrorKeys.print_json(errors, output=f)
|
|
||||||
|
|
||||||
|
|
||||||
def validate(newmsg=None):
|
|
||||||
print "Loading", KNOWN_PYLINT_EXCEPTIONS_FILE
|
|
||||||
known = ErrorKeys.from_file(KNOWN_PYLINT_EXCEPTIONS_FILE)
|
|
||||||
if newmsg is None:
|
|
||||||
print "Running pylint. Be patient..."
|
|
||||||
newmsg = run_pylint()
|
|
||||||
errors = LintOutput.from_msg_to_dict(newmsg)
|
|
||||||
|
|
||||||
print "Unique errors reported by pylint: was %d, now %d." \
|
|
||||||
% (len(known), len(errors))
|
|
||||||
passed = True
|
|
||||||
for err_key, err_list in errors.items():
|
|
||||||
for err in err_list:
|
|
||||||
if err_key not in known:
|
|
||||||
print err.lintoutput
|
|
||||||
print
|
|
||||||
passed = False
|
|
||||||
if passed:
|
|
||||||
print "Congrats! pylint check passed."
|
|
||||||
redundant = known - set(errors.keys())
|
|
||||||
if redundant:
|
|
||||||
print "Extra credit: some known pylint exceptions disappeared."
|
|
||||||
for i in sorted(redundant):
|
|
||||||
print json.dumps(i)
|
|
||||||
print "Consider regenerating the exception file if you will."
|
|
||||||
else:
|
|
||||||
print ("Please fix the errors above. If you believe they are false"
|
|
||||||
" positives, run 'tools/lintstack.py generate' to overwrite.")
|
|
||||||
sys.exit(1)
|
|
||||||
|
|
||||||
|
|
||||||
def usage():
|
|
||||||
print """Usage: tools/lintstack.py [generate|validate]
|
|
||||||
To generate pylint_exceptions file: tools/lintstack.py generate
|
|
||||||
To validate the current commit: tools/lintstack.py
|
|
||||||
"""
|
|
||||||
|
|
||||||
|
|
||||||
def main():
|
|
||||||
option = "validate"
|
|
||||||
if len(sys.argv) > 1:
|
|
||||||
option = sys.argv[1]
|
|
||||||
if option == "generate":
|
|
||||||
generate_error_keys()
|
|
||||||
elif option == "validate":
|
|
||||||
validate()
|
|
||||||
else:
|
|
||||||
usage()
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
main()
|
|
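The `from_line` classmethod above recovers the filename, line number, error code, and message from pylint's parseable output with a single regex. A minimal sketch of that parse, using the same pattern on a made-up pylint line (the file path and message here are illustrative, not from a real pylint run):

```python
import re

# The same pattern lintstack.py's from_line() uses on pylint parseable output.
PATTERN = r"(\S+):(\d+): \[(\S+)(, \S+)?] (.*)"

# Hypothetical example of a line emitted by "pylint -E" in parseable mode.
sample = ("nova/compute/api.py:42: [E1101, Foo.bar] "
          "Instance of 'Foo' has no 'baz' member")

m = re.search(PATTERN, sample)
# Group 1 is the path, group 2 the line number, group 3 the code; the
# optional group 4 (", Foo.bar") carries the scope and is discarded.
filename, lineno, code = m.group(1), int(m.group(2)), m.group(3)
message = m.groups()[-1]
print(filename, lineno, code)
print(message)
```

Note how `key()` then deduplicates E1101/E1103 by message alone, so one whitelisted "no member" complaint covers every occurrence of the same attribute access.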
59 lintstack.sh
@@ -1,59 +0,0 @@
#!/usr/bin/env bash

# Copyright (c) 2012-2013, AT&T Labs, Yun Mao <yunmao@gmail.com>
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

# Use lintstack.py to compare pylint errors.
# We run pylint twice, once on HEAD, once on the code before the latest
# commit for review.
set -e
TOOLS_DIR=$(cd $(dirname "$0") && pwd)
# Get the current branch name.
GITHEAD=`git rev-parse --abbrev-ref HEAD`
if [[ "$GITHEAD" == "HEAD" ]]; then
    # In detached head mode, get revision number instead
    GITHEAD=`git rev-parse HEAD`
    echo "Currently we are at commit $GITHEAD"
else
    echo "Currently we are at branch $GITHEAD"
fi

cp -f $TOOLS_DIR/lintstack.py $TOOLS_DIR/lintstack.head.py

if git rev-parse HEAD^2 2>/dev/null; then
    # The HEAD is a Merge commit. Here, the patch to review is
    # HEAD^2, the master branch is at HEAD^1, and the patch was
    # written based on HEAD^2~1.
    PREV_COMMIT=`git rev-parse HEAD^2~1`
    git checkout HEAD~1
    # The git merge is necessary for reviews with a series of patches.
    # If not, this is a no-op so won't hurt either.
    git merge $PREV_COMMIT
else
    # The HEAD is not a merge commit. This won't happen on gerrit.
    # Most likely you are running against your own patch locally.
    # We assume the patch to examine is HEAD, and we compare it against
    # HEAD~1
    git checkout HEAD~1
fi

# First generate tools/pylint_exceptions from HEAD~1
$TOOLS_DIR/lintstack.head.py generate
# Then use that as a reference to compare against HEAD
git checkout $GITHEAD
$TOOLS_DIR/lintstack.head.py
echo "Check passed. FYI: the pylint exceptions are:"
cat $TOOLS_DIR/pylint_exceptions
@@ -1,37 +0,0 @@
# bash completion for openstack nova-manage

_nova_manage_opts="" # lazy init
_nova_manage_opts_exp="" # lazy init

# dict hack for bash 3
_set_nova_manage_subopts () {
    eval _nova_manage_subopts_"$1"='$2'
}
_get_nova_manage_subopts () {
    eval echo '${_nova_manage_subopts_'"$1"'#_nova_manage_subopts_}'
}

_nova_manage()
{
    local cur prev subopts
    COMPREPLY=()
    cur="${COMP_WORDS[COMP_CWORD]}"
    prev="${COMP_WORDS[COMP_CWORD-1]}"

    if [ "x$_nova_manage_opts" == "x" ] ; then
        _nova_manage_opts="`nova-manage bash-completion 2>/dev/null`"
        _nova_manage_opts_exp="`echo $_nova_manage_opts | sed -e "s/\s/|/g"`"
    fi

    if [[ " `echo $_nova_manage_opts` " =~ " $prev " ]] ; then
        if [ "x$(_get_nova_manage_subopts "$prev")" == "x" ] ; then
            subopts="`nova-manage bash-completion $prev 2>/dev/null`"
            _set_nova_manage_subopts "$prev" "$subopts"
        fi
        COMPREPLY=($(compgen -W "$(_get_nova_manage_subopts "$prev")" -- ${cur}))
    elif [[ ! " ${COMP_WORDS[@]} " =~ " "($_nova_manage_opts_exp)" " ]] ; then
        COMPREPLY=($(compgen -W "${_nova_manage_opts}" -- ${cur}))
    fi
    return 0
}
complete -F _nova_manage nova-manage
@@ -1,38 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os
import sys

import install_venv_common as install_venv


def main(argv):
    root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))

    venv = os.environ['VIRTUAL_ENV']

    pip_requires = os.path.join(root, 'tools', 'pip-requires')
    test_requires = os.path.join(root, 'tools', 'test-requires')
    py_version = "python%s.%s" % (sys.version_info[0], sys.version_info[1])
    project = 'Nova'
    install = install_venv.InstallVenv(root, venv, pip_requires, test_requires,
                                       py_version, project)
    # NOTE(dprince): For Tox we only run post_process (which patches files, etc)
    install.post_process()

if __name__ == '__main__':
    main(sys.argv)
30 pip-requires
@@ -1,30 +0,0 @@
SQLAlchemy>=0.7.8,<0.7.99
Cheetah>=2.4.4
amqplib>=0.6.1
anyjson>=0.2.4
argparse
boto
eventlet>=0.9.17
kombu>=1.0.4
lxml>=2.3
routes>=1.12.3
WebOb==1.2.3
greenlet>=0.3.1
PasteDeploy>=1.5.0
paste
sqlalchemy-migrate>=0.7.2
netaddr
suds>=0.4
paramiko
pyasn1
Babel>=0.9.6
iso8601>=0.1.4
httplib2
setuptools_git>=0.4
python-cinderclient>=1.0.1
python-quantumclient>=2.2.0,<3.0.0
python-glanceclient>=0.5.0,<2
python-keystoneclient>=0.2.0
stevedore>=0.7
websockify<0.4
oslo.config>=1.1.0
@@ -1,81 +0,0 @@
#!/usr/bin/env python

"""Tool for checking if patch contains a regression test.

Pass in gerrit review number as parameter, tool will download branch and run
modified tests without bug fix.
"""

import string
import subprocess
import sys

gerrit_number = None

# TODO(jogo) use proper optParser
if len(sys.argv) == 2:
    gerrit_number = sys.argv[1]
else:
    gerrit_number = None
    print ("no gerrit review number specified, running on latest commit"
           " on current branch.")


def run(cmd, fail_ok=False):
    print "running: %s" % cmd
    try:
        rval = subprocess.check_output(cmd, shell=True)
    except subprocess.CalledProcessError:
        if not fail_ok:
            print "The command above terminated with an error."
            sys.exit(1)
        pass
    return rval


test_works = False

if gerrit_number:
    original_branch = run("git rev-parse --abbrev-ref HEAD")
    run("git review -d %s" % gerrit_number)

# run new tests with old code
run("git checkout HEAD^ nova")
run("git checkout HEAD nova/tests")

# identify which tests have changed
tests = run("git whatchanged --format=oneline -1 | grep \"nova/tests\" "
            "| cut -f2").split()
test_list = []
for test in tests:
    test_list.append(string.replace(test[0:-3], '/', '.'))

if test_list == []:
    test_works = False
    expect_failure = ""
else:
    # run new tests, expect them to fail
    expect_failure = run(("tox -epy27 %s 2>&1" % string.join(test_list)),
                         fail_ok=True)
    if "FAILED (id=" in expect_failure:
        test_works = True

# cleanup
run("git checkout HEAD nova")
if gerrit_number:
    new_branch = run("git status | head -1 | cut -d ' ' -f 4")
    run("git checkout %s" % original_branch)
    run("git branch -D %s" % new_branch)


if test_works:
    print expect_failure
    print ""
    print "*******************************"
    print "FOUND a regression test"
else:
    print expect_failure
    print ""
    print "*******************************"
    print "NO regression test"
    sys.exit(1)
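The loop above turns each changed test file path into a dotted test module name via `string.replace(test[0:-3], '/', '.')` (Python 2's `string` module). A quick modern restatement of that conversion, just to make the transformation explicit:

```python
def path_to_module(path):
    # Strip the trailing ".py" (the [0:-3] slice above) and turn path
    # separators into dots, yielding a name tox/testr can run directly.
    return path[:-3].replace('/', '.')

print(path_to_module("nova/tests/test_compute.py"))
```

The resulting names are what get joined into the `tox -epy27 ...` invocation that follows.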
3 requirements.txt Normal file
@@ -0,0 +1,3 @@
d2to1
flake8
pbr
27 run_pep8.sh
@@ -1,27 +0,0 @@
#!/bin/bash

set -e
# This is used by run_tests.sh and tox.ini
python tools/hacking.py --doctest

# Until all these issues get fixed, ignore.
PEP8='python tools/hacking.py --ignore=E12,E711,E721,E712,N303,N403,N404'

EXCLUDE='--exclude=.venv,.git,.tox,dist,doc,*openstack/common*,*lib/python*'
EXCLUDE+=',*egg,build,./plugins/xenserver/networking/etc/xensource/scripts'
EXCLUDE+=',./plugins/xenserver/xenapi/etc/xapi.d/plugins'
${PEP8} ${EXCLUDE} .

${PEP8} --filename=nova* bin

SCRIPT_ROOT=$(echo $(cd "$(dirname $0)"; pwd) | sed s/\\/tools//)

SCRIPTS_PATH=${SCRIPT_ROOT}/plugins/xenserver/networking/etc/xensource/scripts
PYTHONPATH=${SCRIPTS_PATH} ${PEP8} ./plugins/xenserver/networking

# NOTE(sirp): Also check Dom0 plugins w/o .py extension
PLUGINS_PATH=${SCRIPT_ROOT}/plugins/xenserver/xenapi/etc/xapi.d/plugins
PYTHONPATH=${PLUGINS_PATH} ${PEP8} ./plugins/xenserver/xenapi \
    `find plugins/xenserver/xenapi/etc/xapi.d/plugins -type f -perm +111`

! pyflakes nova/ | grep "imported but unused\|redefinition of function"
48 setup.cfg Normal file
@@ -0,0 +1,48 @@
[metadata]
name = hacking
author = OpenStack
author-email = openstack-dev@lists.openstack.org
summary = OpenStack Hacking Guideline Enforcement
description-file =
    README.rst
home-page = http://pypi.python.org/pypi/hacking
classifier =
    Development Status :: 4 - Beta
    Environment :: Console
    Environment :: OpenStack
    Intended Audience :: Developers
    Intended Audience :: Information Technology
    License :: OSI Approved :: Apache Software License
    Operating System :: OS Independent
    Programming Language :: Python

[files]
packages =
    hacking

[global]
setup-hooks =
    pbr.hooks.setup_hook

[entry_points]
flake8.extension =
    H101 = hacking:hacking_todo_format
    H201 = hacking:hacking_except_format
    H202 = hacking:hacking_except_format_assert
    H301 = hacking:hacking_import_rules
    H306 = hacking:hacking_import_alphabetical
    H307 = hacking:hacking_import_no_db_in_virt
    H401 = hacking:hacking_docstring_start_space
    H402 = hacking:hacking_docstring_one_line
    H403 = hacking:hacking_docstring_multiline_end
    H404 = hacking:hacking_docstring_multiline_start
    H601 = hacking:hacking_no_cr
    H700 = hacking:hacking_localization_strings
    H901 = hacking:hacking_is_not
    H902 = hacking:hacking_not_in

[egg_info]
tag_build =
tag_date = 0
tag_svn_revision = 0
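The `flake8.extension` entry points above are how hacking becomes a flake8 plugin: each H-code maps to a callable in the `hacking` module that flake8 invokes per logical line. The following is not hacking's real implementation, only a minimal sketch of the shape flake8 expects from such a check, using a simplified stand-in for the H901 "use `is not`" rule:

```python
import re


def hacking_is_not(logical_line):
    # Simplified stand-in for the H901 check registered above: flag
    # "not x is y", which should be written "x is not y". flake8 calls
    # the function once per logical line; yielding an (offset, message)
    # tuple reports a violation at that column.
    if re.search(r"\bnot\s+\S+\s+is\b", logical_line):
        yield (0, "H901: use 'is not' rather than 'not ... is'")


# Simulating what flake8 does with each logical line:
print(list(hacking_is_not("if not x is None:")))
print(list(hacking_is_not("if x is not None:")))
```

Because registration happens through entry points, installing the package is enough for flake8 to discover and run every listed check.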
21 setup.py Executable file
@@ -0,0 +1,21 @@
#!/usr/bin/env python
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import setuptools

setuptools.setup(
    setup_requires=['d2to1', 'pbr'],
    d2to1=True)
7 test-requirements.txt Normal file
@@ -0,0 +1,7 @@
coverage>=3.6
discover
fixtures>=0.3.12
python-subunit
sphinx>=1.1.2
testrepository>=0.0.13
testtools>=0.9.27
@@ -1,16 +0,0 @@
# Packages needed for dev testing
distribute>=0.6.24

coverage>=3.6
discover
feedparser
fixtures>=0.3.12
flake8
hacking
mox==0.5.3
MySQL-python
psycopg2
python-subunit
sphinx>=1.1.2
testrepository>=0.0.13
testtools>=0.9.27
31 tox.ini Normal file
@@ -0,0 +1,31 @@
[tox]
envlist = py26,py27,pep8

[testenv]
setenv = VIRTUAL_ENV={envdir}
         LANG=en_US.UTF-8
         LANGUAGE=en_US:en
         LC_ALL=C
deps = -r{toxinidir}/requirements.txt
       -r{toxinidir}/test-requirements.txt
commands =
    python setup.py testr --slowest --testr-args='{posargs}'

[tox:jenkins]
sitepackages = True
downloadcache = ~/cache/pip

[testenv:pep8]
commands = flake8

[testenv:cover]
setenv = VIRTUAL_ENV={envdir}
commands =
    python setup.py testr --coverage

[testenv:venv]
commands = {posargs}

[flake8]
exclude = .venv,.tox,dist,doc,*.egg
show-source = true
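The tox.ini above defines three gate jobs through `envlist`, with the `pep8` environment running the project's own flake8 (and therefore its own H-checks) on itself. Tox does its own parsing, but the file is plain INI; a small sketch reading a fragment of it with the stdlib `configparser`, just to illustrate the structure:

```python
import configparser

# A fragment of the tox.ini added by this commit.
TOX_INI = """\
[tox]
envlist = py26,py27,pep8

[testenv:pep8]
commands = flake8
"""

parser = configparser.ConfigParser()
parser.read_string(TOX_INI)

# The comma-separated envlist is what "tox" runs by default.
envs = parser.get("tox", "envlist").split(",")
print(envs)
print(parser.get("testenv:pep8", "commands"))
```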
@@ -1,7 +0,0 @@
#!/bin/bash
tools_path=${tools_path:-$(dirname $0)}
venv_path=${venv_path:-${tools_path}}
venv_dir=${venv_name:-/../.venv}
TOOLS=${tools_path}
VENV=${venv:-${venv_path}/${venv_dir}}
source ${VENV}/bin/activate && "$@"
@@ -1,123 +0,0 @@
#!/usr/bin/env python

# Copyright 2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Script to cleanup old XenServer /var/lock/sm locks.

XenServer 5.6 and 6.0 do not appear to always cleanup locks when using a
FileSR. ext3 has a limit of 32K inode links, so when we have 32K-2 (31998)
locks laying around, builds will begin to fail because we can't create any
additional locks. This cleanup script is something we can run periodically as
a stop-gap measure until this is fixed upstream.

This script should be run on the dom0 of the affected machine.
"""
import errno
import optparse
import os
import sys
import time

BASE = '/var/lock/sm'


def _get_age_days(secs):
    return float(time.time() - secs) / 86400


def _parse_args():
    parser = optparse.OptionParser()
    parser.add_option("-d", "--dry-run",
                      action="store_true", dest="dry_run", default=False,
                      help="don't actually remove locks")
    parser.add_option("-l", "--limit",
                      action="store", type='int', dest="limit",
                      default=sys.maxint,
                      help="max number of locks to delete (default: no limit)")
    parser.add_option("-v", "--verbose",
                      action="store_true", dest="verbose", default=False,
                      help="print status messages to stdout")

    options, args = parser.parse_args()

    try:
        days_old = int(args[0])
    except (IndexError, ValueError):
        parser.print_help()
        sys.exit(1)

    return options, days_old


def main():
    options, days_old = _parse_args()

    if not os.path.exists(BASE):
        print >> sys.stderr, "error: '%s' doesn't exist. Make sure you're"\
            " running this on the dom0." % BASE
        sys.exit(1)

    lockpaths_removed = 0
    nspaths_removed = 0

    for nsname in os.listdir(BASE)[:options.limit]:
        nspath = os.path.join(BASE, nsname)

        if not os.path.isdir(nspath):
            continue

        # Remove old lockfiles
        removed = 0
        locknames = os.listdir(nspath)
        for lockname in locknames:
            lockpath = os.path.join(nspath, lockname)
            lock_age_days = _get_age_days(os.path.getmtime(lockpath))
            if lock_age_days > days_old:
                lockpaths_removed += 1
                removed += 1

                if options.verbose:
                    print 'Removing old lock: %03d %s' % (lock_age_days,
                                                          lockpath)

                if not options.dry_run:
                    os.unlink(lockpath)

        # Remove empty namespace paths
        if len(locknames) == removed:
            nspaths_removed += 1

            if options.verbose:
                print 'Removing empty namespace: %s' % nspath

            if not options.dry_run:
                try:
                    os.rmdir(nspath)
                except OSError, e:
                    if e.errno == errno.ENOTEMPTY:
                        print >> sys.stderr, "warning: directory '%s'"\
                            " not empty" % nspath
                    else:
                        raise

    if options.dry_run:
        print "** Dry Run **"

    print "Total locks removed: ", lockpaths_removed
    print "Total namespaces removed: ", nspaths_removed


if __name__ == '__main__':
    main()
@@ -1,69 +0,0 @@
"""
destroy_cached_images.py

This script is used to clean up Glance images that are cached in the SR. By
default, this script will only cleanup unused cached images.

Options:

    --dry_run - Don't actually destroy the VDIs
    --all_cached - Destroy all cached images instead of just unused cached
                   images.
"""
import eventlet
eventlet.monkey_patch()

import os
import sys

from oslo.config import cfg

# If ../nova/__init__.py exists, add ../ to Python search path, so that
# it will override what happens to be installed in /usr/(local/)lib/python...
POSSIBLE_TOPDIR = os.path.normpath(os.path.join(os.path.abspath(sys.argv[0]),
                                   os.pardir,
                                   os.pardir,
                                   os.pardir))
if os.path.exists(os.path.join(POSSIBLE_TOPDIR, 'nova', '__init__.py')):
    sys.path.insert(0, POSSIBLE_TOPDIR)

from nova import config
from nova.openstack.common import log as logging
from nova import utils
from nova.virt.xenapi import driver as xenapi_driver
from nova.virt.xenapi import vm_utils

destroy_opts = [
    cfg.BoolOpt('all_cached',
                default=False,
                help='Destroy all cached images instead of just unused cached'
                     ' images.'),
    cfg.BoolOpt('dry_run',
                default=False,
                help='Don\'t actually delete the VDIs.')
]

CONF = cfg.CONF
CONF.register_cli_opts(destroy_opts)


def main():
    config.parse_args(sys.argv)
    utils.monkey_patch()

    xenapi = xenapi_driver.XenAPIDriver()
    session = xenapi._session

    sr_ref = vm_utils.safe_find_sr(session)
    destroyed = vm_utils.destroy_cached_images(
        session, sr_ref, all_cached=CONF.all_cached,
        dry_run=CONF.dry_run)

    if '--verbose' in sys.argv:
        print '\n'.join(destroyed)

    print "Destroyed %d cached VDIs" % len(destroyed)


if __name__ == "__main__":
    main()
@@ -1,172 +0,0 @@
"""
This script concurrently builds and migrates instances. This can be useful when
troubleshooting race-conditions in virt-layer code.

Expects:

    novarc to be sourced in the environment

Helper Script for Xen Dom0:

    # cat /tmp/destroy_cache_vdis
    #!/bin/bash
    xe vdi-list | grep "Glance Image" -C1 | grep "^uuid" | awk '{print $5}' |
        xargs -n1 -I{} xe vdi-destroy uuid={}
"""
import argparse
import contextlib
import multiprocessing
import subprocess
import sys
import time

DOM0_CLEANUP_SCRIPT = "/tmp/destroy_cache_vdis"


def run(cmd):
    ret = subprocess.call(cmd, shell=True)
    if ret != 0:
        print >> sys.stderr, "Command exited non-zero: %s" % cmd


@contextlib.contextmanager
def server_built(server_name, image_name, flavor=1, cleanup=True):
    run("nova boot --image=%(image_name)s --flavor=%(flavor)s"
        " --poll %(server_name)s" % locals())
    try:
        yield
    finally:
        if cleanup:
            run("nova delete %(server_name)s" % locals())


@contextlib.contextmanager
def snapshot_taken(server_name, snapshot_name, cleanup=True):
    run("nova image-create %(server_name)s %(snapshot_name)s"
        " --poll" % locals())
    try:
        yield
    finally:
        if cleanup:
            run("nova image-delete %(snapshot_name)s" % locals())


def migrate_server(server_name):
    run("nova migrate %(server_name)s --poll" % locals())

    cmd = "nova list | grep %(server_name)s | awk '{print $6}'" % locals()
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
    stdout, stderr = proc.communicate()
    status = stdout.strip()
    if status.upper() != 'VERIFY_RESIZE':
        print >> sys.stderr, "Server %(server_name)s failed to migrate"\
            % locals()
        return False

    # Confirm the resize
    run("nova resize-confirm %(server_name)s" % locals())
    return True


def test_migrate(context):
    count, args = context
    server_name = "server%d" % count
    cleanup = args.cleanup
    with server_built(server_name, args.image, cleanup=cleanup):
        # Migrate A -> B
        result = migrate_server(server_name)
        if not result:
            return False

        # Migrate B -> A
        return migrate_server(server_name)


def rebuild_server(server_name, snapshot_name):
    run("nova rebuild %(server_name)s %(snapshot_name)s --poll" % locals())

    cmd = "nova list | grep %(server_name)s | awk '{print $6}'" % locals()
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
    stdout, stderr = proc.communicate()
    status = stdout.strip()
    if status != 'ACTIVE':
        print >> sys.stderr, "Server %(server_name)s failed to rebuild"\
            % locals()
        return False

    return True


def test_rebuild(context):
    count, args = context
    server_name = "server%d" % count
    snapshot_name = "snap%d" % count
    cleanup = args.cleanup
    with server_built(server_name, args.image, cleanup=cleanup):
        with snapshot_taken(server_name, snapshot_name, cleanup=cleanup):
            return rebuild_server(server_name, snapshot_name)


def _parse_args():
    parser = argparse.ArgumentParser(
        description='Test Nova for Race Conditions.')

    parser.add_argument('tests', metavar='TESTS', type=str, nargs='*',
                        default=['rebuild', 'migrate'],
                        help='tests to run: [rebuild|migrate]')

    parser.add_argument('-i', '--image', help="image to build from",
                        required=True)
    parser.add_argument('-n', '--num-runs', type=int, help="number of runs",
                        default=1)
    parser.add_argument('-c', '--concurrency', type=int, default=5,
                        help="number of concurrent processes")
    parser.add_argument('--no-cleanup', action='store_false', dest="cleanup",
                        default=True)
    parser.add_argument('-d', '--dom0-ips',
                        help="IP of dom0's to run cleanup script")

    return parser.parse_args()


def main():
    dom0_cleanup_script = DOM0_CLEANUP_SCRIPT
    args = _parse_args()

    if args.dom0_ips:
        dom0_ips = args.dom0_ips.split(',')
    else:
        dom0_ips = []

    start_time = time.time()
    batch_size = min(args.num_runs, args.concurrency)
    pool = multiprocessing.Pool(processes=args.concurrency)

    results = []
    for test in args.tests:
        test_func = globals().get("test_%s" % test)
        if not test_func:
            print >> sys.stderr, "test '%s' not found" % test
            sys.exit(1)

        contexts = [(x, args) for x in range(args.num_runs)]

        try:
            results += pool.map(test_func, contexts)
        finally:
            if args.cleanup:
                for dom0_ip in dom0_ips:
                    run("ssh root@%(dom0_ip)s %(dom0_cleanup_script)s"
                        % locals())

    success = all(results)
    result = "SUCCESS" if success else "FAILED"

    duration = time.time() - start_time
    print "%s, finished in %.2f secs" % (result, duration)

    sys.exit(0 if success else 1)


if __name__ == "__main__":
    main()
@@ -1,128 +0,0 @@
#!/usr/bin/env python

# Copyright 2012 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
This script is designed to clean up any VHDs (and their descendants) which
have a bad parent pointer.

The script needs to be run in the dom0 of the affected host.

The available actions are:

    - print: display the filenames of the affected VHDs
    - delete: remove the affected VHDs
    - move: move the affected VHDs out of the SR into another directory
"""
import glob
import os
import subprocess
import sys


class ExecutionFailed(Exception):
    def __init__(self, returncode, stdout, stderr, max_stream_length=32):
        self.returncode = returncode
        self.stdout = stdout[:max_stream_length]
        self.stderr = stderr[:max_stream_length]
        self.max_stream_length = max_stream_length

    def __repr__(self):
        return "<ExecutionFailed returncode=%s out='%s' stderr='%s'>" % (
            self.returncode, self.stdout, self.stderr)

    __str__ = __repr__


def execute(cmd, ok_exit_codes=None):
    if ok_exit_codes is None:
        ok_exit_codes = [0]

    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    (stdout, stderr) = proc.communicate()

    if proc.returncode not in ok_exit_codes:
        raise ExecutionFailed(proc.returncode, stdout, stderr)

    return proc.returncode, stdout, stderr


def usage():
    print "usage: %s <SR PATH> <print|delete|move>" % sys.argv[0]
    sys.exit(1)


def main():
    if len(sys.argv) < 3:
        usage()

    sr_path = sys.argv[1]
    action = sys.argv[2]

    if action not in ('print', 'delete', 'move'):
        usage()

    if action == 'move':
        if len(sys.argv) < 4:
            print "error: must specify where to move bad VHDs"
            sys.exit(1)

        bad_vhd_path = sys.argv[3]
        if not os.path.exists(bad_vhd_path):
            os.makedirs(bad_vhd_path)

    bad_leaves = []
    descendents = {}

    for fname in glob.glob(os.path.join(sr_path, "*.vhd")):
        (returncode, stdout, stderr) = execute(
            ['vhd-util', 'query', '-n', fname, '-p'], ok_exit_codes=[0, 22])

        stdout = stdout.strip()

        if stdout.endswith('.vhd'):
            try:
                descendents[stdout].append(fname)
            except KeyError:
                descendents[stdout] = [fname]
        elif 'query failed' in stdout:
            bad_leaves.append(fname)

    def walk_vhds(root):
        yield root
        if root in descendents:
            for child in descendents[root]:
                for vhd in walk_vhds(child):
                    yield vhd

    for bad_leaf in bad_leaves:
        for bad_vhd in walk_vhds(bad_leaf):
            print bad_vhd
            if action == "print":
                pass
            elif action == "delete":
                os.unlink(bad_vhd)
            elif action == "move":
                new_path = os.path.join(bad_vhd_path,
                                        os.path.basename(bad_vhd))
                os.rename(bad_vhd, new_path)
            else:
                raise Exception("invalid action %s" % action)


if __name__ == '__main__':
    main()
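The `walk_vhds` generator in the script above yields a bad leaf and then every VHD chained under it via the parent→children map. The recursion can be exercised in isolation like this (Python 3 syntax; the sample map and filenames below are made up for illustration):

```python
def walk_vhds(root, descendents):
    # Yield the root first, then recursively every VHD whose parent
    # pointer leads back to it.
    yield root
    for child in descendents.get(root, []):
        for vhd in walk_vhds(child, descendents):
            yield vhd


# Hypothetical chain: a.vhd has children b.vhd and d.vhd; b.vhd has c.vhd.
descendents = {"a.vhd": ["b.vhd", "d.vhd"], "b.vhd": ["c.vhd"]}
print(list(walk_vhds("a.vhd", descendents)))
# -> ['a.vhd', 'b.vhd', 'c.vhd', 'd.vhd']
```

This depth-first order is why deleting or moving a bad leaf also takes every snapshot descended from it: a child's data is meaningless once its parent in the VHD chain is gone.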
@@ -1,329 +0,0 @@
#!/usr/bin/env python

# Copyright 2011 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""vm_vdi_cleaner.py - List or clean orphaned VDIs/instances on XenServer."""

import doctest
import os
import sys

from oslo.config import cfg
import XenAPI


possible_topdir = os.getcwd()
if os.path.exists(os.path.join(possible_topdir, "nova", "__init__.py")):
    sys.path.insert(0, possible_topdir)


from nova import context
from nova import db
from nova import exception
from nova.openstack.common import timeutils
from nova.virt import virtapi
from nova.virt.xenapi import driver as xenapi_driver

cleaner_opts = [
    cfg.IntOpt('zombie_instance_updated_at_window',
               default=172800,
               help='Number of seconds after which zombie instances are '
                    'cleaned up.'),
]

cli_opt = cfg.StrOpt('command',
                     default=None,
                     help='Cleaner command')

CONF = cfg.CONF
CONF.register_opts(cleaner_opts)
CONF.register_cli_opt(cli_opt)
CONF.import_opt('verbose', 'nova.openstack.common.log')
CONF.import_opt("resize_confirm_window", "nova.compute.manager")


ALLOWED_COMMANDS = ["list-vdis", "clean-vdis", "list-instances",
                    "clean-instances", "test"]


def call_xenapi(xenapi, method, *args):
    """Make a call to xapi."""
    return xenapi._session.call_xenapi(method, *args)


def find_orphaned_instances(xenapi):
    """Find and return a list of orphaned instances."""
    ctxt = context.get_admin_context(read_deleted="only")

    orphaned_instances = []

    for vm_ref, vm_rec in _get_applicable_vm_recs(xenapi):
        try:
            uuid = vm_rec['other_config']['nova_uuid']
            instance = db.api.instance_get_by_uuid(ctxt, uuid)
        except (KeyError, exception.InstanceNotFound):
            # NOTE(jk0): Err on the side of caution here. If we don't know
            # anything about the particular instance, ignore it.
            print_xen_object("INFO: Ignoring VM", vm_rec, indent_level=0)
            continue

        # NOTE(jk0): This would be triggered if a VM was deleted but the
        # actual deletion process failed somewhere along the line.
        is_active_and_deleting = (instance.vm_state == "active" and
                                  instance.task_state == "deleting")

        # NOTE(jk0): A zombie VM is an instance that is not active and hasn't
        # been updated in over the specified period.
        is_zombie_vm = (instance.vm_state != "active"
                        and timeutils.is_older_than(
                            instance.updated_at,
                            CONF.zombie_instance_updated_at_window))

        if is_active_and_deleting or is_zombie_vm:
            orphaned_instances.append((vm_ref, vm_rec, instance))

    return orphaned_instances


def cleanup_instance(xenapi, instance, vm_ref, vm_rec):
    """Delete orphaned instances."""
    xenapi._vmops._destroy(instance, vm_ref)


def _get_applicable_vm_recs(xenapi):
    """An 'applicable' VM is one that is not a template and not the control
    domain.
    """
    for vm_ref in call_xenapi(xenapi, 'VM.get_all'):
        try:
            vm_rec = call_xenapi(xenapi, 'VM.get_record', vm_ref)
        except XenAPI.Failure, e:
            if e.details[0] != 'HANDLE_INVALID':
                raise
            continue

        if vm_rec["is_a_template"] or vm_rec["is_control_domain"]:
            continue
        yield vm_ref, vm_rec


def print_xen_object(obj_type, obj, indent_level=0, spaces_per_indent=4):
    """Pretty-print a Xen object.

    Looks like:

        VM (abcd-abcd-abcd): 'name label here'
    """
    if not CONF.verbose:
        return
    uuid = obj["uuid"]
    try:
        name_label = obj["name_label"]
    except KeyError:
        name_label = ""
    msg = "%(obj_type)s (%(uuid)s) '%(name_label)s'" % locals()
    indent = " " * spaces_per_indent * indent_level
    print "".join([indent, msg])


def _find_vdis_connected_to_vm(xenapi, connected_vdi_uuids):
    """Find VDIs which are connected to VBDs which are connected to VMs."""
    def _is_null_ref(ref):
        return ref == "OpaqueRef:NULL"

    def _add_vdi_and_parents_to_connected(vdi_rec, indent_level):
        indent_level += 1

        vdi_and_parent_uuids = []
        cur_vdi_rec = vdi_rec
        while True:
            cur_vdi_uuid = cur_vdi_rec["uuid"]
            print_xen_object("VDI", vdi_rec, indent_level=indent_level)
            connected_vdi_uuids.add(cur_vdi_uuid)
            vdi_and_parent_uuids.append(cur_vdi_uuid)

            try:
                parent_vdi_uuid = vdi_rec["sm_config"]["vhd-parent"]
            except KeyError:
                parent_vdi_uuid = None

            # NOTE(sirp): VDI's can have themselves as a parent?!
            if parent_vdi_uuid and parent_vdi_uuid != cur_vdi_uuid:
                indent_level += 1
                cur_vdi_ref = call_xenapi(xenapi, 'VDI.get_by_uuid',
                                          parent_vdi_uuid)
                try:
                    cur_vdi_rec = call_xenapi(xenapi, 'VDI.get_record',
                                              cur_vdi_ref)
                except XenAPI.Failure, e:
                    if e.details[0] != 'HANDLE_INVALID':
                        raise
                    break
            else:
                break

    for vm_ref, vm_rec in _get_applicable_vm_recs(xenapi):
        indent_level = 0
        print_xen_object("VM", vm_rec, indent_level=indent_level)

        vbd_refs = vm_rec["VBDs"]
        for vbd_ref in vbd_refs:
            try:
                vbd_rec = call_xenapi(xenapi, 'VBD.get_record', vbd_ref)
            except XenAPI.Failure, e:
                if e.details[0] != 'HANDLE_INVALID':
                    raise
                continue

            indent_level = 1
            print_xen_object("VBD", vbd_rec, indent_level=indent_level)

            vbd_vdi_ref = vbd_rec["VDI"]

            if _is_null_ref(vbd_vdi_ref):
                continue

            try:
                vdi_rec = call_xenapi(xenapi, 'VDI.get_record', vbd_vdi_ref)
            except XenAPI.Failure, e:
                if e.details[0] != 'HANDLE_INVALID':
                    raise
                continue

            _add_vdi_and_parents_to_connected(vdi_rec, indent_level)


def _find_all_vdis_and_system_vdis(xenapi, all_vdi_uuids, connected_vdi_uuids):
    """Collects all VDIs and adds system VDIs to the connected set."""
    def _system_owned(vdi_rec):
        vdi_name = vdi_rec["name_label"]
        return (vdi_name.startswith("USB") or
                vdi_name.endswith(".iso") or
                vdi_rec["type"] == "system")

    for vdi_ref in call_xenapi(xenapi, 'VDI.get_all'):
        try:
            vdi_rec = call_xenapi(xenapi, 'VDI.get_record', vdi_ref)
        except XenAPI.Failure, e:
            if e.details[0] != 'HANDLE_INVALID':
                raise
            continue
        vdi_uuid = vdi_rec["uuid"]
        all_vdi_uuids.add(vdi_uuid)

        # System owned and non-managed VDIs should be considered 'connected'
        # for our purposes.
        if _system_owned(vdi_rec):
            print_xen_object("SYSTEM VDI", vdi_rec, indent_level=0)
            connected_vdi_uuids.add(vdi_uuid)
        elif not vdi_rec["managed"]:
            print_xen_object("UNMANAGED VDI", vdi_rec, indent_level=0)
            connected_vdi_uuids.add(vdi_uuid)


def find_orphaned_vdi_uuids(xenapi):
    """Walk VM -> VBD -> VDI chain and accumulate connected VDIs."""
    connected_vdi_uuids = set()

    _find_vdis_connected_to_vm(xenapi, connected_vdi_uuids)

    all_vdi_uuids = set()
    _find_all_vdis_and_system_vdis(xenapi, all_vdi_uuids, connected_vdi_uuids)

    orphaned_vdi_uuids = all_vdi_uuids - connected_vdi_uuids
    return orphaned_vdi_uuids


def list_orphaned_vdis(vdi_uuids):
    """List orphaned VDIs."""
    for vdi_uuid in vdi_uuids:
        if CONF.verbose:
            print "ORPHANED VDI (%s)" % vdi_uuid
        else:
            print vdi_uuid


def clean_orphaned_vdis(xenapi, vdi_uuids):
    """Clean orphaned VDIs."""
    for vdi_uuid in vdi_uuids:
        if CONF.verbose:
            print "CLEANING VDI (%s)" % vdi_uuid

        vdi_ref = call_xenapi(xenapi, 'VDI.get_by_uuid', vdi_uuid)
        try:
            call_xenapi(xenapi, 'VDI.destroy', vdi_ref)
        except XenAPI.Failure, exc:
            print >> sys.stderr, "Skipping %s: %s" % (vdi_uuid, exc)


def list_orphaned_instances(orphaned_instances):
    """List orphaned instances."""
    for vm_ref, vm_rec, orphaned_instance in orphaned_instances:
        if CONF.verbose:
            print "ORPHANED INSTANCE (%s)" % orphaned_instance.name
        else:
            print orphaned_instance.name


def clean_orphaned_instances(xenapi, orphaned_instances):
    """Clean orphaned instances."""
    for vm_ref, vm_rec, instance in orphaned_instances:
        if CONF.verbose:
            print "CLEANING INSTANCE (%s)" % instance.name

        cleanup_instance(xenapi, instance, vm_ref, vm_rec)


def main():
    """Main loop."""
    args = CONF(args=sys.argv[1:], usage='%(prog)s [options] --command={' +
                '|'.join(ALLOWED_COMMANDS) + '}')

    command = CONF.command
    if not command or command not in ALLOWED_COMMANDS:
        CONF.print_usage()
        sys.exit(1)

    if CONF.zombie_instance_updated_at_window < CONF.resize_confirm_window:
        raise Exception("`zombie_instance_updated_at_window` has to be longer"
                        " than `resize_confirm_window`.")

    # NOTE(blamar) This tool does not require DB access, so passing in the
    # 'abstract' VirtAPI class is acceptable
    xenapi = xenapi_driver.XenAPIDriver(virtapi.VirtAPI())

    if command == "list-vdis":
        if CONF.verbose:
            print "Connected VDIs:\n"
        orphaned_vdi_uuids = find_orphaned_vdi_uuids(xenapi)
        if CONF.verbose:
            print "\nOrphaned VDIs:\n"
        list_orphaned_vdis(orphaned_vdi_uuids)
    elif command == "clean-vdis":
        orphaned_vdi_uuids = find_orphaned_vdi_uuids(xenapi)
        clean_orphaned_vdis(xenapi, orphaned_vdi_uuids)
    elif command == "list-instances":
        orphaned_instances = find_orphaned_instances(xenapi)
        list_orphaned_instances(orphaned_instances)
    elif command == "clean-instances":
        orphaned_instances = find_orphaned_instances(xenapi)
        clean_orphaned_instances(xenapi, orphaned_instances)
    elif command == "test":
        doctest.testmod()
    else:
        print "Unknown command '%s'" % command
        sys.exit(1)


if __name__ == "__main__":
    main()
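The core of `find_orphaned_vdi_uuids` is a mark-and-sweep over sets: collect every VDI UUID, mark the reachable or known-good ones (connected to a VM via a VBD, system-owned, or unmanaged), and the orphans fall out as a set difference. A minimal sketch of that reduction (Python 3 syntax; the UUID strings are placeholders, not real XenServer identifiers):

```python
def find_orphans(all_uuids, connected_uuids):
    # Everything that exists, minus everything marked as connected,
    # is orphaned. Plain set difference does the sweep.
    return all_uuids - connected_uuids


all_vdi_uuids = {"uuid-1", "uuid-2", "uuid-3", "uuid-4"}
connected_vdi_uuids = {"uuid-1", "uuid-3"}

print(sorted(find_orphans(all_vdi_uuids, connected_vdi_uuids)))
# -> ['uuid-2', 'uuid-4']
```

Keeping the "connected" criteria additive (VM-attached, system, unmanaged all just `add()` into one set) is what lets the script err on the safe side: anything with any claim to being in use is excluded from cleanup.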