Updating proxy-server StatsD logging.
Removed many StatsD logging calls in proxy-server and added swift-informant-style catch-all logging in the proxy-logging middleware. Many errors previously rolled into the "proxy-server.<type>.errors" counter will now appear broken down by response code and with timing data at: "proxy-server.<type>.<verb>.<status>.timing". Also, bytes transferred (sum of in + out) will be at: "proxy-server.<type>.<verb>.<status>.xfer". The proxy-logging middleware can get its StatsD config from standard vars in [DEFAULT] or from access_log_statsd_* config vars in its config section.

Similarly to Swift Informant, request methods ("verbs") are filtered using the new proxy-logging config var "log_statsd_valid_http_methods", which defaults to GET, HEAD, POST, PUT, DELETE, and COPY. Requests with methods not in this list use "BAD_METHOD" for <verb> in the metric name. To avoid user error, access_log_statsd_valid_http_methods is also accepted.

Previously, proxy-server metrics used "Account", "Container", and "Object" for the <type>, but these are now all lowercase.

Updated the admin guide's StatsD docs to reflect the above changes and also include the "proxy-server.<type>.handoff_count" and "proxy-server.<type>.handoff_all_count" metrics.

The proxy server now saves off the original req.method and proxy_logging will use this if it can (both for request logging and as the "<verb>" in the StatsD timing metric). This fixes bug 1025433.

Removed some stale access_log_* related code in proxy/server.py. Also removed the BaseApplication/Application distinction as it's no longer necessary.

Fixed up the sample config files a bit (logging lines, mostly). Fixed a typo in the SAIO development guide.

Got proxy_logging.py test coverage to 100%. Fixed proxy_logging.py for PEP8 v1.3.2. Enhanced test.unit.FakeLogger to track more calls to enable testing StatsD metric calls.

Change-Id: I45d94cb76450be96d66fcfab56359bdfdc3a2576
parent c509ac2371
commit 4a2ae2b460
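For illustration, here is a minimal sketch of the metric naming scheme described above. The helper name and the example account/container values are made up for this note and are not part of the commit; the real logic lives in the proxy_logging middleware shown in the diff below.

# Illustrative only: a condensed model of the ".timing" / ".xfer" metric
# naming described in the commit message.
VALID_METHODS = ('GET', 'HEAD', 'POST', 'PUT', 'DELETE', 'COPY')


def statsd_metric_names(path, method, status_int):
    """Return (timing_metric, xfer_metric), or None for non-/v1/ paths."""
    if not path.startswith('/v1/'):
        return None
    slashes = path.strip('/').count('/')
    if slashes < 1:
        return None
    # 1 slash -> account, 2 -> container, 3 or more -> object
    stat_type = ('account', 'container', 'object')[min(slashes, 3) - 1]
    verb = method.upper() if method.upper() in VALID_METHODS else 'BAD_METHOD'
    base = 'proxy-server.%s.%s.%d' % (stat_type, verb, status_int)
    return base + '.timing', base + '.xfer'


# An object PUT that returned 201 Created:
print(statsd_metric_names('/v1/AUTH_test/photos/cat.jpg', 'PUT', 201))
# ('proxy-server.object.PUT.201.timing', 'proxy-server.object.PUT.201.xfer')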
@@ -759,55 +759,57 @@ Metric Name Description
`object-updater.failures` Count of failed container updates.
============================ ====================================================

Metrics for `proxy-server` (in the table, `<type>` may be `Account`, `Container`,
or `Object`, and corresponds to the internal Controller object which handled the
request):
Metrics for `proxy-server` (in the table, `<type>` is the proxy-server
controller responsible for the request and will be one of "account",
"container", or "object"):

========================================= ====================================================
Metric Name Description
----------------------------------------- ----------------------------------------------------
`proxy-server.errors` Count of errors encountered while serving requests
before the controller type is determined. Includes
invalid Content-Length, errors finding the internal
controller to handle the request, invalid utf8, and
bad URLs.
`proxy-server.<type>.errors` Count of errors encountered after the controller
type is known. The details of which responses are
errors depend on the controller type and request
type (GET, PUT, etc.). Failed
authentication/authorization and "Not Found"
responses are not counted as errors.
`proxy-server.<type>.client_timeouts` Count of client timeouts (client did not read from
queue within `client_timeout` seconds).
`proxy-server.<type>.client_disconnects` Count of detected client disconnects.
`proxy-server.<type>.method_not_allowed` Count of MethodNotAllowed responses sent by the
`proxy-server.<type>.auth_short_circuits` Count of requests which short-circuited with an
authentication/authorization failure.
`proxy-server.<type>.GET.timing` Timing data for GET requests (excluding requests
with errors or failed authentication/authorization).
`proxy-server.<type>.HEAD.timing` Timing data for HEAD requests (excluding requests
with errors or failed authentication/authorization).
`proxy-server.<type>.POST.timing` Timing data for POST requests (excluding requests
with errors or failed authentication/authorization).
Requests with a client disconnect ARE included in
the timing data.
`proxy-server.<type>.PUT.timing` Timing data for PUT requests (excluding requests
with errors or failed authentication/authorization).
Account PUT requests which return MethodNotAllowed
because allow_account_management is disabled ARE
included.
`proxy-server.<type>.DELETE.timing` Timing data for DELETE requests (excluding requests
with errors or failed authentication/authorization).
Account DELETE requests which return
MethodNotAllowed because allow_account_management is
disabled ARE included.
`proxy-server.Object.COPY.timing` Timing data for object COPY requests (excluding
requests with errors or failed
authentication/authorization).
========================================= ====================================================
======================================== ====================================================
Metric Name Description
---------------------------------------- ----------------------------------------------------
`proxy-server.errors` Count of errors encountered while serving requests
before the controller type is determined. Includes
invalid Content-Length, errors finding the internal
controller to handle the request, invalid utf8, and
bad URLs.
`proxy-server.<type>.handoff_count` Count of node hand-offs; only tracked if log_handoffs
is set in the proxy-server config.
`proxy-server.<type>.handoff_all_count` Count of times *only* hand-off locations were
utilized; only tracked if log_handoffs is set in the
proxy-server config.
`proxy-server.<type>.client_timeouts` Count of client timeouts (client did not read within
`client_timeout` seconds during a GET or did not
supply data within `client_timeout` seconds during
a PUT).
`proxy-server.<type>.client_disconnects` Count of detected client disconnects during PUT
operations (does NOT include caught Exceptions in
the proxy-server which caused a client disconnect).
======================================== ====================================================

Metrics for `tempauth` (in the table, `<reseller_prefix>` represents the actual configured
reseller_prefix or "`NONE`" if the reseller_prefix is the empty string):
Metrics for `proxy-logging` middleware (in the table, `<type>` is either the
proxy-server controller responsible for the request: "account", "container",
"object", or the string "SOS" if the request came from the `Swift Origin Server`_
middleware. The `<verb>` portion will be one of "GET", "HEAD", "POST", "PUT",
"DELETE", "COPY", or "BAD_METHOD". The list of valid HTTP methods is
configurable via the `log_statsd_valid_http_methods` config variable and the
default setting yields the above behavior.

.. _Swift Origin Server: https://github.com/dpgoetz/sos

============================================ ====================================================
Metric Name Description
-------------------------------------------- ----------------------------------------------------
`proxy-server.<type>.<verb>.<status>.timing` Timing data for requests. The <status> portion is
the numeric HTTP status code for the request (eg.
"200" or "404")
`proxy-server.<type>.<verb>.<status>.xfer` The count of the sum of bytes transferred in (from
clients) and out (to clients) for requests. The
<type>, <verb>, and <status> portions of the metric
are just like the timing metric.
============================================ ====================================================

Metrics for `tempauth` middleware (in the table, `<reseller_prefix>` represents
the actual configured reseller_prefix or "`NONE`" if the reseller_prefix is the
empty string):

========================================= ====================================================
Metric Name Description

@@ -680,7 +680,7 @@ Setting up scripts for running Swift
sudo chown <your-user-name>:<your-group-name> /mnt/sdb1/*
mkdir -p /srv/1/node/sdb1 /srv/2/node/sdb2 /srv/3/node/sdb3 /srv/4/node/sdb4
sudo rm -f /var/log/debug /var/log/messages /var/log/rsyncd.log /var/log/syslog
find /var/cache/swift* -type f -name *.recon -exec -rm -f {} \;
find /var/cache/swift* -type f -name *.recon -exec rm -f {} \;
sudo service rsyslog restart
sudo service memcached restart

@@ -13,8 +13,10 @@
# log_facility = LOG_LOCAL0
# log_level = INFO
# log_address = /dev/log
# You can enable default statsD logging here and/or override it in sections
# below:
# If set, log_udp_host will override log_address
# log_udp_host =
# log_udp_port = 514
# You can enable StatsD logging here:
# log_statsd_host = localhost
# log_statsd_port = 8125
# log_statsd_default_sample_rate = 1

@@ -16,8 +16,10 @@
# log_facility = LOG_LOCAL0
# log_level = INFO
# log_address = /dev/log
# You can enable default statsD logging here and/or override it in sections
# below:
# If set, log_udp_host will override log_address
# log_udp_host =
# log_udp_port = 514
# You can enable StatsD logging here:
# log_statsd_host = localhost
# log_statsd_port = 8125
# log_statsd_default_sample_rate = 1

@@ -6,7 +6,10 @@
# log_facility = LOG_LOCAL0
# log_level = INFO
# log_address = /dev/log
# You can enable default statsD logging here if you want:
# If set, log_udp_host will override log_address
# log_udp_host =
# log_udp_port = 514
# You can enable StatsD logging here:
# log_statsd_host = localhost
# log_statsd_port = 8125
# log_statsd_default_sample_rate = 1

@@ -14,8 +14,10 @@
# log_facility = LOG_LOCAL0
# log_level = INFO
# log_address = /dev/log
# You can enable default statsD logging here and/or override it in sections
# below:
# If set, log_udp_host will override log_address
# log_udp_host =
# log_udp_port = 514
# You can enable StatsD logging here:
# log_statsd_host = localhost
# log_statsd_port = 8125
# log_statsd_default_sample_rate = 1

@@ -13,9 +13,12 @@
# log_name = swift
# log_facility = LOG_LOCAL0
# log_level = INFO
# log_headers = False
# log_address = /dev/log
# You can enable default statsD logging here and/or override it in sections
# below:
# If set, log_udp_host will override log_address
# log_udp_host =
# log_udp_port = 514
# You can enable StatsD logging here:
# log_statsd_host = localhost
# log_statsd_port = 8125
# log_statsd_default_sample_rate = 1
@@ -31,11 +34,7 @@ use = egg:swift#proxy
# set log_facility = LOG_LOCAL0
# set log_level = INFO
# set log_address = /dev/log
# set access_log_name = proxy-server
# set access_log_facility = LOG_LOCAL0
# set access_log_level = INFO
# set log_headers = False
# set log_handoffs = True
# log_handoffs = True
# recheck_account_existence = 60
# recheck_container_existence = 60
# object_chunk_size = 8192
@@ -306,3 +305,20 @@ use = egg:swift#name_check

[filter:proxy-logging]
use = egg:swift#proxy_logging
# If not set, logging directives from [DEFAULT] without "access_" will be used
# access_log_name = swift
# access_log_facility = LOG_LOCAL0
# access_log_level = INFO
# access_log_address = /dev/log
# If set, access_log_udp_host will override access_log_address
# access_log_udp_host =
# access_log_udp_port = 514
# You can use log_statsd_* from [DEFAULT] or override them here:
# access_log_statsd_host = localhost
# access_log_statsd_port = 8125
# access_log_statsd_default_sample_rate = 1
# access_log_statsd_metric_prefix =
# access_log_headers = False
# What HTTP methods are allowed for StatsD logging (comma-sep); request methods
# not in this list will have "BAD_METHOD" for the <verb> portion of the metric.
# log_statsd_valid_http_methods = GET,HEAD,POST,PUT,DELETE,COPY

@@ -94,14 +94,27 @@ class ProxyLoggingMiddleware(object):
def __init__(self, app, conf):
self.app = app
self.log_hdrs = conf.get('log_headers', 'no').lower() in TRUE_VALUES

# The leading access_* check is in case someone assumes that
# log_statsd_valid_http_methods behaves like the other log_statsd_*
# settings.
self.valid_methods = conf.get(
'access_log_statsd_valid_http_methods',
conf.get('log_statsd_valid_http_methods',
'GET,HEAD,POST,PUT,DELETE,COPY'))
self.valid_methods = [m.strip().upper() for m in
self.valid_methods.split(',') if m.strip()]
access_log_conf = {}
for key in ('log_facility', 'log_name', 'log_level', 'log_udp_host',
'log_udp_port'):
'log_udp_port', 'log_statsd_host', 'log_statsd_port',
'log_statsd_default_sample_rate',
'log_statsd_metric_prefix'):
value = conf.get('access_' + key, conf.get(key, None))
if value:
access_log_conf[key] = value
self.access_logger = get_logger(access_log_conf,
log_route='proxy-access')
log_route='proxy-access')
self.access_logger.set_statsd_prefix('proxy-server')

def log_request(self, env, status_int, bytes_received, bytes_sent,
request_time, client_disconnect):
@@ -124,13 +137,15 @@ class ProxyLoggingMiddleware(object):
logged_headers = None
if self.log_hdrs:
logged_headers = '\n'.join('%s: %s' % (k, v)
for k, v in req.headers.items())
self.access_logger.info(' '.join(quote(str(x) if x else '-')
for k, v in req.headers.items())
method = req.environ.get('swift.orig_req_method', req.method)
self.access_logger.info(' '.join(
quote(str(x) if x else '-')
for x in (
get_remote_client(req),
req.remote_addr,
time.strftime('%d/%b/%Y/%H/%M/%S', time.gmtime()),
req.method,
method,
the_request,
req.environ.get('SERVER_PROTOCOL'),
status_int,
@@ -145,7 +160,25 @@ class ProxyLoggingMiddleware(object):
'%.4f' % request_time,
req.environ.get('swift.source'),
)))
self.access_logger.txn_id = None
# Log timing and bytes-transfered data to StatsD
if req.path.startswith('/v1/'):
try:
stat_type = [None, 'account', 'container',
'object'][req.path.strip('/').count('/')]
except IndexError:
stat_type = 'object'
else:
stat_type = env.get('swift.source')
# Only log data for valid controllers (or SOS) to keep the metric count
# down (egregious errors will get logged by the proxy server itself).
if stat_type:
stat_method = method if method in self.valid_methods \
else 'BAD_METHOD'
metric_name = '.'.join((stat_type, stat_method, str(status_int)))
self.access_logger.timing(metric_name + '.timing',
request_time * 1000)
self.access_logger.update_stats(metric_name + '.xfer',
bytes_received + bytes_sent)

def __call__(self, env, start_response):
start_response_args = [None]
@@ -175,8 +208,8 @@ class ProxyLoggingMiddleware(object):
('content-length', str(sum(len(i) for i in iterable))))
else:
raise Exception('WSGI [proxy-logging]: No content-length '
'or transfer-encoding header sent and there is '
'content! %r' % chunk)
'or transfer-encoding header sent and '
'there is content! %r' % chunk)
start_response(*start_response_args[0])
bytes_sent = 0
client_disconnect = False
@@ -190,16 +223,17 @@ class ProxyLoggingMiddleware(object):
raise
finally:
status_int = int(start_response_args[0][0].split(' ', 1)[0])
self.log_request(env, status_int,
input_proxy.bytes_received, bytes_sent,
time.time() - start_time,
client_disconnect or input_proxy.client_disconnect)
self.log_request(
env, status_int, input_proxy.bytes_received, bytes_sent,
time.time() - start_time,
client_disconnect or input_proxy.client_disconnect)

try:
iterable = self.app(env, my_start_response)
except Exception:
self.log_request(env, 500, input_proxy.bytes_received, 0,
time.time() - start_time, input_proxy.client_disconnect)
self.log_request(
env, 500, input_proxy.bytes_received, 0,
time.time() - start_time, input_proxy.client_disconnect)
raise
else:
return iter_response(iterable)

@@ -523,7 +523,7 @@ class StaticWeb(object):
self.access_logger = get_logger(access_log_conf,
log_route='staticweb-access')
#: Indicates whether full HTTP headers should be logged or not.
self.log_headers = conf.get('log_headers', 'f').lower() in TRUE_VALUES
self.log_headers = conf.get('log_headers', 'no').lower() in TRUE_VALUES

def __call__(self, env, start_response):
"""

@@ -561,9 +561,11 @@ def get_logger(conf, name=None, log_to_console=False, log_route=None,
log_name = swift
log_udp_host = (disabled)
log_udp_port = logging.handlers.SYSLOG_UDP_PORT
log_address = /dev/log
log_statsd_host = (disabled)
log_statsd_port = 8125
log_statsd_default_sample_rate = 1
log_statsd_metric_prefix = (empty-string)

:param conf: Configuration dict to read settings from
:param name: Name of the logger

@@ -45,9 +45,8 @@ class AccountController(Controller):
Controller.__init__(self, app)
self.account_name = unquote(account_name)

def GETorHEAD(self, req, stats_type):
def GETorHEAD(self, req):
"""Handler for HTTP GET/HEAD requests."""
start_time = time.time()
partition, nodes = self.app.account_ring.get_nodes(self.account_name)
shuffle(nodes)
resp = self.GETorHEAD_base(req, _('Account'), partition, nodes,
@@ -57,8 +56,6 @@ class AccountController(Controller):
resp = HTTPBadRequest(request=req)
resp.body = 'Account name length of %d longer than %d' % \
(len(self.account_name), MAX_ACCOUNT_NAME_LENGTH)
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return resp
headers = {'X-Timestamp': normalize_timestamp(time.time()),
'X-Trans-Id': self.trans_id,
@@ -73,25 +70,20 @@ class AccountController(Controller):
return resp
resp = self.GETorHEAD_base(req, _('Account'), partition, nodes,
req.path_info.rstrip('/'), len(nodes))
self.app.logger.timing_since('%s.timing' % (stats_type,), start_time)
return resp

@public
def PUT(self, req):
"""HTTP PUT request handler."""
start_time = time.time()
if not self.app.allow_account_management:
self.app.logger.timing_since('PUT.timing', start_time)
return HTTPMethodNotAllowed(request=req)
error_response = check_metadata(req, 'account')
if error_response:
self.app.logger.increment('errors')
return error_response
if len(self.account_name) > MAX_ACCOUNT_NAME_LENGTH:
resp = HTTPBadRequest(request=req)
resp.body = 'Account name length of %d longer than %d' % \
(len(self.account_name), MAX_ACCOUNT_NAME_LENGTH)
self.app.logger.increment('errors')
return resp
account_partition, accounts = \
self.app.account_ring.get_nodes(self.account_name)
@@ -103,16 +95,13 @@ class AccountController(Controller):
self.app.memcache.delete('account%s' % req.path_info.rstrip('/'))
resp = self.make_requests(req, self.app.account_ring,
account_partition, 'PUT', req.path_info, [headers] * len(accounts))
self.app.logger.timing_since('PUT.timing', start_time)
return resp

@public
def POST(self, req):
"""HTTP POST request handler."""
start_time = time.time()
error_response = check_metadata(req, 'account')
if error_response:
self.app.logger.increment('errors')
return error_response
account_partition, accounts = \
self.app.account_ring.get_nodes(self.account_name)
@@ -130,7 +119,6 @@ class AccountController(Controller):
resp = HTTPBadRequest(request=req)
resp.body = 'Account name length of %d longer than %d' % \
(len(self.account_name), MAX_ACCOUNT_NAME_LENGTH)
self.app.logger.increment('errors')
return resp
resp = self.make_requests(
Request.blank('/v1/' + self.account_name),
@@ -140,15 +128,12 @@ class AccountController(Controller):
self.app.logger.warning('Could not autocreate account %r' %
self.account_name)
return resp
self.app.logger.timing_since('POST.timing', start_time)
return resp

@public
def DELETE(self, req):
"""HTTP DELETE request handler."""
start_time = time.time()
if not self.app.allow_account_management:
self.app.logger.timing_since('DELETE.timing', start_time)
return HTTPMethodNotAllowed(request=req)
account_partition, accounts = \
self.app.account_ring.get_nodes(self.account_name)
@@ -160,5 +145,4 @@ class AccountController(Controller):
resp = self.make_requests(req, self.app.account_ring,
account_partition, 'DELETE', req.path_info,
[headers] * len(accounts))
self.app.logger.timing_since('DELETE.timing', start_time)
return resp

@@ -30,8 +30,7 @@ import functools
from eventlet import spawn_n, GreenPile, Timeout
from eventlet.queue import Queue, Empty, Full
from eventlet.timeout import Timeout
from webob.exc import \
status_map
from webob.exc import status_map
from webob import Request, Response

from swift.common.utils import normalize_timestamp, TRUE_VALUES, public
@@ -457,12 +456,12 @@ class Controller(object):
@public
def GET(self, req):
"""Handler for HTTP GET requests."""
return self.GETorHEAD(req, stats_type='GET')
return self.GETorHEAD(req)

@public
def HEAD(self, req):
"""Handler for HTTP HEAD requests."""
return self.GETorHEAD(req, stats_type='HEAD')
return self.GETorHEAD(req)

def _make_app_iter_reader(self, node, source, queue, logger_thread_locals):
"""

@@ -28,8 +28,7 @@ import time
from urllib import unquote
from random import shuffle

from webob.exc import HTTPBadRequest, HTTPForbidden, \
HTTPNotFound
from webob.exc import HTTPBadRequest, HTTPForbidden, HTTPNotFound

from swift.common.utils import normalize_timestamp, public
from swift.common.constraints import check_metadata, MAX_CONTAINER_NAME_LENGTH
@@ -64,12 +63,9 @@ class ContainerController(Controller):
return HTTPBadRequest(request=req, body=str(err))
return None

def GETorHEAD(self, req, stats_type):
def GETorHEAD(self, req):
"""Handler for HTTP GET/HEAD requests."""
start_time = time.time()
if not self.account_info(self.account_name)[1]:
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return HTTPNotFound(request=req)
part, nodes = self.app.container_ring.get_nodes(
self.account_name, self.container_name)
@@ -94,42 +90,37 @@ class ContainerController(Controller):
req.acl = resp.headers.get('x-container-read')
aresp = req.environ['swift.authorize'](req)
if aresp:
self.app.logger.increment('auth_short_circuits')
return aresp
if not req.environ.get('swift_owner', False):
for key in ('x-container-read', 'x-container-write',
'x-container-sync-key', 'x-container-sync-to'):
if key in resp.headers:
del resp.headers[key]
self.app.logger.timing_since('%s.timing' % (stats_type,), start_time)
return resp

@public
@delay_denial
def GET(self, req):
"""Handler for HTTP GET requests."""
return self.GETorHEAD(req, stats_type='GET')
return self.GETorHEAD(req)

@public
@delay_denial
def HEAD(self, req):
"""Handler for HTTP HEAD requests."""
return self.GETorHEAD(req, stats_type='HEAD')
return self.GETorHEAD(req)

@public
def PUT(self, req):
"""HTTP PUT request handler."""
start_time = time.time()
error_response = \
self.clean_acls(req) or check_metadata(req, 'container')
if error_response:
self.app.logger.increment('errors')
return error_response
if len(self.container_name) > MAX_CONTAINER_NAME_LENGTH:
resp = HTTPBadRequest(request=req)
resp.body = 'Container name length of %d longer than %d' % \
(len(self.container_name), MAX_CONTAINER_NAME_LENGTH)
self.app.logger.increment('errors')
return resp
account_partition, accounts, container_count = \
self.account_info(self.account_name,
@@ -142,7 +133,6 @@ class ContainerController(Controller):
self.app.max_containers_per_account
return resp
if not accounts:
self.app.logger.timing_since('PUT.timing', start_time)
return HTTPNotFound(request=req)
container_partition, containers = self.app.container_ring.get_nodes(
self.account_name, self.container_name)
@@ -162,23 +152,19 @@ class ContainerController(Controller):
self.app.memcache.delete(cache_key)
resp = self.make_requests(req, self.app.container_ring,
container_partition, 'PUT', req.path_info, headers)
self.app.logger.timing_since('PUT.timing', start_time)
return resp

@public
def POST(self, req):
"""HTTP POST request handler."""
start_time = time.time()
error_response = \
self.clean_acls(req) or check_metadata(req, 'container')
if error_response:
self.app.logger.increment('errors')
return error_response
account_partition, accounts, container_count = \
self.account_info(self.account_name,
autocreate=self.app.account_autocreate)
if not accounts:
self.app.logger.timing_since('POST.timing', start_time)
return HTTPNotFound(request=req)
container_partition, containers = self.app.container_ring.get_nodes(
self.account_name, self.container_name)
@@ -193,17 +179,14 @@ class ContainerController(Controller):
resp = self.make_requests(req, self.app.container_ring,
container_partition, 'POST', req.path_info,
[headers] * len(containers))
self.app.logger.timing_since('POST.timing', start_time)
return resp

@public
def DELETE(self, req):
"""HTTP DELETE request handler."""
start_time = time.time()
account_partition, accounts, container_count = \
self.account_info(self.account_name)
if not accounts:
self.app.logger.timing_since('DELETE.timing', start_time)
return HTTPNotFound(request=req)
container_partition, containers = self.app.container_ring.get_nodes(
self.account_name, self.container_name)
@@ -222,7 +205,6 @@ class ContainerController(Controller):
resp = self.make_requests(req, self.app.container_ring,
container_partition, 'DELETE', req.path_info, headers)
# Indicates no server had the container
self.app.logger.timing_since('DELETE.timing', start_time)
if resp.status_int == HTTP_ACCEPTED:
return HTTPNotFound(request=req)
return resp

@@ -276,15 +276,13 @@ class ObjectController(Controller):
for obj in sublisting:
yield obj

def GETorHEAD(self, req, stats_type):
def GETorHEAD(self, req):
"""Handle HTTP GET or HEAD requests."""
start_time = time.time()
_junk, _junk, req.acl, _junk, _junk, object_versions = \
self.container_info(self.account_name, self.container_name)
if 'swift.authorize' in req.environ:
aresp = req.environ['swift.authorize'](req)
if aresp:
self.app.logger.increment('auth_short_circuits')
return aresp
partition, nodes = self.app.object_ring.get_nodes(
self.account_name, self.container_name, self.object_name)
@@ -305,8 +303,6 @@ class ObjectController(Controller):
self.app.object_ring),
req.path_info, len(nodes))
if 'x-object-manifest' not in resp2.headers:
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return resp
resp = resp2
req.range = str(req_range)
@@ -320,14 +316,10 @@ class ObjectController(Controller):
listing = list(self._listing_iter(lcontainer, lprefix,
req.environ))
except ListingIterNotFound:
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return HTTPNotFound(request=req)
except ListingIterNotAuthorized, err:
self.app.logger.increment('auth_short_circuits')
return err.aresp
except ListingIterError:
self.app.logger.increment('errors')
return HTTPServerError(request=req)

if len(listing) > CONTAINER_LISTING_LIMIT:
@@ -346,8 +338,6 @@ class ObjectController(Controller):
return iter([])

head_response.status_int = resp.status_int
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return head_response
else:
resp.app_iter = SegmentedIterable(self, lcontainer,
@@ -377,31 +367,28 @@ class ObjectController(Controller):
resp.etag = etag
resp.headers['accept-ranges'] = 'bytes'

self.app.logger.timing_since('%s.timing' % (stats_type,), start_time)
return resp

@public
@delay_denial
def GET(self, req):
"""Handler for HTTP GET requests."""
return self.GETorHEAD(req, stats_type='GET')
return self.GETorHEAD(req)

@public
@delay_denial
def HEAD(self, req):
"""Handler for HTTP HEAD requests."""
return self.GETorHEAD(req, stats_type='HEAD')
return self.GETorHEAD(req)

@public
@delay_denial
def POST(self, req):
"""HTTP POST request handler."""
start_time = time.time()
if 'x-delete-after' in req.headers:
try:
x_delete_after = int(req.headers['x-delete-after'])
except ValueError:
self.app.logger.increment('errors')
return HTTPBadRequest(request=req,
content_type='text/plain',
body='Non-integer X-Delete-After')
@@ -415,7 +402,7 @@ class ObjectController(Controller):
self.object_name))
req.headers['X-Fresh-Metadata'] = 'true'
req.environ['swift_versioned_copy'] = True
resp = self.PUT(req, start_time=start_time, stats_type='POST')
resp = self.PUT(req)
# Older editions returned 202 Accepted on object POSTs, so we'll
# convert any 201 Created responses to that for compatibility with
# picky clients.
@@ -425,7 +412,6 @@ class ObjectController(Controller):
else:
error_response = check_metadata(req, 'object')
if error_response:
self.app.logger.increment('errors')
return error_response
container_partition, containers, _junk, req.acl, _junk, _junk = \
self.container_info(self.account_name, self.container_name,
@@ -433,20 +419,16 @@ class ObjectController(Controller):
if 'swift.authorize' in req.environ:
aresp = req.environ['swift.authorize'](req)
if aresp:
self.app.logger.increment('auth_short_circuits')
return aresp
if not containers:
self.app.logger.timing_since('POST.timing', start_time)
return HTTPNotFound(request=req)
if 'x-delete-at' in req.headers:
try:
x_delete_at = int(req.headers['x-delete-at'])
if x_delete_at < time.time():
self.app.logger.increment('errors')
return HTTPBadRequest(body='X-Delete-At in past',
request=req, content_type='text/plain')
except ValueError:
self.app.logger.increment('errors')
return HTTPBadRequest(request=req,
content_type='text/plain',
body='Non-integer X-Delete-At')
@@ -476,7 +458,6 @@ class ObjectController(Controller):
headers.append(nheaders)
resp = self.make_requests(req, self.app.object_ring, partition,
'POST', req.path_info, headers)
self.app.logger.timing_since('POST.timing', start_time)
return resp

def _send_file(self, conn, path):
@@ -515,10 +496,8 @@ class ObjectController(Controller):

@public
@delay_denial
def PUT(self, req, start_time=None, stats_type='PUT'):
def PUT(self, req):
"""HTTP PUT request handler."""
if not start_time:
start_time = time.time()
(container_partition, containers, _junk, req.acl,
req.environ['swift_sync_key'], object_versions) = \
self.container_info(self.account_name, self.container_name,
@@ -526,17 +505,13 @@ class ObjectController(Controller):
if 'swift.authorize' in req.environ:
aresp = req.environ['swift.authorize'](req)
if aresp:
self.app.logger.increment('auth_short_circuits')
return aresp
if not containers:
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return HTTPNotFound(request=req)
if 'x-delete-after' in req.headers:
try:
x_delete_after = int(req.headers['x-delete-after'])
except ValueError:
self.app.logger.increment('errors')
return HTTPBadRequest(request=req,
content_type='text/plain',
body='Non-integer X-Delete-After')
@@ -545,11 +520,9 @@ class ObjectController(Controller):
try:
x_delete_at = int(req.headers['x-delete-at'])
if x_delete_at < time.time():
self.app.logger.increment('errors')
return HTTPBadRequest(body='X-Delete-At in past',
request=req, content_type='text/plain')
except ValueError:
self.app.logger.increment('errors')
return HTTPBadRequest(request=req, content_type='text/plain',
body='Non-integer X-Delete-At')
delete_at_container = str(x_delete_at /
@@ -577,11 +550,8 @@ class ObjectController(Controller):
if hresp.environ and 'swift_x_timestamp' in hresp.environ and \
float(hresp.environ['swift_x_timestamp']) >= \
float(req.headers['x-timestamp']):
self.app.logger.timing_since(
'%.timing' % (stats_type,), start_time)
return HTTPAccepted(request=req)
except ValueError:
self.app.logger.increment('errors')
return HTTPBadRequest(request=req, content_type='text/plain',
body='X-Timestamp should be a UNIX timestamp float value; '
'was %r' % req.headers['x-timestamp'])
@@ -596,7 +566,6 @@ class ObjectController(Controller):
content_type_manually_set = False
error_response = check_object_creation(req, self.object_name)
if error_response:
self.app.logger.increment('errors')
return error_response
if object_versions and not req.environ.get('swift_versioned_copy'):
is_manifest = 'x-object-manifest' in req.headers or \
@@ -647,7 +616,6 @@ class ObjectController(Controller):
src_container_name, src_obj_name = \
source_header.split('/', 3)[2:]
except ValueError:
self.app.logger.increment('errors')
return HTTPPreconditionFailed(request=req,
body='X-Copy-From header must be of the form'
'<container name>/<object name>')
@@ -660,8 +628,6 @@ class ObjectController(Controller):
self.container_name = src_container_name
source_resp = self.GET(source_req)
if source_resp.status_int >= HTTP_MULTIPLE_CHOICES:
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return source_resp
self.object_name = orig_obj_name
self.container_name = orig_container_name
@@ -674,7 +640,6 @@ class ObjectController(Controller):
# which currently only happens because there are more than
# CONTAINER_LISTING_LIMIT segments in a segmented object. In
# this case, we're going to refuse to do the server-side copy.
self.app.logger.increment('errors')
return HTTPRequestEntityTooLarge(request=req)
new_req.etag = source_resp.etag
# we no longer need the X-Copy-From header
@@ -713,7 +678,6 @@ class ObjectController(Controller):
_('Object PUT returning 503, %(conns)s/%(nodes)s '
'required connections'),
{'conns': len(conns), 'nodes': len(nodes) // 2 + 1})
self.app.logger.increment('errors')
return HTTPServiceUnavailable(request=req)
chunked = req.headers.get('transfer-encoding')
bytes_transferred = 0
@@ -733,7 +697,6 @@ class ObjectController(Controller):
break
bytes_transferred += len(chunk)
if bytes_transferred > MAX_FILE_SIZE:
self.app.logger.increment('errors')
return HTTPRequestEntityTooLarge(request=req)
for conn in list(conns):
if not conn.failed:
@@ -745,7 +708,6 @@ class ObjectController(Controller):
self.app.logger.error(_('Object PUT exceptions during'
' send, %(conns)s/%(nodes)s required connections'),
{'conns': len(conns), 'nodes': len(nodes) / 2 + 1})
self.app.logger.increment('errors')
return HTTPServiceUnavailable(request=req)
for conn in conns:
if conn.queue.unfinished_tasks:
@@ -759,17 +721,12 @@ class ObjectController(Controller):
except (Exception, Timeout):
self.app.logger.exception(
_('ERROR Exception causing client disconnect'))
self.app.logger.increment('client_disconnects')
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return HTTPClientDisconnect(request=req)
if req.content_length and bytes_transferred < req.content_length:
req.client_disconnect = True
self.app.logger.warn(
_('Client disconnected without sending enough data'))
self.app.logger.increment('client_disconnects')
self.app.logger.timing_since(
'%s.timing' % (stats_type,), start_time)
return HTTPClientDisconnect(request=req)
statuses = []
reasons = []
@@ -795,7 +752,6 @@ class ObjectController(Controller):
if len(etags) > 1:
self.app.logger.error(
_('Object servers returned %s mismatched etags'), len(etags))
self.app.logger.increment('errors')
return HTTPServerError(request=req)
etag = len(etags) and etags.pop() or None
while len(statuses) < len(nodes):
@@ -814,14 +770,12 @@ class ObjectController(Controller):
if k.lower().startswith('x-object-meta-'):
resp.headers[k] = v
resp.last_modified = float(req.headers['X-Timestamp'])
self.app.logger.timing_since('%s.timing' % (stats_type,), start_time)
return resp

@public
@delay_denial
def DELETE(self, req):
"""HTTP DELETE request handler."""
start_time = time.time()
(container_partition, containers, _junk, req.acl,
req.environ['swift_sync_key'], object_versions) = \
self.container_info(self.account_name, self.container_name)
@@ -839,10 +793,8 @@ class ObjectController(Controller):
# no worries, last_item is None
pass
except ListingIterNotAuthorized, err:
self.app.logger.increment('auth_short_circuits')
return err.aresp
except ListingIterError:
self.app.logger.increment('errors')
return HTTPServerError(request=req)
if last_item:
# there are older versions so copy the previous version to the
@@ -880,10 +832,8 @@ class ObjectController(Controller):
if 'swift.authorize' in req.environ:
aresp = req.environ['swift.authorize'](req)
if aresp:
self.app.logger.increment('auth_short_circuits')
return aresp
if not containers:
self.app.logger.timing_since('DELETE.timing', start_time)
return HTTPNotFound(request=req)
partition, nodes = self.app.object_ring.get_nodes(
self.account_name, self.container_name, self.object_name)
@@ -893,7 +843,6 @@ class ObjectController(Controller):
req.headers['X-Timestamp'] = \
normalize_timestamp(float(req.headers['x-timestamp']))
except ValueError:
self.app.logger.increment('errors')
return HTTPBadRequest(request=req, content_type='text/plain',
body='X-Timestamp should be a UNIX timestamp float value; '
'was %r' % req.headers['x-timestamp'])
@@ -909,17 +858,14 @@ class ObjectController(Controller):
headers.append(nheaders)
resp = self.make_requests(req, self.app.object_ring,
partition, 'DELETE', req.path_info, headers)
self.app.logger.timing_since('DELETE.timing', start_time)
return resp

@public
@delay_denial
def COPY(self, req):
"""HTTP COPY request handler."""
start_time = time.time()
dest = req.headers.get('Destination')
if not dest:
self.app.logger.increment('errors')
return HTTPPreconditionFailed(request=req,
body='Destination header required')
dest = unquote(dest)
@@ -928,7 +874,6 @@ class ObjectController(Controller):
try:
_junk, dest_container, dest_object = dest.split('/', 2)
except ValueError:
self.app.logger.increment('errors')
return HTTPPreconditionFailed(request=req,
body='Destination header must be of the form '
'<container name>/<object name>')
@@ -942,4 +887,4 @@ class ObjectController(Controller):
req.headers['Content-Length'] = 0
req.headers['X-Copy-From'] = quote(source)
del req.headers['Destination']
return self.PUT(req, start_time=start_time, stats_type='COPY')
return self.PUT(req)

@@ -43,8 +43,8 @@ from swift.proxy.controllers import AccountController, ObjectController, \
ContainerController, Controller


class BaseApplication(object):
"""Base WSGI application for the proxy server"""
class Application(object):
"""WSGI application for the proxy server."""

def __init__(self, conf, memcache=None, logger=None, account_ring=None,
container_ring=None, object_ring=None):
@@ -52,16 +52,8 @@ class BaseApplication(object):
conf = {}
if logger is None:
self.logger = get_logger(conf, log_route='proxy-server')
access_log_conf = {}
for key in ('log_facility', 'log_name', 'log_level',
'log_udp_host', 'log_udp_port'):
value = conf.get('access_' + key, conf.get(key, None))
if value:
access_log_conf[key] = value
self.access_logger = get_logger(access_log_conf,
log_route='proxy-access')
else:
self.logger = self.access_logger = logger
self.logger = logger

swift_dir = conf.get('swift_dir', '/etc/swift')
self.node_timeout = int(conf.get('node_timeout', 10))
@@ -70,7 +62,6 @@ class BaseApplication(object):
self.put_queue_depth = int(conf.get('put_queue_depth', 10))
self.object_chunk_size = int(conf.get('object_chunk_size', 65536))
self.client_chunk_size = int(conf.get('client_chunk_size', 65536))
self.log_headers = conf.get('log_headers', 'no').lower() in TRUE_VALUES
self.error_suppression_interval = \
int(conf.get('error_suppression_interval', 60))
self.error_suppression_limit = \
@@ -202,7 +193,7 @@ class BaseApplication(object):
return HTTPForbidden(request=req, body='Invalid host header')

self.logger.set_statsd_prefix('proxy-server.' +
controller.server_type)
controller.server_type.lower())
controller = controller(self, **path_parts)
if 'swift.trans_id' not in req.environ:
# if this wasn't set by an earlier middleware, set it now
@@ -216,7 +207,6 @@ class BaseApplication(object):
handler = getattr(controller, req.method)
getattr(handler, 'publicly_accessible')
except AttributeError:
self.logger.increment('method_not_allowed')
return HTTPMethodNotAllowed(request=req)
if path_parts['version']:
req.path_info_pop()
@@ -234,26 +224,17 @@ class BaseApplication(object):
# Response indicates denial, but we might delay the denial
# and recheck later. If not delayed, return the error now.
if not getattr(handler, 'delay_denial', None):
self.logger.increment('auth_short_circuits')
return resp
# Save off original request method (GET, POST, etc.) in case it
# gets mutated during handling. This way logging can display the
# method the client actually sent.
req.environ['swift.orig_req_method'] = req.method
return handler(req)
except (Exception, Timeout):
self.logger.exception(_('ERROR Unhandled exception in request'))
return HTTPServerError(request=req)


class Application(BaseApplication):
"""WSGI application for the proxy server."""

def handle_request(self, req):
"""
Wraps the BaseApplication.handle_request and logs the request.
"""
req.start_time = time.time()
req.response = super(Application, self).handle_request(req)
return req.response


def app_factory(global_conf, **local_conf):
"""paste.deploy app factory for creating WSGI proxy apps."""
conf = global_conf.copy()

@ -16,9 +16,12 @@
|
||||
import unittest
|
||||
from urllib import quote, unquote
|
||||
import cStringIO as StringIO
|
||||
from logging.handlers import SysLogHandler
|
||||
|
||||
from webob import Request
|
||||
|
||||
from test.unit import FakeLogger
|
||||
from swift.common.utils import get_logger
|
||||
from swift.common.middleware import proxy_logging
|
||||
|
||||
|
||||
@ -64,27 +67,129 @@ class FakeAppReadline(object):
|
||||
return ["FAKE APP"]
|
||||
|
||||
|
||||
class FakeLogger(object):
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.msg = ''
|
||||
|
||||
def info(self, string):
|
||||
self.msg = string
|
||||
|
||||
|
||||
def start_response(*args):
|
||||
pass
|
||||
|
||||
|
||||
class TestProxyLogging(unittest.TestCase):
|
||||
|
||||
def _log_parts(self, app):
|
||||
info_calls = app.access_logger.log_dict['info']
|
||||
self.assertEquals(1, len(info_calls))
|
||||
return info_calls[0][0][0].split(' ')
|
||||
|
||||
def assertTiming(self, exp_metric, app, exp_timing=None):
|
||||
timing_calls = app.access_logger.log_dict['timing']
|
||||
self.assertEquals(1, len(timing_calls))
|
||||
self.assertEquals({}, timing_calls[0][1])
|
||||
self.assertEquals(2, len(timing_calls[0][0]))
|
||||
self.assertEquals(exp_metric, timing_calls[0][0][0])
|
||||
if exp_timing is not None:
|
||||
self.assertEquals(exp_timing, timing_calls[0][0][1])
|
||||
|
||||
def assertUpdateStats(self, exp_metric, exp_bytes, app):
|
||||
update_stats_calls = app.access_logger.log_dict['update_stats']
|
||||
self.assertEquals(1, len(update_stats_calls))
|
||||
self.assertEquals({}, update_stats_calls[0][1])
|
||||
self.assertEquals((exp_metric, exp_bytes), update_stats_calls[0][0])
|
||||
|
||||
def test_log_request_statsd_invalid_stats_types(self):
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
|
||||
app.access_logger = FakeLogger()
|
||||
for url in ['/', '/foo', '/foo/bar', '/v1']:
|
||||
req = Request.blank(url, environ={'REQUEST_METHOD': 'GET'})
|
||||
resp = app(req.environ, start_response)
|
||||
resp_body = ''.join(resp)
|
||||
self.assertEquals(0, len(app.access_logger.log_dict['timing']))
|
||||
self.assertEquals(0,
|
||||
len(app.access_logger.log_dict['update_stats']))
|
||||
|
||||
def test_log_request_stat_type_bad(self):
|
||||
for bad_path in ['', '/', '/bad', '/baddy/mc_badderson', '/v1',
|
||||
'/v1/']:
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
|
||||
app.access_logger = FakeLogger()
|
||||
req = Request.blank(bad_path, environ={'REQUEST_METHOD': 'GET'})
|
||||
app.log_request(req.environ, 123, 7, 13, 2.71828182846, False)
|
||||
self.assertEqual([], app.access_logger.log_dict['timing'])
|
||||
self.assertEqual([], app.access_logger.log_dict['update_stats'])
|
||||
|
||||
def test_log_request_stat_type_good(self):
|
||||
path_types = {
|
||||
'/v1/a': 'account',
|
||||
'/v1/a/': 'account',
|
||||
'/v1/a/c': 'container',
|
||||
'/v1/a/c/': 'container',
|
||||
'/v1/a/c/o': 'object',
|
||||
'/v1/a/c/o/': 'object',
|
||||
'/v1/a/c/o/p': 'object',
|
||||
'/v1/a/c/o/p/': 'object',
|
||||
'/v1/a/c/o/p/p2': 'object',
|
||||
}
|
||||
for path, exp_type in path_types.iteritems():
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
|
||||
app.access_logger = FakeLogger()
|
||||
req = Request.blank(path, environ={'REQUEST_METHOD': 'GET'})
|
||||
app.log_request(req.environ, 321, 7, 13, 2.71828182846, False)
|
||||
self.assertTiming('%s.GET.321.timing' % exp_type, app,
|
||||
exp_timing=2.71828182846 * 1000)
|
||||
self.assertUpdateStats('%s.GET.321.xfer' % exp_type, 7 + 13, app)
|
||||
|
||||
def test_log_request_stat_method_filtering_default(self):
|
||||
method_map = {
|
||||
'foo': 'BAD_METHOD',
|
||||
'': 'BAD_METHOD',
|
||||
'PUTT': 'BAD_METHOD',
|
||||
'SPECIAL': 'BAD_METHOD',
|
||||
'GET': 'GET',
|
||||
'PUT': 'PUT',
|
||||
'COPY': 'COPY',
|
||||
'HEAD': 'HEAD',
|
||||
'POST': 'POST',
|
||||
'DELETE': 'DELETE',
|
||||
}
|
||||
for method, exp_method in method_map.iteritems():
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
|
||||
app.access_logger = FakeLogger()
|
||||
req = Request.blank('/v1/a/', environ={'REQUEST_METHOD': method})
|
||||
app.log_request(req.environ, 299, 11, 3, 1.17, False)
|
||||
self.assertTiming('account.%s.299.timing' % exp_method, app,
|
||||
exp_timing=1.17 * 1000)
|
||||
self.assertUpdateStats('account.%s.299.xfer' % exp_method,
|
||||
11 + 3, app)
|
||||
|
||||
def test_log_request_stat_method_filtering_custom(self):
|
||||
method_map = {
|
||||
'foo': 'BAD_METHOD',
|
||||
'': 'BAD_METHOD',
|
||||
'PUTT': 'BAD_METHOD',
|
||||
'SPECIAL': 'SPECIAL', # will be configured
|
||||
'GET': 'GET',
|
||||
'PUT': 'PUT',
|
||||
'COPY': 'BAD_METHOD', # prove no one's special
|
||||
}
|
||||
# this conf var supports optional leading access_
|
||||
for conf_key in ['access_log_statsd_valid_http_methods',
|
||||
'log_statsd_valid_http_methods']:
|
||||
for method, exp_method in method_map.iteritems():
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {
|
||||
conf_key: 'SPECIAL, GET,PUT ', # crazy spaces ok
|
||||
})
|
||||
app.access_logger = FakeLogger()
|
||||
req = Request.blank('/v1/a/c', environ={'REQUEST_METHOD': method})
|
||||
app.log_request(req.environ, 911, 4, 43, 1.01, False)
|
||||
self.assertTiming('container.%s.911.timing' % exp_method, app,
|
||||
exp_timing=1.01 * 1000)
|
||||
self.assertUpdateStats('container.%s.911.xfer' % exp_method,
|
||||
4 + 43, app)
|
||||
|
||||
def test_basic_req(self):
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
|
||||
app.access_logger = FakeLogger()
|
||||
req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
|
||||
resp = app(req.environ, start_response)
|
||||
resp_body = ''.join(resp)
|
||||
log_parts = app.access_logger.msg.split()
|
||||
log_parts = self._log_parts(app)
|
||||
self.assertEquals(log_parts[3], 'GET')
|
||||
self.assertEquals(log_parts[4], '/')
|
||||
self.assertEquals(log_parts[5], 'HTTP/1.0')
|
||||
@ -96,16 +201,18 @@ class TestProxyLogging(unittest.TestCase):
|
||||
app = proxy_logging.ProxyLoggingMiddleware(FakeApp(
|
||||
['some', 'chunks', 'of data']), {})
|
||||
app.access_logger = FakeLogger()
|
||||
req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
|
||||
req = Request.blank('/', environ={'REQUEST_METHOD': 'GET',
|
||||
'swift.source': 'SOS'})
|
||||
resp = app(req.environ, start_response)
|
||||
resp_body = ''.join(resp)
|
||||
log_parts = app.access_logger.msg.split()
|
||||
log_parts = self._log_parts(app)
|
||||
self.assertEquals(log_parts[3], 'GET')
|
||||
self.assertEquals(log_parts[4], '/')
|
||||
self.assertEquals(log_parts[5], 'HTTP/1.0')
|
||||
self.assertEquals(log_parts[6], '200')
|
||||
self.assertEquals(resp_body, 'somechunksof data')
|
||||
self.assertEquals(log_parts[11], str(len(resp_body)))
|
||||
self.assertUpdateStats('SOS.GET.200.xfer', len(resp_body), app)
|
||||
|
||||
     def test_log_headers(self):
         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(),
@@ -114,7 +221,7 @@ class TestProxyLogging(unittest.TestCase):
         req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
         resp = app(req.environ, start_response)
         exhaust_generator = [x for x in resp]
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         headers = unquote(log_parts[14]).split('\n')
         self.assert_('Host: localhost:80' in headers)

@@ -122,24 +229,32 @@ class TestProxyLogging(unittest.TestCase):
         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(),
                                                    {'log_headers': 'yes'})
         app.access_logger = FakeLogger()
-        req = Request.blank('/', environ={'REQUEST_METHOD': 'GET',
+        req = Request.blank('/v1/a/c/o/foo', environ={'REQUEST_METHOD': 'PUT',
             'wsgi.input': StringIO.StringIO('some stuff')})
         resp = app(req.environ, start_response)
         exhaust_generator = [x for x in resp]
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[11], str(len('FAKE APP')))
         self.assertEquals(log_parts[10], str(len('some stuff')))
+        self.assertUpdateStats('object.PUT.200.xfer',
+                               len('some stuff') + len('FAKE APP'),
+                               app)

     def test_upload_line(self):
         app = proxy_logging.ProxyLoggingMiddleware(FakeAppReadline(),
                                                    {'log_headers': 'yes'})
         app.access_logger = FakeLogger()
-        req = Request.blank('/', environ={'REQUEST_METHOD': 'GET',
+        req = Request.blank('/v1/a/c', environ={'REQUEST_METHOD': 'POST',
             'wsgi.input': StringIO.StringIO(
                 'some stuff\nsome other stuff\n')})
         resp = app(req.environ, start_response)
-        exhaust_generator = ''.join(resp)
-        log_parts = app.access_logger.msg.split()
+        exhaust_generator = [x for x in resp]
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[11], str(len('FAKE APP')))
         self.assertEquals(log_parts[10], str(len('some stuff\n')))
+        self.assertUpdateStats('container.POST.200.xfer',
+                               len('some stuff\n') + len('FAKE APP'),
+                               app)

     def test_log_query_string(self):
         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
@@ -148,7 +263,7 @@ class TestProxyLogging(unittest.TestCase):
                                           'QUERY_STRING': 'x=3'})
         resp = app(req.environ, start_response)
         exhaust_generator = [x for x in resp]
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(unquote(log_parts[4]), '/?x=3')

     def test_client_logging(self):
@@ -158,38 +273,41 @@ class TestProxyLogging(unittest.TestCase):
                                           'REMOTE_ADDR': '1.2.3.4'})
         resp = app(req.environ, start_response)
         exhaust_generator = [x for x in resp]
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[0], '1.2.3.4')  # client ip
         self.assertEquals(log_parts[1], '1.2.3.4')  # remote addr

     def test_proxy_client_logging(self):
         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
         app.access_logger = FakeLogger()
-        req = Request.blank('/', environ={'REQUEST_METHOD': 'GET',
-                                          'REMOTE_ADDR': '1.2.3.4',
-                                          'HTTP_X_FORWARDED_FOR': '4.5.6.7,8.9.10.11'
-                                          })
+        req = Request.blank('/', environ={
+            'REQUEST_METHOD': 'GET',
+            'REMOTE_ADDR': '1.2.3.4',
+            'HTTP_X_FORWARDED_FOR': '4.5.6.7,8.9.10.11'})
         resp = app(req.environ, start_response)
         exhaust_generator = [x for x in resp]
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[0], '4.5.6.7')  # client ip
         self.assertEquals(log_parts[1], '1.2.3.4')  # remote addr

         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
         app.access_logger = FakeLogger()
-        req = Request.blank('/', environ={'REQUEST_METHOD': 'GET',
-                                          'REMOTE_ADDR': '1.2.3.4',
-                                          'HTTP_X_CLUSTER_CLIENT_IP': '4.5.6.7'
-                                          })
+        req = Request.blank('/', environ={
+            'REQUEST_METHOD': 'GET',
+            'REMOTE_ADDR': '1.2.3.4',
+            'HTTP_X_CLUSTER_CLIENT_IP': '4.5.6.7'})
         resp = app(req.environ, start_response)
         exhaust_generator = [x for x in resp]
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[0], '4.5.6.7')  # client ip
         self.assertEquals(log_parts[1], '1.2.3.4')  # remote addr

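test_proxy_client_logging above pins down that a proxy-supplied header (X-Cluster-Client-Ip, or the first X-Forwarded-For entry) wins over REMOTE_ADDR for the access line's client-IP field. A standalone sketch of that selection, as an illustration of the behaviour the test asserts rather than the middleware's actual code (the relative order of the two headers is an assumption, since each test case sets only one of them):

    def client_ip_from_environ(environ):
        # Prefer X-Cluster-Client-Ip, then the first X-Forwarded-For hop,
        # then fall back to the socket-level REMOTE_ADDR.
        client = environ.get('HTTP_X_CLUSTER_CLIENT_IP')
        if not client and environ.get('HTTP_X_FORWARDED_FOR'):
            client = environ['HTTP_X_FORWARDED_FOR'].split(',')[0].strip()
        return client or environ.get('REMOTE_ADDR')

    # Matches the assertions above:
    #   X-Forwarded-For '4.5.6.7,8.9.10.11' with REMOTE_ADDR '1.2.3.4' -> '4.5.6.7'
    #   X-Cluster-Client-Ip '4.5.6.7'       with REMOTE_ADDR '1.2.3.4' -> '4.5.6.7'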
     def test_facility(self):
         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(),
-            {'log_headers': 'yes', 'access_log_facility': 'whatever'})
+            {'log_headers': 'yes',
+             'access_log_facility': 'LOG_LOCAL7'})
         handler = get_logger.handler4logger[app.access_logger.logger]
         self.assertEquals(SysLogHandler.LOG_LOCAL7, handler.facility)

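test_facility now feeds in a real facility name, `LOG_LOCAL7`, and expects the configured string to resolve to the matching SysLogHandler constant. A plausible sketch of that lookup, assuming a getattr-style resolution with a LOG_LOCAL0 fallback (the actual resolution happens inside Swift's get_logger()):

    from logging.handlers import SysLogHandler

    def resolve_facility(conf_value, default='LOG_LOCAL0'):
        # Map a config string such as 'LOG_LOCAL7' onto the SysLogHandler
        # constant of the same name; unknown values (like the removed
        # 'whatever') fall back to the default facility.
        return getattr(SysLogHandler, conf_value,
                       getattr(SysLogHandler, default))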
     def test_filter(self):
         factory = proxy_logging.filter_factory({})
@@ -198,13 +316,13 @@ class TestProxyLogging(unittest.TestCase):

     def test_unread_body(self):
         app = proxy_logging.ProxyLoggingMiddleware(
-            FakeApp(['some', 'stuff']), {})
+            FakeApp(['some', 'stuff']), {})
         app.access_logger = FakeLogger()
         req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
         resp = app(req.environ, start_response)
         read_first_chunk = next(resp)
         resp.close()  # raise a GeneratorExit in middleware app_iter loop
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[6], '499')
         self.assertEquals(log_parts[11], '4')  # write length

@@ -218,13 +336,13 @@ class TestProxyLogging(unittest.TestCase):
             body = ''.join(resp)
         except Exception:
             pass
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[6], '499')
         self.assertEquals(log_parts[10], '-')  # read length

     def test_disconnect_on_read(self):
         app = proxy_logging.ProxyLoggingMiddleware(
-            FakeApp(['some', 'stuff']), {})
+            FakeApp(['some', 'stuff']), {})
         app.access_logger = FakeLogger()
         req = Request.blank('/', environ={'REQUEST_METHOD': 'GET',
                                           'wsgi.input': FileLikeExceptor()})
@@ -233,17 +351,64 @@ class TestProxyLogging(unittest.TestCase):
             body = ''.join(resp)
         except Exception:
             pass
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[6], '499')
         self.assertEquals(log_parts[10], '-')  # read length

-    def test_no_content_length_no_transfer_encoding(self):
+    def test_no_content_length_no_transfer_encoding_with_list_body(self):
         app = proxy_logging.ProxyLoggingMiddleware(
-            FakeAppNoContentLengthNoTransferEncoding(), {})
+            FakeAppNoContentLengthNoTransferEncoding(
+                # test the "while not chunk: chunk = iterator.next()"
+                body=['', '', 'line1\n', 'line2\n'],
+            ), {})
         app.access_logger = FakeLogger()
         req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
         resp = app(req.environ, start_response)
-        body = ''.join(resp)
+        resp_body = ''.join(resp)
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[3], 'GET')
         self.assertEquals(log_parts[4], '/')
         self.assertEquals(log_parts[5], 'HTTP/1.0')
         self.assertEquals(log_parts[6], '200')
+        self.assertEquals(resp_body, 'line1\nline2\n')
+        self.assertEquals(log_parts[11], str(len(resp_body)))

+    def test_no_content_length_no_transfer_encoding_with_empty_strings(self):
+        app = proxy_logging.ProxyLoggingMiddleware(
+            FakeAppNoContentLengthNoTransferEncoding(
+                # test the "while not chunk: chunk = iterator.next()"
+                body=['', '', ''],
+            ), {})
+        app.access_logger = FakeLogger()
+        req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
+        resp = app(req.environ, start_response)
+        resp_body = ''.join(resp)
+        log_parts = self._log_parts(app)
+        self.assertEquals(log_parts[3], 'GET')
+        self.assertEquals(log_parts[4], '/')
+        self.assertEquals(log_parts[5], 'HTTP/1.0')
+        self.assertEquals(log_parts[6], '200')
+        self.assertEquals(resp_body, '')
+        self.assertEquals(log_parts[11], '-')

+    def test_no_content_length_no_transfer_encoding_with_str_body(self):
+        app = proxy_logging.ProxyLoggingMiddleware(
+            FakeAppNoContentLengthNoTransferEncoding(
+                body='line1\nline2\n',
+            ), {})
+        app.access_logger = FakeLogger()
+        req = Request.blank('/', environ={'REQUEST_METHOD': 'GET'})
+        resp = app(req.environ, start_response)
+        # Python 2.7 can have assertRaises act as a context manager, but python
+        # 2.6 can't. So there's this.
+        try:
+            resp_body = ''.join(resp)
+        except Exception as e:
+            self.assertEquals(
+                "WSGI [proxy-logging]: No content-length or transfer-encoding "
+                "header sent and there is content! 'l'", str(e))
+        else:
+            self.assert_(False)

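The three tests above all target the branch named in their inline comment: when the wrapped app sends neither Content-Length nor Transfer-Encoding, the middleware first skips leading empty chunks to learn whether there is any body at all. A rough, standalone sketch of that first-chunk probe and of what each test body produces (assumed shape, not the middleware's exact code):

    def first_nonempty_chunk(iterable):
        # Advance past leading empty strings; an exhausted iterator means
        # the response genuinely has no body.
        iterator = iter(iterable)
        try:
            chunk = next(iterator)
            while not chunk:
                chunk = next(iterator)
        except StopIteration:
            chunk = ''
        return chunk, iterator

    # body=['', '', 'line1\n', 'line2\n'] -> first chunk 'line1\n' (content,
    #                                        so a byte count gets logged)
    # body=['', '', '']                   -> first chunk '' (logged as '-')
    # body='line1\nline2\n'               -> a bare str is iterated per
    #                                        character, so the first chunk is
    #                                        'l', which is why the exception
    #                                        message asserted above ends
    #                                        with "'l'".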
     def test_req_path_info_popping(self):
         app = proxy_logging.ProxyLoggingMiddleware(FakeApp(), {})
@@ -253,7 +418,7 @@ class TestProxyLogging(unittest.TestCase):
         self.assertEquals(req.environ['PATH_INFO'], '/something')
         resp = app(req.environ, start_response)
         resp_body = ''.join(resp)
-        log_parts = app.access_logger.msg.split()
+        log_parts = self._log_parts(app)
         self.assertEquals(log_parts[3], 'GET')
         self.assertEquals(log_parts[4], '/v1/something')
         self.assertEquals(log_parts[5], 'HTTP/1.0')

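From this point the diff moves into test/unit/proxy/test_server.py, where every direct use of the removed BaseApplication class becomes proxy_server.Application. A minimal sketch of the construction pattern the updated tests share, using the module's existing test doubles; the conf keys, logger argument, and positional-versus-keyword ring arguments vary per test as the hunks below show, and the request in this example is illustrative only:

    # Sketch only: argument order follows the calls visible in the hunks below.
    baseapp = proxy_server.Application(
        {},                       # proxy-server conf dict
        FakeMemcache(),
        account_ring=FakeRing(),
        container_ring=FakeRing(),
        object_ring=FakeRing())
    resp = baseapp.handle_request(
        Request.blank('/v1/a', environ={'REQUEST_METHOD': 'GET'}))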
@@ -597,7 +597,7 @@ class TestProxyServer(unittest.TestCase):
         self.assertEquals(resp.status_int, 500)

     def test_internal_method_request(self):
-        baseapp = proxy_server.BaseApplication({},
+        baseapp = proxy_server.Application({},
             FakeMemcache(), container_ring=FakeRing(), object_ring=FakeRing(),
             account_ring=FakeRing())
         resp = baseapp.handle_request(
@@ -605,7 +605,7 @@ class TestProxyServer(unittest.TestCase):
         self.assertEquals(resp.status, '405 Method Not Allowed')

     def test_inexistent_method_request(self):
-        baseapp = proxy_server.BaseApplication({},
+        baseapp = proxy_server.Application({},
             FakeMemcache(), container_ring=FakeRing(), account_ring=FakeRing(),
             object_ring=FakeRing())
         resp = baseapp.handle_request(
@@ -646,7 +646,7 @@ class TestProxyServer(unittest.TestCase):
     def test_negative_content_length(self):
         swift_dir = mkdtemp()
         try:
-            baseapp = proxy_server.BaseApplication({'swift_dir': swift_dir},
+            baseapp = proxy_server.Application({'swift_dir': swift_dir},
                 FakeMemcache(), FakeLogger(), FakeRing(), FakeRing(),
                 FakeRing())
             resp = baseapp.handle_request(
@@ -663,7 +663,7 @@ class TestProxyServer(unittest.TestCase):
     def test_denied_host_header(self):
         swift_dir = mkdtemp()
         try:
-            baseapp = proxy_server.BaseApplication({'swift_dir': swift_dir,
+            baseapp = proxy_server.Application({'swift_dir': swift_dir,
                 'deny_host_headers': 'invalid_host.com'},
                 FakeMemcache(), FakeLogger(), FakeRing(), FakeRing(),
                 FakeRing())
@@ -774,7 +774,7 @@ class TestObjectController(unittest.TestCase):
         try:
             with open(os.path.join(swift_dir, 'mime.types'), 'w') as fp:
                 fp.write('foo/bar foo\n')
-            ba = proxy_server.BaseApplication({'swift_dir': swift_dir},
+            ba = proxy_server.Application({'swift_dir': swift_dir},
                 FakeMemcache(), FakeLogger(), FakeRing(), FakeRing(),
                 FakeRing())
             self.assertEquals(proxy_server.mimetypes.guess_type('blah.foo')[0],