Commit 60918e63, authored Jul 15, 2022 by Jérome Perrin

Merge remote-tracking branch 'upstream/master' into zope4py2

Parents: 59843d01, f7b39cc0
14 changed files with 422 additions and 381 deletions:

  component/qemu-kvm/buildout.cfg                                                 +2   -1
  software/caddy-frontend/buildout.hash.cfg                                       +8   -8
  software/caddy-frontend/caddyprofiledummy.py                                    +7   -7
  software/caddy-frontend/instance-apache-frontend.cfg.in                        +13   -8
  software/caddy-frontend/instance-apache-replicate.cfg.in                       +12  -12
  software/caddy-frontend/instance-kedifa.cfg.in                                 +19   -9
  software/caddy-frontend/instance.cfg.in                                         +2   -1
  software/caddy-frontend/software.cfg                                            +3   -1
  software/caddy-frontend/templates/apache-custom-slave-list.cfg.in              +16  -11
  software/caddy-frontend/templates/replicate-publish-slave-information.cfg.in    +7   -7
  software/caddy-frontend/templates/slave-introspection-httpd-nginx.conf.in       +1   -1
  software/caddy-frontend/test/test.py                                          +331 -314
  software/slapos-sr-testing/software-py3.cfg                                     +1   -0
  software/slapos-sr-testing/software.cfg                                         +0   -1
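Most of the caddy-frontend changes merged here are Python 2 to Python 3 compatibility updates. As a quick orientation (this note and sketch are not part of the commit itself), the recurring substitutions in the hunks below boil down to the following, in Python 3:

    # Hypothetical illustration of the API substitutions applied throughout this diff.
    import urllib.parse   # replaces the Python 2 urlparse module (and urllib.unquote)
    import http.client    # replaces httplib; e.g. http.client.OK == 200
    import io             # io.BytesIO replaces StringIO.StringIO for binary buffers

    parsed = urllib.parse.urlparse('https://example.com:4443/path')
    assert (parsed.hostname, parsed.port) == ('example.com', 4443)

    d = {'a': 1, 'b': 2}
    for key, value in d.items():    # dict.iteritems() is gone on Python 3
        pass
    keys = list(d.keys())           # .keys() is now a view; wrap in list() before mutating

    name = str('common name')       # unicode() / u'' literals become plain str
    payload = name.encode()         # explicit str <-> bytes conversion where I/O needs bytes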
component/qemu-kvm/buildout.cfg

@@ -32,6 +32,7 @@ md5sum = bfb5b09a0d1f887c8c42a6d5f26971ab
 patches =
   https://gitlab.com/redhat/centos-stream/src/qemu-kvm/-/merge_requests/87.diff#ad41b138aa6f330f95811c9a83637b85
 patch-options = -p1
+patch-binary = ${patch:location}/bin/patch
 pre-configure =
   sed -i '/^libmigration\b/s/$/ dependencies: [zlib],/' meson.build
   sed -i 's/\bsnappy,/zlib, \0/' dump/meson.build
@@ -59,7 +60,7 @@ configure-options =
 environment =
   CFLAGS=-I${librbd:location}/include/ -I${gettext:location}/include -I${libaio:location}/include -I${liburing:location}/include -I${libcap-ng:location}/include
   LDFLAGS=-L${librbd:location}/lib -Wl,-rpath=${librbd:location}/lib -L${gettext:location}/lib -L${libaio:location}/lib -L${libcap-ng:location}/lib -Wl,-rpath=${libcap-ng:location}/lib -Wl,-rpath=${glib:location}/lib -Wl,-rpath=${gnutls:location}/lib -Wl,-rpath=${nettle:location}/lib -Wl,-rpath=${pixman:location}/lib -Wl,-rpath=${zlib:location}/lib -Wl,-rpath=${gettext:location}/lib -Wl,-rpath=${libpng:location}/lib -Wl,-rpath=${libaio:location}/lib -Wl,-rpath=${liburing:location}/lib -Wl,-rpath=${libcap-ng:location}/lib
-  PATH=${patch:location}/bin:${pkgconfig:location}/bin:${bzip2:location}/bin:%(PATH)s
+  PATH=${pkgconfig:location}/bin:${bzip2:location}/bin:%(PATH)s
   PKG_CONFIG_PATH=${glib:location}/lib/pkgconfig:${gnutls:location}/lib/pkgconfig:${gnutls:pkg-config-path}:${libpng:location}/lib/pkgconfig:${liburing:location}/lib/pkgconfig:${ncurses:location}/lib/pkgconfig:${pcre:location}/lib/pkgconfig:${pixman:location}/lib/pkgconfig:${librbd:location}/lib/pkgconfig

 [qemu:sys.version_info < (3,6)]
software/caddy-frontend/buildout.hash.cfg

@@ -14,7 +14,7 @@
 # not need these here).
 [template]
 filename = instance.cfg.in
-md5sum = 051ae51b86f9aba169a6777fa2239901
+md5sum = f1f04e7f27bc6e40a655dd4badb2a8af

 [profile-common]
 filename = instance-common.cfg.in
@@ -22,19 +22,19 @@ md5sum = 5784bea3bd608913769ff9a8afcccb68
 [profile-caddy-frontend]
 filename = instance-apache-frontend.cfg.in
-md5sum = 1e912fb970401a4b7670b25ba8284a5b
+md5sum = 874133120f3a4eda1d0505b8608b280f

 [profile-caddy-replicate]
 filename = instance-apache-replicate.cfg.in
-md5sum = 57388e76c7e61b3d7213df8aac0b407d
+md5sum = 02a10d92d2b0e270454998cf865b6895

 [profile-slave-list]
 _update_hash_filename_ = templates/apache-custom-slave-list.cfg.in
-md5sum = 964a7f673f441f3a3e90c88ab03e3351
+md5sum = 268a945e5c7a52c8766b54a817215c4c

 [profile-replicate-publish-slave-information]
 _update_hash_filename_ = templates/replicate-publish-slave-information.cfg.in
-md5sum = be54431846fe7f3cee65260eefc83d62
+md5sum = b3422f3624054f57b78d0e50a0de399a

 [profile-caddy-frontend-configuration]
 _update_hash_filename_ = templates/Caddyfile.in
@@ -98,11 +98,11 @@ md5sum = f6f72d03af7d9dc29fb4d4fef1062e73
 [caddyprofiledeps-dummy]
 filename = caddyprofiledummy.py
-md5sum = b41b8de115ad815d0b0db306ad650365
+md5sum = 1c866272ec0ea0c161f0c0d80cb6e584

 [profile-kedifa]
 filename = instance-kedifa.cfg.in
-md5sum = b5426129668f39ace55f14012c4a2fd2
+md5sum = 2f1c9cc9a3d2f4c6ac59eba5a99d4983

 [template-backend-haproxy-rsyslogd-conf]
 _update_hash_filename_ = templates/backend-haproxy-rsyslogd.conf.in
@@ -110,7 +110,7 @@ md5sum = 3336d554661b138dcef97b1d1866803c
 [template-slave-introspection-httpd-nginx]
 _update_hash_filename_ = templates/slave-introspection-httpd-nginx.conf.in
-md5sum = 3067e6ba6c6901821d57d2109517d39c
+md5sum = b79addf01b6fb93c2f3d018e83eff766

 [template-expose-csr-nginx-conf]
 _update_hash_filename_ = templates/expose-csr-nginx.conf.in
software/caddy-frontend/caddyprofiledummy.py

 from __future__ import print_function
 import caucase.client
 import caucase.utils
 import os
 import ssl
 import sys
-import urllib
-import urlparse
+import urllib.request, urllib.parse, urllib.error
+import urllib.parse
 from cryptography import x509
 from cryptography.hazmat.primitives import serialization
@@ -24,7 +24,7 @@ class Recipe(object):
 def validate_netloc(netloc):
   # a bit crazy way to validate that the passed parameter is haproxy
   # compatible server netloc
-  parsed = urlparse.urlparse('scheme://' + netloc)
+  parsed = urllib.parse.urlparse('scheme://' + netloc)
   if ':' in parsed.hostname:
     hostname = '[%s]' % parsed.hostname
   else:
@@ -33,7 +33,7 @@ def validate_netloc(netloc):
 def _check_certificate(url, certificate):
-  parsed = urlparse.urlparse(url)
+  parsed = urllib.parse.urlparse(url)
   got_certificate = ssl.get_server_certificate((parsed.hostname, parsed.port))
   if certificate.strip() != got_certificate.strip():
     raise ValueError('Certificate for %s does not match expected one' % (url,))
@@ -44,7 +44,7 @@ def _get_exposed_csr(url, certificate):
   self_signed = ssl.create_default_context()
   self_signed.check_hostname = False
   self_signed.verify_mode = ssl.CERT_NONE
-  return urllib.urlopen(url, context=self_signed).read()
+  return urllib.request.urlopen(url, context=self_signed).read().decode()


 def _get_caucase_client(ca_url, ca_crt, user_key):
@@ -72,7 +72,7 @@ def _csr_match(*csr_list):
   number_list = set([])
   for csr in csr_list:
     number_list.add(
-      x509.load_pem_x509_csr(str(csr)).public_key().public_numbers())
+      x509.load_pem_x509_csr(csr.encode()).public_key().public_numbers())
   return len(number_list) == 1
software/caddy-frontend/instance-apache-frontend.cfg.in

@@ -99,7 +99,7 @@ hash-salt = ${frontend-node-private-salt:value}
 init =
   import hashlib
   import base64
-  options['value'] = base64.urlsafe_b64encode(hashlib.md5(''.join([options['software-release-url'].strip(), options['hash-salt']])).digest())
+  options['value'] = base64.urlsafe_b64encode(hashlib.md5(''.join([options['software-release-url'].strip(), options['hash-salt']]).encode()).digest()).decode()

 [frontend-node-information]
 recipe = slapos.recipe.build
@@ -359,9 +359,9 @@ partition_ipv6 = ${slap-configuration:ipv6-random}
 extra-context =
   key caddy_configuration_directory caddy-directory:slave-configuration
   key backend_client_caucase_url :backend-client-caucase-url
-  import urlparse_module urlparse
   import furl_module furl
   import urllib_module urllib
+  import operator_module operator
   key master_key_download_url :master_key_download_url
   key autocert caddy-directory:autocert
   key caddy_log_directory caddy-directory:slave-log
@@ -475,9 +475,14 @@ slave-introspection-graceful-command = ${slave-introspection-validate:output} &&
 # BBB: SlapOS Master non-zero knowledge BEGIN
 [get-self-signed-fallback-access]
-recipe = collective.recipe.shelloutput
-commands =
-  certificate = cat ${self-signed-fallback-access:certificate}
+recipe = slapos.recipe.build
+certificate-file = ${self-signed-fallback-access:certificate}
+init =
+  import os
+  options['certificate'] = ''
+  if os.path.exists(options['certificate-file']):
+    with open(options['certificate-file'], 'r') as fh:
+      options['certificate'] = fh.read()

 [apache-certificate]
 recipe = slapos.recipe.template:jinja2
@@ -1066,7 +1071,7 @@ config-command =
   ${logrotate:wrapper-path} -d

 [configuration]
-{%- for key, value in instance_parameter_dict.iteritems() -%}
+{%- for key, value in instance_parameter_dict.items() -%}
 {%- if key.startswith('configuration.') %}
 {{ key.replace('configuration.', '') }} = {{ dumps(value) }}
 {%- endif -%}
@@ -1076,13 +1081,13 @@ config-command =
 {#- There are dangerous keys like recipe, etc #}
 {#- XXX: Some other approach would be useful #}
 {%- set DROP_KEY_LIST = ['recipe', '__buildout_signature__', 'computer', 'partition', 'url', 'key', 'cert'] %}
-{%- for key, value in instance_parameter_dict.iteritems() -%}
+{%- for key, value in instance_parameter_dict.items() -%}
 {%- if not key.startswith('configuration.') and key not in DROP_KEY_LIST %}
 {{ key }} = {{ dumps(value) }}
 {%- endif -%}
 {%- endfor %}

 [software-parameter-section]
-{%- for key, value in software_parameter_dict.iteritems() %}
+{%- for key, value in software_parameter_dict.items() %}
 {{ key }} = {{ dumps(value) }}
 {%- endfor %}
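The [get-self-signed-fallback-access] hunk above (and the similar [expose-csr-certificate-get] and [auth-random] hunks later in this merge) replaces collective.recipe.shelloutput, which shelled out to cat, with slapos.recipe.build, whose init = option is Python code run at buildout time with an options mapping, as used in the hunks above. A rough, hypothetical standalone version of that pattern (names and the helper function are mine, not from the repository):

    # Hypothetical standalone sketch of the "init =" snippets above: publish a
    # file's content as an option, or a default if the file does not exist yet.
    import os

    def read_option_file(options, key, path, default=''):
        """Set options[key] to the file content at path, or to default."""
        options[key] = default
        if os.path.exists(path):
            with open(path, 'r') as fh:
                options[key] = fh.read()
        return options

    options = {'certificate-file': '/path/that/may/not/exist.pem'}
    read_option_file(options, 'certificate', options['certificate-file'])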
software/caddy-frontend/instance-apache-replicate.cfg.in

@@ -129,7 +129,7 @@ context =
 {% set config_key = "-frontend-config-%s-" % i %}
 {% set config_key_length = config_key | length %}
 {% set config_dict = {} %}
-{% for key in slapparameter_dict.keys() %}
+{% for key in list(slapparameter_dict.keys()) %}
 {% if key.startswith(sla_key) %}
 {% do sla_dict.__setitem__(key[sla_key_length:], slapparameter_dict.pop(key)) %}
 # We check for specific configuration regarding the frontend
@@ -164,7 +164,7 @@ context =
 {% set critical_rejected_slave_dict = {} %}
 {% set warning_slave_dict = {} %}
 {% set used_host_list = [] %}
-{% for slave in sorted(instance_parameter_dict['slave-instance-list']) %}
+{% for slave in sorted(instance_parameter_dict['slave-instance-list'], key=operator_module.itemgetter('slave_reference')) %}
 {% set slave_error_list = [] %}
 {% set slave_critical_error_list = [] %}
 {% set slave_warning_list = [] %}
@@ -278,7 +278,7 @@ context =
 {% if k in slave %}
 {% set crt = slave.get(k, '') %}
 {% set check_popen = popen([software_parameter_dict['openssl'], 'x509', '-noout']) %}
-{% do check_popen.communicate(crt) %}
+{% do check_popen.communicate(crt.encode()) %}
 {% if check_popen.returncode != 0 %}
 {% do slave_error_list.append('%s is invalid' % (k,)) %}
 {% endif %}
@@ -296,8 +296,8 @@ context =
 {% if slave.get('ssl_key') and slave.get('ssl_crt') %}
 {% set key_popen = popen([software_parameter_dict['openssl'], 'rsa', '-noout', '-modulus']) %}
 {% set crt_popen = popen([software_parameter_dict['openssl'], 'x509', '-noout', '-modulus']) %}
-{% set key_modulus = key_popen.communicate(slave['ssl_key'])[0] | trim %}
-{% set crt_modulus = crt_popen.communicate(slave['ssl_crt'])[0] | trim %}
+{% set key_modulus = key_popen.communicate(slave['ssl_key'].encode())[0] | trim %}
+{% set crt_modulus = crt_popen.communicate(slave['ssl_crt'].encode())[0] | trim %}
 {% if not key_modulus or key_modulus != crt_modulus %}
 {% do slave_error_list.append('slave ssl_key and ssl_crt does not match') %}
 {% endif %}
@@ -334,7 +334,7 @@ context =
 {% do warning_slave_dict.__setitem__(slave.get('slave_reference'), sorted(slave_warning_list)) %}
 {% endif %}
 {% endfor %}
-{% do authorized_slave_list.sort() %}
+{% do authorized_slave_list.sort(key=operator_module.itemgetter('slave_reference')) %}

 [monitor-instance-parameter]
 monitor-httpd-port = {{ master_partition_monitor_monitor_httpd_port }}
@@ -356,7 +356,7 @@ return = slave-instance-information-list monitor-base-url backend-client-csr-url
 {%- do base_node_configuration_dict.__setitem__(key, slapparameter_dict[key]) %}
 {%- endif %}
 {%- endfor %}
-{% for section, frontend_request in request_dict.iteritems() %}
+{% for section, frontend_request in request_dict.items() %}
 {% set state = frontend_request.get('state', '') %}
 [{{section}}]
 <= replicate
@@ -377,14 +377,14 @@ config-cluster-identification = {{ instance_parameter_dict['root-instance-title'
 {# sort_keys are important in order to avoid shuffling parameters on each run #}
 {% do node_configuration_dict.__setitem__(slave_list_name, json_module.dumps(authorized_slave_list, sort_keys=True)) %}
 {% do node_configuration_dict.__setitem__("frontend-name", frontend_request.get('name')) %}
-{%- for config_key, config_value in node_configuration_dict.iteritems() %}
+{%- for config_key, config_value in node_configuration_dict.items() %}
 config-{{ config_key }} = {{ dumps(config_value) }}
 {% endfor -%}
-{%- for config_key, config_value in base_node_configuration_dict.iteritems() %}
+{%- for config_key, config_value in base_node_configuration_dict.items() %}
 config-{{ config_key }} = {{ dumps(config_value) }}
 {% endfor -%}
 {% if frontend_request.get('sla') %}
-{% for parameter, value in frontend_request.get('sla').iteritems() %}
+{% for parameter, value in frontend_request.get('sla').items() %}
 sla-{{ parameter }} = {{ value }}
 {% endfor %}
 {% endif %}
@@ -489,7 +489,7 @@ config-slave-list = {{ dumps(authorized_slave_list) }}
 config-cluster-identification = {{ instance_parameter_dict['root-instance-title'] }}

 {% set software_url_key = "-kedifa-software-release-url" %}
-{% if slapparameter_dict.has_key(software_url_key) %}
+{% if software_url_key in slapparameter_dict %}
 software-url = {{ slapparameter_dict.pop(software_url_key) }}
 {% else %}
 software-url = ${slap-connection:software-release-url}
@@ -499,7 +499,7 @@ name = kedifa
 return = slave-kedifa-information master-key-generate-auth-url master-key-upload-url master-key-download-url caucase-url kedifa-csr-url csr-certificate monitor-base-url
 {% set sla_kedifa_key = "-sla-kedifa-" %}
 {% set sla_kedifa_key_length = sla_kedifa_key | length %}
-{% for key in slapparameter_dict.keys() %}
+{% for key in list(slapparameter_dict.keys()) %}
 {% if key.startswith(sla_kedifa_key) %}
 sla-{{ key[sla_kedifa_key_length:] }} = {{ slapparameter_dict.pop(key) }}
 {% endif %}
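Several hunks above wrap slapparameter_dict.keys() in list(...) before iterating. That is not cosmetic: the loop body pops entries from the same dict, and on Python 3 iterating over the live .keys() view while the dict shrinks raises a RuntimeError. A minimal illustration of the behavior (plain Python, outside Jinja2, my own example):

    # Why list(d.keys()) is needed when the loop pops from the dict on Python 3.
    d = {'-sla-kedifa-computer_guid': 'COMP-1', 'other': 'kept'}

    # for key in d.keys():          # RuntimeError: dictionary changed size during iteration
    #     d.pop(key)

    for key in list(d.keys()):      # snapshot of the keys, safe to mutate d inside the loop
        if key.startswith('-sla-kedifa-'):
            d.pop(key)

    assert d == {'other': 'kept'}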
software/caddy-frontend/instance-kedifa.cfg.in

@@ -171,9 +171,14 @@ wrapper-path = ${directory:service}/expose-csr
 hash-existing-files = ${buildout:directory}/software_release/buildout.cfg

 [expose-csr-certificate-get]
-recipe = collective.recipe.shelloutput
-commands =
-  certificate = cat ${expose-csr-certificate:certificate}
+recipe = slapos.recipe.build
+certificate-file = ${expose-csr-certificate:certificate}
+init =
+  import os
+  options['certificate'] = ''
+  if os.path.exists(options['certificate-file']):
+    with open(options['certificate-file'], 'r') as fh:
+      options['certificate'] = fh.read()

 [jinja2-template-base]
 recipe = slapos.recipe.template:jinja2
@@ -259,10 +264,8 @@ command =
 update-command = ${:command}

 [{{ slave_reference }}-auth-random]
-recipe = collective.recipe.shelloutput
+<= auth-random
 file = {{ '${' + slave_reference }}-auth-random-generate:file}
-commands =
-  passwd = cat ${:file} 2>/dev/null || echo "NotReadyYet"
 {% endfor %}
@@ -273,11 +276,18 @@ command =
   [ ! -f ${:file} ] && {{ software_parameter_dict['curl'] }}/bin/curl -s -g -X POST https://[${kedifa-config:ip}]:${kedifa-config:port}/reserve-id --cert ${kedifa-config:certificate} --cacert ${kedifa-config:ca-certificate} > ${:file}.tmp && mv ${:file}.tmp ${:file}
 update-command = ${:command}

+[auth-random]
+recipe = slapos.recipe.build
+init =
+  import os
+  options['passwd'] = 'NotReadyYet'
+  if os.path.exists(options['file']):
+    with open(options['file'], 'r') as fh:
+      options['passwd'] = fh.read()
+
 [master-auth-random]
-recipe = collective.recipe.shelloutput
+<= auth-random
 file = ${master-auth-random-generate:file}
-commands =
-  passwd = cat ${:file} 2>/dev/null || echo "NotReadyYet"

 [slave-kedifa-information]
 recipe = slapos.cookbook:publish.serialised
software/caddy-frontend/instance.cfg.in

@@ -34,7 +34,7 @@ replicate = dynamic-profile-caddy-replicate:output
 kedifa = dynamic-profile-kedifa:output

 [software-parameter-section]
-{% for key,value in software_parameter_dict.iteritems() %}
+{% for key,value in software_parameter_dict.items() %}
 {{ key }} = {{ dumps(value) }}
 {% endfor -%}
@@ -54,6 +54,7 @@ filename = instance-caddy-replicate.cfg
 extra-context =
   import subprocess_module subprocess
   import functools_module functools
+  import operator_module operator
   import validators validators
   import caddyprofiledummy caddyprofiledummy
 # Must match the key id in [switch-softwaretype] which uses this section.
software/caddy-frontend/software.cfg

@@ -22,6 +22,9 @@ parts +=
   caddyprofiledeps
   kedifa

+[python]
+part = python3
+
 [kedifa]
 recipe = zc.recipe.egg
 eggs =
@@ -57,7 +60,6 @@ recipe = zc.recipe.egg
 eggs =
   caddyprofiledeps
   websockify
-  collective.recipe.shelloutput

 [profile-common]
 recipe = slapos.recipe.template:jinja2
software/caddy-frontend/templates/apache-custom-slave-list.cfg.in

@@ -52,13 +52,13 @@ context =
 {#- * setup defaults to simplify other profiles #}
 {#- * stabilise values for backend #}
 {%- for key, prefix in [('url', 'http_backend'), ('https-url', 'https_backend')] %}
-{%- set parsed = urlparse_module.urlparse(slave_instance.get(key, '').strip()) %}
+{%- set parsed = urllib_module.parse.urlparse(slave_instance.get(key, '').strip()) %}
 {%- set info_dict = {'scheme': parsed.scheme, 'hostname': parsed.hostname, 'port': parsed.port or DEFAULT_PORT[parsed.scheme], 'path': parsed.path, 'fragment': parsed.fragment, 'query': parsed.query, 'netloc-list': slave_instance.get(key + '-netloc-list', '').split() } %}
 {%- do slave_instance.__setitem__(prefix, info_dict) %}
 {%- endfor %}
 {%- do slave_instance.__setitem__('ssl_proxy_verify', ('' ~ slave_instance.get('ssl-proxy-verify', '')).lower() in TRUE_VALUES) %}
 {%- for key, prefix in [('health-check-failover-url', 'http_backend'), ('health-check-failover-https-url', 'https_backend')] %}
-{%- set parsed = urlparse_module.urlparse(slave_instance.get(key, '').strip()) %}
+{%- set parsed = urllib_module.parse.urlparse(slave_instance.get(key, '').strip()) %}
 {%- set info_dict = slave_instance[prefix] %}
 {%- do info_dict.__setitem__('health-check-failover-scheme', parsed.scheme) %}
 {%- do info_dict.__setitem__('health-check-failover-hostname', parsed.hostname) %}
@@ -189,7 +189,7 @@ context =
 {%- do furled.set(password = '${'+ slave_password_section +':passwd}') %}
 {%- do furled.set(path = slave_reference + '/') %}
 {#- We unquote, as furl quotes automatically, but there is buildout value on purpose like ${...:...} in the passwod #}
-{%- set slave_log_access_url = urlparse_module.unquote(furled.tostr()) %}
+{%- set slave_log_access_url = urllib_module.parse.unquote(furled.tostr()) %}
 {%- do slave_publish_dict.__setitem__('log-access', slave_log_access_url) %}
 {%- do slave_publish_dict.__setitem__('slave-reference', slave_reference) %}
 {%- do slave_publish_dict.__setitem__('backend-client-caucase-url', backend_client_caucase_url) %}
@@ -212,7 +212,7 @@ context =
 {%- for websocket_path in slave_instance.get('websocket-path-list', '').split() %}
 {%- set websocket_path = websocket_path.strip('/') %}
 {#- Unquote the path, so %20 and similar can be represented correctly #}
-{%- set websocket_path = urllib_module.unquote(websocket_path.strip()) %}
+{%- set websocket_path = urllib_module.parse.unquote(websocket_path.strip()) %}
 {%- if websocket_path %}
 {%- do websocket_path_list.append(websocket_path) %}
 {%- endif %}
@@ -332,7 +332,7 @@ http_port = {{ dumps('' ~ configuration['plain_http_port']) }}
 local_ipv4 = {{ dumps('' ~ instance_parameter_dict['ipv4-random']) }}
 version-hash = {{ version_hash }}
 node-id = {{ node_id }}
-{%- for key, value in slave_instance.iteritems() %}
+{%- for key, value in slave_instance.items() %}
 {%- if value is not none %}
 {{ key }} = {{ dumps(value) }}
 {%- endif %}
@@ -383,7 +383,7 @@ config-frequency = 720
 {%- do part_list.append(publish_section_title) %}
 [{{ publish_section_title }}]
 recipe = slapos.cookbook:publish
-{%- for key, value in slave_publish_dict.iteritems() %}
+{%- for key, value in slave_publish_dict.items() %}
 {{ key }} = {{ value }}
 {%- endfor %}
 {%- else %}
@@ -463,7 +463,7 @@ csr-certificate = ${expose-csr-certificate-get:certificate}
 {%- do furled.set(password = backend_haproxy_configuration['statistic-password']) %}
 {%- do furled.set(path = '/') %}
 {#- We unquote, as furl quotes automatically, but there is buildout value on purpose like ${...:...} in the passwod #}
-{%- set statistic_url = urlparse_module.unquote(furled.tostr()) %}
+{%- set statistic_url = urllib_module.parse.unquote(furled.tostr()) %}
 backend-haproxy-statistic-url = {{ statistic_url }}
 {#- sort_keys are important in order to avoid shuffling parameters on each run #}
 node-information-json = {{ json_module.dumps(node_information, sort_keys=True) }}
@@ -503,7 +503,7 @@ output = ${:file}
 < = jinja2-template-base
 url = {{ template_backend_haproxy_configuration }}
 output = ${backend-haproxy-config:file}
-backend_slave_list = {{ dumps(sorted(backend_slave_list)) }}
+backend_slave_list = {{ dumps(sorted(backend_slave_list, key=operator_module.itemgetter('slave_reference'))) }}
 extra-context =
   key backend_slave_list :backend_slave_list
   section configuration backend-haproxy-config
@@ -611,9 +611,14 @@ wrapper-path = {{ directory['service'] }}/expose-csr
 hash-existing-files = ${buildout:directory}/software_release/buildout.cfg

 [expose-csr-certificate-get]
-recipe = collective.recipe.shelloutput
-commands =
-  certificate = cat ${expose-csr-certificate:certificate}
+recipe = slapos.recipe.build
+certificate-file = ${expose-csr-certificate:certificate}
+init =
+  import os
+  options['certificate'] = ''
+  if os.path.exists(options['certificate-file']):
+    with open(options['certificate-file'], 'r') as fh:
+      options['certificate'] = fh.read()

 [promise-logrotate-setup]
 <= monitor-promise-base
software/caddy-frontend/templates/replicate-publish-slave-information.cfg.in

@@ -2,7 +2,7 @@
 {% set slave_information_dict = {} %}
 # regroup slave information from all frontends
-{% for frontend, slave_list_raw in slave_information.iteritems() %}
+{% for frontend, slave_list_raw in slave_information.items() %}
 {% if slave_list_raw %}
 {% set slave_list = json_module.loads(slave_list_raw) %}
 {% else %}
@@ -27,21 +27,21 @@
 {% endfor %}
 {% endfor %}
-{% for slave_reference, rejected_info_list in rejected_slave_information['rejected-slave-dict'].iteritems() %}
+{% for slave_reference, rejected_info_list in rejected_slave_information['rejected-slave-dict'].items() %}
 {% if slave_reference not in slave_information_dict %}
 {% do slave_information_dict.__setitem__(slave_reference, {}) %}
 {% endif %}
 {% do slave_information_dict[slave_reference].__setitem__('request-error-list', json_module.dumps(rejected_info_list)) %}
 {% endfor %}
-{% for slave_reference, warning_info_list in warning_slave_information['warning-slave-dict'].iteritems() %}
+{% for slave_reference, warning_info_list in warning_slave_information['warning-slave-dict'].items() %}
 {% if slave_reference not in slave_information_dict %}
 {% do slave_information_dict.__setitem__(slave_reference, {}) %}
 {% endif %}
 {% do slave_information_dict[slave_reference].__setitem__('warning-list', json_module.dumps(warning_info_list)) %}
 {% endfor %}
-{% for slave_reference, kedifa_dict in json_module.loads(slave_kedifa_information).iteritems() %}
+{% for slave_reference, kedifa_dict in json_module.loads(slave_kedifa_information).items() %}
 {% if slave_reference not in rejected_slave_information['rejected-slave-dict'] %}
 {% if slave_reference not in slave_information_dict %}
 {% do slave_information_dict.__setitem__(slave_reference, {}) %}
@@ -54,7 +54,7 @@
 # Publish information for each slave
 {% set active_slave_instance_list = json_module.loads(active_slave_instance_dict['active-slave-instance-list']) %}
-{% for slave_reference, slave_information in slave_information_dict.iteritems() %}
+{% for slave_reference, slave_information in slave_information_dict.items() %}
 {# Filter out destroyed, so not existing anymore, slaves #}
 {# Note: This functionality is not yet covered by tests, please modify with care #}
 {% if slave_reference in active_slave_instance_list %}
@@ -68,11 +68,11 @@ recipe = slapos.cookbook:publish
 {# sort_keys are important in order to avoid shuffling parameters on each run #}
 log-access-url = {{ dumps(json_module.dumps(log_access_url, sort_keys=True)) }}
 {% endif %}
-{% for key, value in slave_information.iteritems() %}
+{% for key, value in slave_information.items() %}
 {{ key }} = {{ dumps(value) }}
 {% endfor %}
 {% endif %}
-{% for frontend_key, frontend_value in frontend_information.iteritems() %}
+{% for frontend_key, frontend_value in frontend_information.items() %}
 {{ frontend_key }} = {{ frontend_value }}
 {% endfor %}
 {% endfor %}
software/caddy-frontend/templates/slave-introspection-httpd-nginx.conf.in

@@ -23,7 +23,7 @@ http {
   fastcgi_temp_path {{ parameter_dict['var'] }} 1 2;
   uwsgi_temp_path {{ parameter_dict['var'] }} 1 2;
   scgi_temp_path {{ parameter_dict['var'] }} 1 2;
-{% for slave, directory in slave_log_directory.iteritems() %}
+{% for slave, directory in slave_log_directory.items() %}
   location /{{ slave }} {
     alias {{ directory }};
     autoindex on;
software/caddy-frontend/test/test.py

@@ -28,26 +28,26 @@
 import glob
 import os
 import requests
-import httplib
+import http.client
 from requests_toolbelt.adapters import source
 import json
 import multiprocessing
 import subprocess
 from unittest import skip
 import ssl
-from BaseHTTPServer import HTTPServer
-from BaseHTTPServer import BaseHTTPRequestHandler
-from SocketServer import ThreadingMixIn
+from http.server import HTTPServer
+from http.server import BaseHTTPRequestHandler
+from socketserver import ThreadingMixIn
 import time
 import tempfile
 import ipaddress
-import StringIO
+import io
 import gzip
 import base64
 import re
 from slapos.recipe.librecipe import generateHashFromFiles
 import xml.etree.ElementTree as ET
-import urlparse
+import urllib.parse
 import socket
 import sys
 import logging
@@ -130,7 +130,7 @@ def patch_broken_pipe_error():
   """Monkey Patch BaseServer.handle_error to not write
   a stacktrace to stderr on broken pipe.
   https://stackoverflow.com/a/7913160"""
-  from SocketServer import BaseServer
+  from socketserver import BaseServer

   handle_error = BaseServer.handle_error
@@ -162,10 +162,10 @@ def createKey():
 def createSelfSignedCertificate(name_list):
   key, key_pem = createKey()
   subject_alternative_name_list = x509.SubjectAlternativeName(
-    [x509.DNSName(unicode(q)) for q in name_list]
+    [x509.DNSName(str(q)) for q in name_list]
   )
   subject = issuer = x509.Name([
-    x509.NameAttribute(NameOID.COMMON_NAME, u'Test Self Signed Certificate'),
+    x509.NameAttribute(NameOID.COMMON_NAME, 'Test Self Signed Certificate'),
   ])
   certificate = x509.CertificateBuilder().subject_name(
     subject
@@ -192,10 +192,10 @@ def createCSR(common_name, ip=None):
   subject_alternative_name_list = []
   if ip is not None:
     subject_alternative_name_list.append(
-      x509.IPAddress(ipaddress.ip_address(unicode(ip)))
+      x509.IPAddress(ipaddress.ip_address(str(ip)))
     )
   csr = x509.CertificateSigningRequestBuilder().subject_name(x509.Name([
-    x509.NameAttribute(NameOID.COMMON_NAME, unicode(common_name)),
+    x509.NameAttribute(NameOID.COMMON_NAME, str(common_name)),
   ]))

   if len(subject_alternative_name_list):
@@ -219,10 +219,10 @@ class CertificateAuthority(object):
     public_key = self.key.public_key()
     builder = x509.CertificateBuilder()
     builder = builder.subject_name(x509.Name([
-      x509.NameAttribute(NameOID.COMMON_NAME, unicode(common_name)),
+      x509.NameAttribute(NameOID.COMMON_NAME, str(common_name)),
     ]))
     builder = builder.issuer_name(x509.Name([
-      x509.NameAttribute(NameOID.COMMON_NAME, unicode(common_name)),
+      x509.NameAttribute(NameOID.COMMON_NAME, str(common_name)),
     ]))
     builder = builder.not_valid_before(
       datetime.datetime.utcnow() - datetime.timedelta(days=2))
@@ -283,7 +283,7 @@ def isHTTP2(domain):
   out, err = prc.communicate()
   assert prc.returncode == 0, "Problem running %r. Output:\n%s\nError:\n%s" % (
     curl_command, out, err)
-  return 'Using HTTP2, server supports' in err
+  return 'Using HTTP2, server supports'.encode() in err


 class TestDataMixin(object):
@@ -305,7 +305,7 @@ class TestDataMixin(object):
     except IOError:
       test_data = ''

-    for hash_type, hash_value in hash_value_dict.items():
+    for hash_type, hash_value in list(hash_value_dict.items()):
       runtime_data = runtime_data.replace(hash_value, '{hash-%s}' % (hash_type),)
@@ -321,7 +321,8 @@ class TestDataMixin(object):
       )
     except AssertionError:
       if os.environ.get('SAVE_TEST_DATA', '0') == '1':
-        open(test_data_file, 'w').write(runtime_data.strip() + '\n')
+        with open(test_data_file, 'w') as fh:
+          fh.write(runtime_data.strip() + '\n')
       raise
     finally:
       self.maxDiff = maxDiff
@@ -510,26 +511,34 @@ class TestHandler(BaseHTTPRequestHandler):
     self.wfile.write(json.dumps({self.path: config}, indent=2))

   def do_PUT(self):
+    incoming_config = {}
+    for key, value in list(self.headers.items()):
+      if key.startswith('X-'):
+        incoming_config[key] = value
+
     config = {
-      'status_code': self.headers.dict.get('x-reply-status-code', '200')
+      'status_code': incoming_config.pop('X-Reply-Status-Code', '200')
     }
-    prefix = 'x-reply-header-'
+    prefix = 'X-Reply-Header-'
     length = len(prefix)
-    for key, value in self.headers.dict.items():
+    for key in list(incoming_config.keys()):
       if key.startswith(prefix):
         header = '-'.join([q.capitalize() for q in key[length:].split('-')])
-        config[header] = value.strip()
+        config[header] = incoming_config.pop(key)

-    if 'x-reply-body' in self.headers.dict:
-      config['Body'] = base64.b64decode(self.headers.dict['x-reply-body'])
+    if 'X-Reply-Body' in incoming_config:
+      config['Body'] = base64.b64decode(incoming_config.pop('X-Reply-Body')).decode()

-    config['X-Drop-Header'] = self.headers.dict.get('x-drop-header')
+    config['X-Drop-Header'] = incoming_config.pop('X-Drop-Header', None)
     self.configuration[self.path] = config

     self.send_response(201)
     self.send_header("Content-Type", "application/json")
     self.end_headers()
-    self.wfile.write(json.dumps({self.path: config}, indent=2))
+    reply = {self.path: config}
+    if incoming_config:
+      reply['unknown_config'] = incoming_config
+    self.wfile.write(json.dumps(reply, indent=2).encode())

   def do_POST(self):
     return self.do_GET()
@@ -548,33 +557,33 @@ class TestHandler(BaseHTTPRequestHandler):
       header_dict = config
     else:
       drop_header_list = []
-      for header in (self.headers.dict.get('x-drop-header') or '').split():
+      for header in (self.headers.get('x-drop-header') or '').split():
         drop_header_list.append(header)
       response = None
       status_code = 200
-      timeout = int(self.headers.dict.get('timeout', '0'))
-      if 'x-maximum-timeout' in self.headers.dict:
-        maximum_timeout = int(self.headers.dict['x-maximum-timeout'])
+      timeout = int(self.headers.get('timeout', '0'))
+      if 'x-maximum-timeout' in self.headers:
+        maximum_timeout = int(self.headers['x-maximum-timeout'])
         timeout = random.randrange(maximum_timeout)
-      if 'x-response-size' in self.headers.dict:
+      if 'x-response-size' in self.headers:
         min_response, max_response = [
-          int(q) for q in self.headers.dict['x-response-size'].split(' ')]
+          int(q) for q in self.headers['x-response-size'].split(' ')]
         reponse_size = random.randrange(min_response, max_response)
         response = ''.join(
           random.choice(string.lowercase) for x in range(reponse_size))
-      compress = int(self.headers.dict.get('compress', '0'))
+      compress = int(self.headers.get('compress', '0'))
       header_dict = {}
       prefix = 'x-reply-header-'
       length = len(prefix)
-      for key, value in self.headers.dict.items():
+      for key, value in list(self.headers.items()):
         if key.startswith(prefix):
           header = '-'.join([q.capitalize() for q in key[length:].split('-')])
           header_dict[header] = value.strip()
     if response is None:
-      if 'x-reply-body' not in self.headers.dict:
+      if 'x-reply-body' not in self.headers:
         headers_dict = dict()
-        for header in self.headers.keys():
-          content = self.headers.getheaders(header)
+        for header in list(self.headers.keys()):
+          content = self.headers.get_all(header)
          if len(content) == 0:
            headers_dict[header] = None
          elif len(content) == 1:
@@ -587,12 +596,12 @@ class TestHandler(BaseHTTPRequestHandler):
          }
          response = json.dumps(response, indent=2)
       else:
-        response = base64.b64decode(self.headers.dict['x-reply-body'])
+        response = base64.b64decode(self.headers['x-reply-body'])
     time.sleep(timeout)
     self.send_response(status_code)

-    for key, value in header_dict.items():
+    for key, value in list(header_dict.items()):
       self.send_header(key, value)

     if self.identification is not None:
@@ -608,16 +617,18 @@ class TestHandler(BaseHTTPRequestHandler):
       self.send_header('Via', 'http/1.1 backendvia')
     if compress:
       self.send_header('Content-Encoding', 'gzip')
-      out = StringIO.StringIO()
+      out = io.BytesIO()
       # compress with level 0, to find out if in the middle someting would
       # like to alter the compression
-      with gzip.GzipFile(fileobj=out, mode="w", compresslevel=0) as f:
-        f.write(response)
+      with gzip.GzipFile(fileobj=out, mode="wb", compresslevel=0) as f:
+        f.write(response.encode())
       response = out.getvalue()
       self.send_header('Backend-Content-Length', len(response))
     if 'Content-Length' not in drop_header_list:
       self.send_header('Content-Length', len(response))
     self.end_headers()
+    if getattr(response, 'encode', None) is not None:
+      response = response.encode()
     self.wfile.write(response)
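The TestHandler changes above are mostly about the bytes/str split in Python 3: gzip now writes into an io.BytesIO buffer, and anything still held as str is encoded before being written to the socket. A small self-contained sketch of that pattern (my illustration, not code from the repository):

    # Hypothetical illustration of the bytes handling used in TestHandler above.
    import gzip
    import io

    def prepare_body(response, compress=False):
        """Return bytes suitable for writing to a binary stream such as wfile."""
        if compress:
            out = io.BytesIO()                       # binary buffer instead of StringIO
            with gzip.GzipFile(fileobj=out, mode="wb", compresslevel=0) as f:
                f.write(response.encode())           # gzip wants bytes
            return out.getvalue()
        if getattr(response, 'encode', None) is not None:
            response = response.encode()             # str -> bytes, leave bytes untouched
        return response

    assert prepare_body('hello') == b'hello'
    assert gzip.decompress(prepare_body('hello', compress=True)) == b'hello'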
@@ -717,7 +728,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
     master_parameter_dict = self.parseConnectionParameterDict()
     caucase_url = master_parameter_dict['backend-client-caucase-url']
     ca_certificate = requests.get(caucase_url + '/cas/crt/ca.crt.pem')
-    assert ca_certificate.status_code == httplib.OK
+    assert ca_certificate.status_code == http.client.OK
     ca_certificate_file = os.path.join(
       self.working_directory, 'ca-backend-client.crt.pem')
     with open(ca_certificate_file, 'w') as fh:
@@ -759,7 +770,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
   def _fetchKedifaCaucaseCaCertificateFile(cls, parameter_dict):
     ca_certificate = requests.get(
       parameter_dict['kedifa-caucase-url'] + '/cas/crt/ca.crt.pem')
-    assert ca_certificate.status_code == httplib.OK
+    assert ca_certificate.status_code == http.client.OK
     cls.kedifa_caucase_ca_certificate_file = os.path.join(
       cls.working_directory, 'kedifa-caucase.ca.crt.pem')
     open(cls.kedifa_caucase_ca_certificate_file, 'w').write(
@@ -769,7 +780,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
   def _fetchBackendClientCaCertificateFile(cls, parameter_dict):
     ca_certificate = requests.get(
       parameter_dict['backend-client-caucase-url'] + '/cas/crt/ca.crt.pem')
-    assert ca_certificate.status_code == httplib.OK
+    assert ca_certificate.status_code == http.client.OK
     cls.backend_client_caucase_ca_certificate_file = os.path.join(
       cls.working_directory, 'backend-client-caucase.ca.crt.pem')
     open(cls.backend_client_caucase_ca_certificate_file, 'w').write(
@@ -785,12 +796,12 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
     auth = requests.get(
       parameter_dict['master-key-generate-auth-url'],
       verify=cls.kedifa_caucase_ca_certificate_file)
-    assert auth.status_code == httplib.CREATED
+    assert auth.status_code == http.client.CREATED
     upload = requests.put(
       parameter_dict['master-key-upload-url'] + auth.text,
       data=cls.key_pem + cls.certificate_pem,
       verify=cls.kedifa_caucase_ca_certificate_file)
-    assert upload.status_code == httplib.CREATED
+    assert upload.status_code == http.client.CREATED
     cls.runKedifaUpdater()

   @classmethod
@@ -891,7 +902,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
     via_id = '%s-%s' % (
       self.node_information_dict['node-id'],
-      self.node_information_dict['version-hash-history'].keys()[0])
+      list(self.node_information_dict['version-hash-history'].keys())[0])
     if via:
       self.assertIn('Via', headers)
       if cached:
@@ -925,7 +936,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
       frontend, url = entry
       result = requests.get(url, verify=False)
       self.assertEqual(
-        httplib.OK,
+        http.client.OK,
         result.status_code,
         'While accessing %r of %r the status code was %r' % (
           url, frontend, result.status_code))
@@ -939,11 +950,11 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
       sorted([q['name'] for q in result.json()]),
       ['access.log', 'backend.log', 'error.log'])
     self.assertEqual(
-      httplib.OK,
+      http.client.OK,
       requests.get(url + 'access.log', verify=False).status_code
     )
     self.assertEqual(
-      httplib.OK,
+      http.client.OK,
       requests.get(url + 'error.log', verify=False).status_code
     )
     # assert only for few tests, as backend log is not available for many of
@@ -952,7 +963,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
         'test_url', 'test_auth_to_backend', 'test_compressed_result']:
       if self.id().endswith(test_name):
         self.assertEqual(
-          httplib.OK,
+          http.client.OK,
           requests.get(url + 'backend.log', verify=False).status_code
         )
@@ -963,11 +974,11 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
     kedifa_ipv6_base = 'https://[%s]:%s' % (self._ipv6_address, KEDIFA_PORT)
     base = '^' + kedifa_ipv6_base.replace(
       '[', r'\[').replace(']', r'\]') + '/.{32}'
-    self.assertRegexpMatches(
+    self.assertRegex(
       generate_auth_url,
       base + r'\/generateauth$')
-    self.assertRegexpMatches(
+    self.assertRegex(
       upload_url,
       base + r'\?auth=$')
@@ -983,13 +994,13 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
   def assertNodeInformationWithPop(self, parameter_dict):
     key = 'caddy-frontend-1-node-information-json'
     node_information_json_dict = {}
-    for k in parameter_dict.keys():
+    for k in list(parameter_dict.keys()):
       if k.startswith('caddy-frontend') and k.endswith(
         'node-information-json'):
         node_information_json_dict[k] = parameter_dict.pop(k)
     self.assertEqual(
       [key],
-      node_information_json_dict.keys()
+      list(node_information_json_dict.keys())
     )

     node_information_dict = json.loads(node_information_json_dict[key])
@@ -1000,13 +1011,13 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
   def assertBackendHaproxyStatisticUrl(self, parameter_dict):
     url_key = 'caddy-frontend-1-backend-haproxy-statistic-url'
     backend_haproxy_statistic_url_dict = {}
-    for key in parameter_dict.keys():
+    for key in list(parameter_dict.keys()):
       if key.startswith('caddy-frontend') and key.endswith(
         'backend-haproxy-statistic-url'):
         backend_haproxy_statistic_url_dict[key] = parameter_dict.pop(key)
     self.assertEqual(
       [url_key],
-      backend_haproxy_statistic_url_dict.keys()
+      list(backend_haproxy_statistic_url_dict.keys())
     )
     backend_haproxy_statistic_url = backend_haproxy_statistic_url_dict[url_key]
@@ -1014,7 +1025,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
       backend_haproxy_statistic_url,
       verify=False,
     )
-    self.assertEqual(httplib.OK, result.status_code)
+    self.assertEqual(http.client.OK, result.status_code)
     self.assertIn('testing partition 0', result.text)
     self.assertIn('Statistics Report for HAProxy', result.text)
@@ -1075,7 +1086,7 @@ class HttpFrontendTestCase(SlapOSInstanceTestCase):
   def parseParameterDict(self, parameter_dict):
     parsed_parameter_dict = {}
-    for key, value in parameter_dict.items():
+    for key, value in list(parameter_dict.items()):
       if key in [
           'rejected-slave-dict',
           'warning-slave-dict',
@@ -1218,8 +1229,8 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
   @classmethod
   def requestSlaves(cls):
-    for slave_reference, partition_parameter_kw in cls\
-            .getSlaveParameterDictDict().items():
+    for slave_reference, partition_parameter_kw in list(
+            cls.getSlaveParameterDictDict().items()):
       software_url = cls.getSoftwareURL()
       software_type = cls.getInstanceSoftwareType()
       cls.logger.debug(
@@ -1265,8 +1276,8 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
   def getSlaveConnectionParameterDictList(cls):
     parameter_dict_list = []

-    for slave_reference, partition_parameter_kw in cls\
-            .getSlaveParameterDictDict().items():
+    for slave_reference, partition_parameter_kw in list(
+            cls.getSlaveParameterDictDict().items()):
       parameter_dict_list.append(cls.requestSlaveInstance(
         partition_reference=slave_reference,
         partition_parameter_kw=partition_parameter_kw,
@@ -1303,8 +1314,8 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
   def updateSlaveConnectionParameterDictDict(cls):
     cls.slave_connection_parameter_dict_dict = {}
     # run partition for slaves to be setup
-    for slave_reference, partition_parameter_kw in cls\
-            .getSlaveParameterDictDict().items():
+    for slave_reference, partition_parameter_kw in list(
+            cls.getSlaveParameterDictDict().items()):
       slave_instance = cls.requestSlaveInstance(
         partition_reference=slave_reference,
         partition_parameter_kw=partition_parameter_kw,
@@ -1329,7 +1340,7 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
     self.assertKedifaKeysWithPop(parameter_dict, '')
     self.assertNodeInformationWithPop(parameter_dict)
     if hostname is None:
-      hostname = reference.translate(None, '_-').lower()
+      hostname = reference.replace('_', '').replace('-', '').lower()
     expected_parameter_dict.update(**{
       'domain': '%s.example.com' % (hostname,),
       'replication_number': '1',
@@ -1351,7 +1362,7 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
       self.instance_path, '*', 'var', 'log', 'httpd', log_name))[0]

-    self.assertRegexpMatches(
+    self.assertRegex(
       open(log_file, 'r').readlines()[-1],
       log_regexp)
@@ -1477,11 +1488,11 @@ class TestMasterAIKCDisabledAIBCCDisabledRequest(
       backend_client_caucase_url, backend_client_ca_pem, backend_client_csr_pem)

     kedifa_key_file = os.path.join(cls.working_directory, 'kedifa-key.pem')
-    with open(kedifa_key_file, 'w') as fh:
+    with open(kedifa_key_file, 'wb') as fh:
       fh.write(kedifa_crt_pem + kedifa_key_pem)
     backend_client_key_file = os.path.join(
       cls.working_directory, 'backend-client-key.pem')
-    with open(backend_client_key_file, 'w') as fh:
+    with open(backend_client_key_file, 'wb') as fh:
       fh.write(backend_client_crt_pem + backend_client_key_pem)

     # Simulate human: create service keys
@@ -1951,13 +1962,13 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     with lzma.open(
       os.path.join(ats_logrotate_dir, old_file_name + '.xz')) as fh:
       self.assertEqual(
-        'old',
+        'old'.encode(),
         fh.read()
       )
     with lzma.open(
       os.path.join(ats_logrotate_dir, older_file_name + '.xz')) as fh:
       self.assertEqual(
-        'older',
+        'older'.encode(),
         fh.read()
       )
@@ -2074,12 +2085,12 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       self.certificate_pem,
       der2pem(result.peercert))

-    self.assertEqual(httplib.SERVICE_UNAVAILABLE, result.status_code)
+    self.assertEqual(http.client.SERVICE_UNAVAILABLE, result.status_code)

     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result_http.status_code
     )
@@ -2091,7 +2102,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     # check that 404 is as configured
     result_missing = fakeHTTPSResult(
       'forsuredoesnotexists.example.com', '')
-    self.assertEqual(httplib.NOT_FOUND, result_missing.status_code)
+    self.assertEqual(http.client.NOT_FOUND, result_missing.status_code)
     self.assertEqual(
       """<html>
<head>
@@ -2152,7 +2163,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     )
     via_id = '%s-%s' % (
       self.node_information_dict['node-id'],
-      self.node_information_dict['version-hash-history'].keys()[0])
+      list(self.node_information_dict['version-hash-history'].keys())[0])
     if cached:
       self.assertEqual(
         [
@@ -2247,7 +2258,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result_http.status_code
     )
@@ -2368,7 +2379,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       self.assertEqual(
         result.status_code,
-        httplib.BAD_GATEWAY
+        http.client.BAD_GATEWAY
       )
     finally:
       self.stopAuthenticatedServerProcess()
@@ -2410,7 +2421,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result_http.status_code
     )
@@ -2553,7 +2564,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       der2pem(result.peercert))

     self.assertEqual(
-      httplib.MOVED_PERMANENTLY,
+      http.client.MOVED_PERMANENTLY,
       result.status_code
     )
@@ -2709,7 +2720,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = self.customdomain_ca_certificate_pem + \
       self.customdomain_ca_key_pem + \
@@ -2719,7 +2730,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()

     result = fakeHTTPSResult(
@@ -2736,7 +2747,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
         '_custom_domain_ssl_crt_ssl_key_ssl_ca_crt.pem'))
     self.assertEqual(1, len(certificate_file_list))
     certificate_file = certificate_file_list[0]
-    with open(certificate_file) as out:
+    with open(certificate_file, 'rb') as out:
       self.assertEqual(data, out.read())

   def test_ssl_ca_crt_only(self):
@@ -2745,7 +2756,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = self.ca.certificate_pem
@@ -2754,7 +2765,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.UNPROCESSABLE_ENTITY, upload.status_code)
+    self.assertEqual(http.client.UNPROCESSABLE_ENTITY, upload.status_code)
     self.assertEqual('Key incorrect', upload.text)

   def test_ssl_ca_crt_garbage(self):
@@ -2764,19 +2775,19 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)

     _, ca_key_pem, csr, _ = createCSR(
       parameter_dict['domain'])
     _, ca_certificate_pem = self.ca.signCSR(csr)

-    data = ca_certificate_pem + ca_key_pem + 'some garbage'
+    data = ca_certificate_pem + ca_key_pem + 'some garbage'.encode()
     upload = requests.put(
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)

-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()

     result = fakeHTTPSResult(
@@ -2794,7 +2805,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
         '_ssl_ca_crt_garbage.pem'))
     self.assertEqual(1, len(certificate_file_list))
     certificate_file = certificate_file_list[0]
-    with open(certificate_file) as out:
+    with open(certificate_file, 'rb') as out:
       self.assertEqual(data, out.read())

   def test_ssl_ca_crt_does_not_match(self):
@@ -2803,7 +2814,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = self.certificate_pem + self.key_pem + self.ca.certificate_pem
@@ -2812,7 +2823,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()

     result = fakeHTTPSResult(
@@ -2829,7 +2840,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
         '_ssl_ca_crt_does_not_match.pem'))
     self.assertEqual(1, len(certificate_file_list))
     certificate_file = certificate_file_list[0]
-    with open(certificate_file) as out:
+    with open(certificate_file, 'rb') as out:
       self.assertEqual(data, out.read())

   def test_https_only(self):
@@ -2908,14 +2919,14 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = self.customdomain_certificate_pem + \
       self.customdomain_key_pem
     upload = requests.put(
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()
     result = fakeHTTPSResult(
@@ -2956,7 +2967,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
@@ -3075,7 +3086,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
@@ -3116,7 +3127,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       headers={'Accept-Encoding': 'gzip, deflate'})
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
@@ -3236,7 +3247,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       j = result.json()
     except Exception:
       raise ValueError('JSON decode problem in:\n%s' % (result.text,))
-    parsed = urlparse.urlparse(self.backend_url)
+    parsed = urllib.parse.urlparse(self.backend_url)
     self.assertBackendHeaders(
       j['Incoming Headers'], parsed.hostname, port='17', proto='irc',
       ignore_header_list=['Host'])
@@ -3342,7 +3353,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       j = result.json()
     except Exception:
       raise ValueError('JSON decode problem in:\n%s' % (result.text,))
-    parsed = urlparse.urlparse(self.backend_url)
+    parsed = urllib.parse.urlparse(self.backend_url)
     self.assertBackendHeaders(
       j['Incoming Headers'], parsed.hostname, port='17', proto='irc',
       ignore_header_list=['Host'])
@@ -3408,7 +3419,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
der2pem
(
result
.
peercert
))
self
.
assertEqual
(
http
lib
.
FOUND
,
http
.
client
.
FOUND
,
result
.
status_code
)
...
...
@@ -3430,7 +3441,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
der2pem
(
result
.
peercert
))
self
.
assertEqual
(
http
lib
.
FOUND
,
http
.
client
.
FOUND
,
result
.
status_code
)
...
...
@@ -3451,7 +3462,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
der2pem
(
result
.
peercert
))
self
.
assertEqual
(
http
lib
.
SERVICE_UNAVAILABLE
,
http
.
client
.
SERVICE_UNAVAILABLE
,
result
.
status_code
)
...
...
@@ -3459,7 +3470,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
parameter_dict
[
'domain'
],
'test-path'
)
self
.
assertEqual
(
http
lib
.
FOUND
,
http
.
client
.
FOUND
,
result_http
.
status_code
)
...
...
@@ -3498,7 +3509,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
parameter_dict
[
'domain'
],
'test-path'
)
self
.
assertEqual
(
http
lib
.
FOUND
,
http
.
client
.
FOUND
,
result_http
.
status_code
)
...
...
@@ -3519,7 +3530,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
der2pem
(
result
.
peercert
))
self
.
assertEqual
(
http
lib
.
SERVICE_UNAVAILABLE
,
http
.
client
.
SERVICE_UNAVAILABLE
,
result
.
status_code
)
...
...
@@ -3533,12 +3544,12 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
self
.
certificate_pem
,
der2pem
(
result
.
peercert
))
self
.
assertEqual
(
http
lib
.
SERVICE_UNAVAILABLE
,
result
.
status_code
)
self
.
assertEqual
(
http
.
client
.
SERVICE_UNAVAILABLE
,
result
.
status_code
)
result_http
=
fakeHTTPResult
(
parameter_dict
[
'domain'
],
'test-path'
)
self
.
assertEqual
(
http
lib
.
FOUND
,
http
.
client
.
FOUND
,
result_http
.
status_code
)
...
...
@@ -3570,12 +3581,12 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
self
.
certificate_pem
,
der2pem
(
result
.
peercert
))
self
.
assertEqual
(
http
lib
.
SERVICE_UNAVAILABLE
,
result
.
status_code
)
self
.
assertEqual
(
http
.
client
.
SERVICE_UNAVAILABLE
,
result
.
status_code
)
result_http
=
fakeHTTPResult
(
parameter_dict
[
'domain'
],
'test-path'
)
self
.
assertEqual
(
http
lib
.
FOUND
,
http
.
client
.
FOUND
,
result_http
.
status_code
)
...
...
@@ -3608,13 +3619,13 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       self.certificate_pem,
       der2pem(result.peercert))
-    self.assertEqual(httplib.SERVICE_UNAVAILABLE, result.status_code)
+    self.assertEqual(http.client.SERVICE_UNAVAILABLE, result.status_code)
     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result_http.status_code
     )
...
...
@@ -3696,7 +3707,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
         'X-Reply-Header-Cache-Control': 'max-age=1, stale-while-'
                                         'revalidate=3600, stale-if-error=3600'
       })
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
...
...
@@ -3735,7 +3746,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
         'X-Reply-Header-Cache-Control': 'max-age=1, stale-while-'
                                         'revalidate=3600, stale-if-error=3600'
       })
-    self.assertEqual(httplib.OK, result.status_code)
+    self.assertEqual(http.client.OK, result.status_code)
     self.assertEqualResultJson(result, 'Path', '/HTTPS/test')
     self.assertResponseHeaders(result, cached=True)
...
...
@@ -3800,7 +3811,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     with open(ats_log_file) as fh:
       ats_log = fh.read()
-    self.assertRegexpMatches(ats_log, direct_pattern)
+    self.assertRegex(ats_log, direct_pattern)
     # END: Check that squid.log is correctly filled in
 
   def _hack_ats(self, max_stale_age):
...
...
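assertRegexpMatches is a deprecated unittest alias that recent Python 3 releases drop entirely, so the ATS log check switches to assertRegex. A self-contained sketch of the same call (the log line here is invented for illustration):

import unittest


class LogCheck(unittest.TestCase):
    def test_pattern(self):
        ats_log = 'TCP_MISS/200 1234 GET http://backend/test-path'
        # same assertion style as the frontend test, new method name
        self.assertRegex(ats_log, r'TCP_\w+/200')


if __name__ == '__main__':
    unittest.main()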
@@ -3864,10 +3875,10 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     max_age = int(max_stale_age / 2.)
     # body_200 is big enough to trigger
     # https://github.com/apache/trafficserver/issues/7880
-    body_200 = b'Body 200' * 500
-    body_502 = b'Body 502'
-    body_502_new = b'Body 502 new'
-    body_200_new = b'Body 200 new'
+    body_200 = 'Body 200' * 500
+    body_502 = 'Body 502'
+    body_502_new = 'Body 502 new'
+    body_200_new = 'Body 200 new'
     self.addCleanup(self._unhack_ats)
     self._hack_ats(max_stale_age)
...
...
@@ -3877,12 +3888,12 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       result = requests.put(
         backend_url + path,
         headers={
           'X-Reply-Header-Cache-Control': 'max-age=%s, public' % (max_age,),
           'X-Reply-Status-Code': status_code,
-          'X-Reply-Body': base64.b64encode(body),
+          'X-Reply-Body': base64.b64encode(body.encode()),
           # drop Content-Length header to ensure
           # https://github.com/apache/trafficserver/issues/7880
           'X-Drop-Header': 'Content-Length',
         })
-      self.assertEqual(result.status_code, httplib.CREATED)
+      self.assertEqual(result.status_code, http.client.CREATED)
 
     def checkResult(status_code, body):
       result = fakeHTTPSResult(
...
...
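Dropping the b'' prefixes on the body fixtures and adding body.encode() go together: on Python 3, base64.b64encode accepts only bytes, while the rest of the test handles the bodies as text. A short illustration of the behaviour the hunk relies on:

import base64

body = 'Body 200' * 500                      # text, as the test now stores it
encoded = base64.b64encode(body.encode())    # bytes in, bytes out
assert base64.b64decode(encoded).decode() == body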
@@ -3894,39 +3905,39 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     # backend returns something correctly
     configureResult('200', body_200)
-    checkResult(httplib.OK, body_200)
+    checkResult(http.client.OK, body_200)
 
     configureResult('502', body_502)
     time.sleep(1)
     # even if backend returns 502, ATS gives cached result
-    checkResult(httplib.OK, body_200)
+    checkResult(http.client.OK, body_200)
 
     # interesting moment, time is between max_age and max_stale_age, triggers
     # https://github.com/apache/trafficserver/issues/7880
     time.sleep(max_age + 1)
-    checkResult(httplib.OK, body_200)
+    checkResult(http.client.OK, body_200)
 
     # max_stale_age passed, time to return 502 from the backend
     time.sleep(max_stale_age + 2)
-    checkResult(httplib.BAD_GATEWAY, body_502)
+    checkResult(http.client.BAD_GATEWAY, body_502)
 
     configureResult('502', body_502_new)
     time.sleep(1)
     # even if there is new negative response on the backend, the old one is
     # served from the cache
-    checkResult(httplib.BAD_GATEWAY, body_502)
+    checkResult(http.client.BAD_GATEWAY, body_502)
 
     time.sleep(max_age + 2)
     # now as max-age of negative response passed, the new one is served
-    checkResult(httplib.BAD_GATEWAY, body_502_new)
+    checkResult(http.client.BAD_GATEWAY, body_502_new)
 
     configureResult('200', body_200_new)
     time.sleep(1)
-    checkResult(httplib.BAD_GATEWAY, body_502_new)
+    checkResult(http.client.BAD_GATEWAY, body_502_new)
     time.sleep(max_age + 2)
     # backend is back to normal, as soon as negative response max-age passed
     # the new response is served
-    checkResult(httplib.OK, body_200_new)
+    checkResult(http.client.OK, body_200_new)
 
   @skip('Feature postponed')
   def test_enable_cache_stale_if_error_respected(self):
...
...
@@ -3978,7 +3989,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
         },
         source_ip=source_ip
       )
-      self.assertEqual(result.status_code, httplib.BAD_GATEWAY)
+      self.assertEqual(result.status_code, http.client.BAD_GATEWAY)
     finally:
       self.startServerProcess()
     # END: check stale-if-error support
...
...
@@ -3996,7 +4007,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     # ATS timed out
     self.assertEqual(
-      httplib.GATEWAY_TIMEOUT,
+      http.client.GATEWAY_TIMEOUT,
       result.status_code
     )
...
...
@@ -4096,7 +4107,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       j = result.json()
     except Exception:
       raise ValueError('JSON decode problem in:\n%s' % (result.text,))
-    self.assertFalse('pragma' in j['Incoming Headers'].keys())
+    self.assertFalse('pragma' in list(j['Incoming Headers'].keys()))
 
   def test_enable_cache_disable_via_header(self):
     parameter_dict = self.assertSlaveBase('enable_cache-disable-via-header')
...
...
@@ -4315,7 +4326,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       headers={'Accept-Encoding': 'gzip, deflate'})
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
...
...
@@ -4330,7 +4341,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       headers={'Accept-Encoding': 'deflate'})
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
...
...
@@ -4344,7 +4355,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
...
...
@@ -4358,7 +4369,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result.status_code
     )
...
...
@@ -4370,32 +4381,34 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
   def test_disabled_cookie_list(self):
     parameter_dict = self.assertSlaveBase('disabled-cookie-list')
 
-    result = fakeHTTPSResult(
-      parameter_dict['domain'], 'test-path',
-      # Note: The cookies are always sorted in python2, but not in python3 so
-      #       the order here is important. OrderdedDict won't work, as
-      #       internal implementation of requests will deny it's usage.
-      #       Thus take ultra care with changing anything here or on the
-      #       disabled-cookie-list shared instance parameter ordering, as it
-      #       can easily result with passing of the tests.
-      #       One of the solutions would be to use curl to make this query.
-      cookies=dict(
-          Coconut='absent',
-          Coffee='present',
-          Chocolate='absent',
-          Vanilia='absent',
-      )
-    )
+    replacement_dict = dict(
+      domain=parameter_dict['domain'], ip=TEST_IP, port=HTTPS_PORT)
+    curl_command = [
+      'curl', '-v', '-k',
+      '-H', 'Host: %(domain)s' % replacement_dict,
+      '--resolve', '%(domain)s:%(port)s:%(ip)s' % replacement_dict,
+      '--cookie',
+      # Note: Cookie order is extremely important here, do not change
+      #       or test will start to pass incorrectly
+      'Coconut=absent; Chocolate=absent; Coffee=present; Vanilia=absent',
+      'https://%(domain)s:%(port)s/' % replacement_dict,
+    ]
+    prc = subprocess.Popen(
+      curl_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+    out, err = prc.communicate()
 
     self.assertEqual(
-      self.certificate_pem,
-      der2pem(result.peercert))
-    self.assertEqualResultJson(result, 'Path', '/test-path')
-    self.assertBackendHeaders(
-      result.json()['Incoming Headers'], parameter_dict['domain'])
+      prc.returncode, 0,
+      "Problem running %r. Output:\n%s\nError:\n%s" % (curl_command, out, err))
+
+    # self check - were the cookies sent in required order?
+    self.assertIn(
+      'ookie: Coconut=absent; Chocolate=absent; Coffee=present; '
+      'Vanilia=absent',
+      err.decode())
+
+    # real test - all configured cookies are dropped
     self.assertEqual(
-      'Coffee=present', result.json()['Incoming Headers']['cookie'])
+      'Coffee=present', json.loads(out)['Incoming Headers']['cookie'])
 
   def test_https_url(self):
     parameter_dict = self.assertSlaveBase('url_https-url')
...
...
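test_disabled_cookie_list is rewritten around a curl subprocess because the order in which requests serializes cookies is not guaranteed on Python 3, and the test depends on an exact Cookie header. Passing the whole header to curl pins the order, and the self-check on curl's stderr verifies it before the real assertion. A reduced sketch of the technique, with placeholder host and address values:

import subprocess

curl_command = [
    'curl', '-v', '-k',
    '-H', 'Host: example.com',
    '--resolve', 'example.com:443:127.0.0.1',  # pin DNS, as the test does
    '--cookie', 'Coconut=absent; Chocolate=absent; Coffee=present; Vanilia=absent',
    'https://example.com/',
]
prc = subprocess.Popen(
    curl_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = prc.communicate()
if prc.returncode == 0:
    # curl traces the request it sent on stderr, so the exact
    # "Cookie:" header order can be asserted from err
    print(err.decode(errors='replace'))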
@@ -4419,7 +4432,7 @@ class TestSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'test-path/deep/.././deeper')
     self.assertEqual(
-      httplib.FOUND,
+      http.client.FOUND,
       result_http.status_code
     )
...
...
@@ -4508,13 +4521,13 @@ class TestReplicateSlave(SlaveHttpFrontendTestCase, TestDataMixin):
       'caddy-frontend-2-node-information-json'
     ]
     node_information_json_dict = {}
-    for k in parameter_dict.keys():
+    for k in list(parameter_dict.keys()):
       if k.startswith('caddy-frontend') and k.endswith(
         'node-information-json'):
         node_information_json_dict[k] = parameter_dict.pop(k)
     self.assertEqual(
       key_list,
-      node_information_json_dict.keys()
+      list(node_information_json_dict.keys())
     )
 
     node_information_dict = json.loads(node_information_json_dict[key_list[0]])
...
...
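Wrapping dict.keys() in list() matters on Python 3 because keys() returns a view object: a view never compares equal to a plain list such as key_list, and iterating over it while the loop calls parameter_dict.pop() would raise RuntimeError. Illustration:

d = {'a': 1, 'b': 2}
assert list(d.keys()) == ['a', 'b']   # the view itself would not equal a list
for k in list(d.keys()):              # copy first, then popping is safe
    d.pop(k)
assert d == {}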
@@ -4544,7 +4557,7 @@ class TestReplicateSlave(SlaveHttpFrontendTestCase, TestDataMixin):
     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
-    self.assertEqual(httplib.FOUND, result_http.status_code)
+    self.assertEqual(http.client.FOUND, result_http.status_code)
 
     # prove 2nd frontend by inspection of the instance
     slave_configuration_name = '_replicate.conf'
...
...
@@ -5117,51 +5130,51 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
       'rejected-slave-dict': {},
       'warning-list': [
-        u'apache-certificate is obsolete, please use master-key-upload-url',
-        u'apache-key is obsolete, please use master-key-upload-url',
+        'apache-certificate is obsolete, please use master-key-upload-url',
+        'apache-key is obsolete, please use master-key-upload-url',
       ],
       'warning-slave-dict': {
-        u'_custom_domain_ssl_crt_ssl_key': [
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url'
+        '_custom_domain_ssl_crt_ssl_key': [
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url'
         ],
-        u'_custom_domain_ssl_crt_ssl_key_ssl_ca_crt': [
-          u'ssl_ca_crt is obsolete, please use key-upload-url',
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url'
+        '_custom_domain_ssl_crt_ssl_key_ssl_ca_crt': [
+          'ssl_ca_crt is obsolete, please use key-upload-url',
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url'
        ],
-        u'_ssl_ca_crt_does_not_match': [
-          u'ssl_ca_crt is obsolete, please use key-upload-url',
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url',
+        '_ssl_ca_crt_does_not_match': [
+          'ssl_ca_crt is obsolete, please use key-upload-url',
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url',
        ],
-        u'_ssl_ca_crt_garbage': [
-          u'ssl_ca_crt is obsolete, please use key-upload-url',
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url',
+        '_ssl_ca_crt_garbage': [
+          'ssl_ca_crt is obsolete, please use key-upload-url',
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url',
        ],
        # u'_ssl_ca_crt_only': [
        #   u'ssl_ca_crt is obsolete, please use key-upload-url',
        # ],
-        u'_ssl_from_slave': [
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url',
+        '_ssl_from_slave': [
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url',
        ],
-        u'_ssl_from_slave_kedifa_overrides': [
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url',
+        '_ssl_from_slave_kedifa_overrides': [
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url',
        ],
        # u'_ssl_key-ssl_crt-unsafe': [
        #   u'ssl_key is obsolete, please use key-upload-url',
        #   u'ssl_crt is obsolete, please use key-upload-url',
        # ],
-        u'_type-notebook-ssl_from_slave': [
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url',
+        '_type-notebook-ssl_from_slave': [
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url',
        ],
-        u'_type-notebook-ssl_from_slave_kedifa_overrides': [
-          u'ssl_crt is obsolete, please use key-upload-url',
-          u'ssl_key is obsolete, please use key-upload-url',
+        '_type-notebook-ssl_from_slave_kedifa_overrides': [
+          'ssl_crt is obsolete, please use key-upload-url',
+          'ssl_key is obsolete, please use key-upload-url',
        ],
       }
     }
...
...
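The bulk of this hunk only drops u'' prefixes: on Python 3 every str literal is already unicode, so the prefix is accepted but redundant and the expected warning dictionaries compare equal either way:

assert u'ssl_crt is obsolete, please use key-upload-url' == \
    'ssl_crt is obsolete, please use key-upload-url'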
@@ -5202,7 +5215,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = certificate_pem + key_pem
...
...
@@ -5210,7 +5223,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()
 
     result = fakeHTTPSResult(
...
...
@@ -5265,7 +5278,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = certificate_pem + key_pem
...
...
@@ -5273,7 +5286,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()
...
...
@@ -5320,7 +5333,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = certificate_pem + key_pem
...
...
@@ -5328,7 +5341,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()
...
...
@@ -5387,7 +5400,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     auth = requests.get(
       self.current_generate_auth,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, auth.status_code)
+    self.assertEqual(http.client.CREATED, auth.status_code)
     data = certificate_pem + key_pem
...
...
@@ -5395,7 +5408,7 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
       self.current_upload_url + auth.text,
       data=data,
       verify=self.kedifa_caucase_ca_certificate_file)
-    self.assertEqual(httplib.CREATED, upload.status_code)
+    self.assertEqual(http.client.CREATED, upload.status_code)
     self.runKedifaUpdater()
...
...
@@ -5453,8 +5466,10 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     self.assertEqual(1, len(certificate_file_list))
     certificate_file = certificate_file_list[0]
     with open(certificate_file) as out:
-      expected = self.customdomain_ca_certificate_pem + '\n' + \
-        self.ca.certificate_pem + '\n' + self.customdomain_ca_key_pem
+      expected = \
+        self.customdomain_ca_certificate_pem.decode() + '\n' + \
+        self.ca.certificate_pem.decode() + '\n' + \
+        self.customdomain_ca_key_pem.decode()
       self.assertEqual(
         expected,
         out.read()
...
...
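The .decode() calls appear because the certificate and key fixtures are PEM-encoded bytes, while the certificate file is read back as text; Python 3 refuses to concatenate or compare bytes with str. Minimal illustration with placeholder PEM content:

certificate_pem = b'-----BEGIN CERTIFICATE-----\n...placeholder...\n-----END CERTIFICATE-----\n'
key_pem = b'-----BEGIN RSA PRIVATE KEY-----\n...placeholder...\n-----END RSA PRIVATE KEY-----\n'

# bytes + str raises TypeError on Python 3, so decode first
expected = certificate_pem.decode() + '\n' + key_pem.decode()
assert isinstance(expected, str)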
@@ -5497,8 +5512,9 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     self.assertEqual(1, len(certificate_file_list))
     certificate_file = certificate_file_list[0]
     with open(certificate_file) as out:
-      expected = customdomain_ca_certificate_pem + '\n' + ca.certificate_pem \
-        + '\n' + customdomain_ca_key_pem
+      expected = customdomain_ca_certificate_pem.decode() + '\n' + \
+        ca.certificate_pem.decode() + '\n' + \
+        customdomain_ca_key_pem.decode()
       self.assertEqual(
         expected,
         out.read()
...
...
@@ -5548,8 +5564,9 @@ class TestSlaveSlapOSMasterCertificateCompatibility(
     self.assertEqual(1, len(certificate_file_list))
     certificate_file = certificate_file_list[0]
     with open(certificate_file) as out:
-      expected = self.certificate_pem + '\n' + self.ca.certificate_pem + \
-        '\n' + self.key_pem
+      expected = self.certificate_pem.decode() + '\n' + \
+        self.ca.certificate_pem.decode() + '\n' + \
+        self.key_pem.decode()
       self.assertEqual(
         expected,
         out.read()
...
...
@@ -5612,8 +5629,8 @@ class TestSlaveSlapOSMasterCertificateCompatibilityUpdate(
       'rejected-slave-dict': {},
       'slave-amount': '1',
       'warning-list': [
-        u'apache-certificate is obsolete, please use master-key-upload-url',
-        u'apache-key is obsolete, please use master-key-upload-url',
+        'apache-certificate is obsolete, please use master-key-upload-url',
+        'apache-key is obsolete, please use master-key-upload-url',
       ],
     }
...
...
@@ -5720,11 +5737,11 @@ class TestSlaveCiphers(SlaveHttpFrontendTestCase, TestDataMixin):
       self.certificate_pem,
       der2pem(result.peercert))
-    self.assertEqual(httplib.OK, result.status_code)
+    self.assertEqual(http.client.OK, result.status_code)
 
     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
-    self.assertEqual(httplib.FOUND, result_http.status_code)
+    self.assertEqual(http.client.FOUND, result_http.status_code)
 
     configuration_file = glob.glob(os.path.join(
...
...
@@ -5746,11 +5763,11 @@ class TestSlaveCiphers(SlaveHttpFrontendTestCase, TestDataMixin):
       self.certificate_pem,
       der2pem(result.peercert))
-    self.assertEqual(httplib.OK, result.status_code)
+    self.assertEqual(http.client.OK, result.status_code)
 
     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
-    self.assertEqual(httplib.FOUND, result_http.status_code)
+    self.assertEqual(http.client.FOUND, result_http.status_code)
 
     configuration_file = glob.glob(os.path.join(
...
...
@@ -5945,9 +5962,9 @@ class TestSlaveRejectReportUnsafeDamaged(SlaveHttpFrontendTestCase):
     result_json = result.json()
     self.assertEqual(
       {
-        u'_SITE_4': [u"custom_domain 'duplicate.example.com' clashes"],
-        u'_SITE_2': [u"custom_domain 'duplicate.example.com' clashes"],
-        u'_SITE_3': [u"server-alias 'duplicate.example.com' clashes"]
+        '_SITE_4': ["custom_domain 'duplicate.example.com' clashes"],
+        '_SITE_2': ["custom_domain 'duplicate.example.com' clashes"],
+        '_SITE_3': ["server-alias 'duplicate.example.com' clashes"]
       },
       result_json
     )
...
...
@@ -5974,7 +5991,7 @@ class TestSlaveRejectReportUnsafeDamaged(SlaveHttpFrontendTestCase):
     'rejected-slave-dict': {
       '_HTTPS-URL': ['slave https-url "https://[fd46::c2ae]:!py!u\'123123\'"'
                      ' invalid'],
-      '_URL': [u'slave url "https://[fd46::c2ae]:!py!u\'123123\'" invalid'],
+      '_URL': ['slave url "https://[fd46::c2ae]:!py!u\'123123\'" invalid'],
      '_SSL-PROXY-VERIFY_SSL_PROXY_CA_CRT_DAMAGED': [
        'ssl_proxy_ca_crt is invalid'],
...
...
@@ -6214,7 +6231,7 @@ class TestSlaveRejectReportUnsafeDamaged(SlaveHttpFrontendTestCase):
       der2pem(result.peercert))
     self.assertEqual(
-      httplib.MOVED_PERMANENTLY,
+      http.client.MOVED_PERMANENTLY,
       result.status_code
     )
...
...
@@ -6234,11 +6251,11 @@ class TestSlaveRejectReportUnsafeDamaged(SlaveHttpFrontendTestCase):
       self.certificate_pem,
       der2pem(result.peercert))
-    self.assertEqual(httplib.SERVICE_UNAVAILABLE, result.status_code)
+    self.assertEqual(http.client.SERVICE_UNAVAILABLE, result.status_code)
 
     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
-    self.assertEqual(httplib.FOUND, result_http.status_code)
+    self.assertEqual(http.client.FOUND, result_http.status_code)
 
     monitor_file = glob.glob(os.path.join(
...
...
@@ -6265,11 +6282,11 @@ class TestSlaveRejectReportUnsafeDamaged(SlaveHttpFrontendTestCase):
       self.certificate_pem,
       der2pem(result.peercert))
-    self.assertEqual(httplib.SERVICE_UNAVAILABLE, result.status_code)
+    self.assertEqual(http.client.SERVICE_UNAVAILABLE, result.status_code)
 
     result_http = fakeHTTPResult(
       parameter_dict['domain'], 'test-path')
-    self.assertEqual(httplib.FOUND, result_http.status_code)
+    self.assertEqual(http.client.FOUND, result_http.status_code)
 
     monitor_file = glob.glob(os.path.join(
...
...
@@ -6558,98 +6575,98 @@ class TestPassedRequestParameter(HttpFrontendTestCase):
         'kedifa'].pop('monitor-password')
     )
-    backend_client_caucase_url = u'http://[%s]:8990' % (self._ipv6_address,)
-    kedifa_caucase_url = u'http://[%s]:15090' % (self._ipv6_address,)
+    backend_client_caucase_url = 'http://[%s]:8990' % (self._ipv6_address,)
+    kedifa_caucase_url = 'http://[%s]:15090' % (self._ipv6_address,)
     expected_partition_parameter_dict_dict = {
       'caddy-frontend-1': {
         'X-software_release_url': base_software_url,
-        u'apache-certificate': unicode(self.certificate_pem),
-        u'apache-key': unicode(self.key_pem),
-        u'authenticate-to-backend': u'True',
-        u'backend-client-caucase-url': backend_client_caucase_url,
-        u'backend-connect-retries': u'1',
-        u'backend-connect-timeout': u'2',
-        u'ciphers': u'ciphers',
-        u'cluster-identification': u'testing partition 0',
-        u'domain': u'example.com',
-        u'enable-http2-by-default': u'True',
-        u'extra_slave_instance_list': u'[]',
-        u'frontend-name': u'caddy-frontend-1',
-        u'global-disable-http2': u'True',
-        u'kedifa-caucase-url': kedifa_caucase_url,
-        u'monitor-cors-domains': u'monitor.app.officejs.com',
-        u'monitor-httpd-port': 8411,
-        u'monitor-username': u'admin',
-        u'mpm-graceful-shutdown-timeout': u'2',
-        u'plain_http_port': '11080',
-        u'port': '11443',
-        u'ram-cache-size': u'512K',
-        u're6st-verification-url': u're6st-verification-url',
-        u'request-timeout': u'100',
-        u'slave-kedifa-information': u'{}'
+        'apache-certificate': self.certificate_pem.decode(),
+        'apache-key': self.key_pem.decode(),
+        'authenticate-to-backend': 'True',
+        'backend-client-caucase-url': backend_client_caucase_url,
+        'backend-connect-retries': '1',
+        'backend-connect-timeout': '2',
+        'ciphers': 'ciphers',
+        'cluster-identification': 'testing partition 0',
+        'domain': 'example.com',
+        'enable-http2-by-default': 'True',
+        'extra_slave_instance_list': '[]',
+        'frontend-name': 'caddy-frontend-1',
+        'global-disable-http2': 'True',
+        'kedifa-caucase-url': kedifa_caucase_url,
+        'monitor-cors-domains': 'monitor.app.officejs.com',
+        'monitor-httpd-port': 8411,
+        'monitor-username': 'admin',
+        'mpm-graceful-shutdown-timeout': '2',
+        'plain_http_port': '11080',
+        'port': '11443',
+        'ram-cache-size': '512K',
+        're6st-verification-url': 're6st-verification-url',
+        'request-timeout': '100',
+        'slave-kedifa-information': '{}'
       },
       'caddy-frontend-2': {
         'X-software_release_url': self.frontend_2_sr,
-        u'apache-certificate': unicode(self.certificate_pem),
-        u'apache-key': unicode(self.key_pem),
-        u'authenticate-to-backend': u'True',
-        u'backend-client-caucase-url': backend_client_caucase_url,
-        u'backend-connect-retries': u'1',
-        u'backend-connect-timeout': u'2',
-        u'ciphers': u'ciphers',
-        u'cluster-identification': u'testing partition 0',
-        u'domain': u'example.com',
-        u'enable-http2-by-default': u'True',
-        u'extra_slave_instance_list': u'[]',
-        u'frontend-name': u'caddy-frontend-2',
-        u'global-disable-http2': u'True',
-        u'kedifa-caucase-url': kedifa_caucase_url,
-        u'monitor-cors-domains': u'monitor.app.officejs.com',
-        u'monitor-httpd-port': 8412,
-        u'monitor-username': u'admin',
-        u'mpm-graceful-shutdown-timeout': u'2',
-        u'plain_http_port': u'11080',
-        u'port': u'11443',
-        u'ram-cache-size': u'256K',
-        u're6st-verification-url': u're6st-verification-url',
-        u'request-timeout': u'100',
-        u'slave-kedifa-information': u'{}'
+        'apache-certificate': self.certificate_pem.decode(),
+        'apache-key': self.key_pem.decode(),
+        'authenticate-to-backend': 'True',
+        'backend-client-caucase-url': backend_client_caucase_url,
+        'backend-connect-retries': '1',
+        'backend-connect-timeout': '2',
+        'ciphers': 'ciphers',
+        'cluster-identification': 'testing partition 0',
+        'domain': 'example.com',
+        'enable-http2-by-default': 'True',
+        'extra_slave_instance_list': '[]',
+        'frontend-name': 'caddy-frontend-2',
+        'global-disable-http2': 'True',
+        'kedifa-caucase-url': kedifa_caucase_url,
+        'monitor-cors-domains': 'monitor.app.officejs.com',
+        'monitor-httpd-port': 8412,
+        'monitor-username': 'admin',
+        'mpm-graceful-shutdown-timeout': '2',
+        'plain_http_port': '11080',
+        'port': '11443',
+        'ram-cache-size': '256K',
+        're6st-verification-url': 're6st-verification-url',
+        'request-timeout': '100',
+        'slave-kedifa-information': '{}'
       },
       'caddy-frontend-3': {
         'X-software_release_url': self.frontend_3_sr,
-        u'apache-certificate': unicode(self.certificate_pem),
-        u'apache-key': unicode(self.key_pem),
-        u'authenticate-to-backend': u'True',
-        u'backend-client-caucase-url': backend_client_caucase_url,
-        u'backend-connect-retries': u'1',
-        u'backend-connect-timeout': u'2',
-        u'ciphers': u'ciphers',
-        u'cluster-identification': u'testing partition 0',
-        u'domain': u'example.com',
-        u'enable-http2-by-default': u'True',
-        u'extra_slave_instance_list': u'[]',
-        u'frontend-name': u'caddy-frontend-3',
-        u'global-disable-http2': u'True',
-        u'kedifa-caucase-url': kedifa_caucase_url,
-        u'monitor-cors-domains': u'monitor.app.officejs.com',
-        u'monitor-httpd-port': 8413,
-        u'monitor-username': u'admin',
-        u'mpm-graceful-shutdown-timeout': u'2',
-        u'plain_http_port': u'11080',
-        u'port': u'11443',
-        u're6st-verification-url': u're6st-verification-url',
-        u'request-timeout': u'100',
-        u'slave-kedifa-information': u'{}'
+        'apache-certificate': self.certificate_pem.decode(),
+        'apache-key': self.key_pem.decode(),
+        'authenticate-to-backend': 'True',
+        'backend-client-caucase-url': backend_client_caucase_url,
+        'backend-connect-retries': '1',
+        'backend-connect-timeout': '2',
+        'ciphers': 'ciphers',
+        'cluster-identification': 'testing partition 0',
+        'domain': 'example.com',
+        'enable-http2-by-default': 'True',
+        'extra_slave_instance_list': '[]',
+        'frontend-name': 'caddy-frontend-3',
+        'global-disable-http2': 'True',
+        'kedifa-caucase-url': kedifa_caucase_url,
+        'monitor-cors-domains': 'monitor.app.officejs.com',
+        'monitor-httpd-port': 8413,
+        'monitor-username': 'admin',
+        'mpm-graceful-shutdown-timeout': '2',
+        'plain_http_port': '11080',
+        'port': '11443',
+        're6st-verification-url': 're6st-verification-url',
+        'request-timeout': '100',
+        'slave-kedifa-information': '{}'
      },
      'kedifa': {
        'X-software_release_url': self.kedifa_sr,
-        u'caucase_port': u'15090',
-        u'cluster-identification': u'testing partition 0',
-        u'kedifa_port': u'15080',
-        u'monitor-cors-domains': u'monitor.app.officejs.com',
-        u'monitor-httpd-port': u'8402',
-        u'monitor-username': u'admin',
-        u'slave-list': []
+        'caucase_port': '15090',
+        'cluster-identification': 'testing partition 0',
+        'kedifa_port': '15080',
+        'monitor-cors-domains': 'monitor.app.officejs.com',
+        'monitor-httpd-port': '8402',
+        'monitor-username': 'admin',
+        'slave-list': []
      },
      'testing partition 0': {
        '-frontend-2-software-release-url': self.frontend_2_sr,
...
...
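Here the unicode() builtin, which does not exist on Python 3, is replaced by bytes.decode(); for ASCII PEM data the resulting text is the same either way. A rough equivalence, assuming certificate_pem is bytes:

certificate_pem = b'-----BEGIN CERTIFICATE-----\n'

# Python 2: unicode(certificate_pem) -> u'-----BEGIN CERTIFICATE-----\n'
# Python 3: certificate_pem.decode() ->  '-----BEGIN CERTIFICATE-----\n'
assert isinstance(certificate_pem.decode(), str)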
@@ -6663,8 +6680,8 @@ class TestPassedRequestParameter(HttpFrontendTestCase):
        '-sla-2-computer_guid': 'local',
        '-sla-3-computer_guid': 'local',
        'X-software_release_url': base_software_url,
-        'apache-certificate': unicode(self.certificate_pem),
-        'apache-key': unicode(self.key_pem),
+        'apache-certificate': self.certificate_pem.decode(),
+        'apache-key': self.key_pem.decode(),
        'authenticate-to-backend': 'True',
        'automatic-internal-backend-client-caucase-csr': 'False',
        'automatic-internal-kedifa-caucase-csr': 'False',
...
...
@@ -6819,7 +6836,7 @@ class TestSlaveHealthCheck(SlaveHttpFrontendTestCase, TestDataMixin):
   @classmethod
   def setUpAssertionDict(cls):
-    backend = urlparse.urlparse(cls.backend_url).netloc
+    backend = urllib.parse.urlparse(cls.backend_url).netloc
     cls.assertion_dict = {
       'health-check-disabled': """\
 backend _health-check-disabled-http
...
...
@@ -6904,14 +6921,14 @@ backend _health-check-default-http
       self.backend_url + slave_parameter_dict[
         'health-check-http-path'].strip('/'),
       headers={'X-Reply-Status-Code': '502'})
-    self.assertEqual(result.status_code, httplib.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
 
     def restoreBackend():
       result = requests.put(
         self.backend_url + slave_parameter_dict[
           'health-check-http-path'].strip('/'),
         headers={})
-      self.assertEqual(result.status_code, httplib.CREATED)
+      self.assertEqual(result.status_code, http.client.CREATED)
     self.addCleanup(restoreBackend)
 
     time.sleep(3)  # > health-check-timeout + health-check-interval
...
...
@@ -6961,15 +6978,15 @@ backend _health-check-default-http
       self.backend_url + slave_parameter_dict[
         'health-check-http-path'].strip('/'),
       headers={'X-Reply-Status-Code': '502'})
-    self.assertEqual(result.status_code, httplib.CREATED)
-    self.assertEqual(result.status_code, httplib.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
 
     def restoreBackend():
       result = requests.put(
         self.backend_url + slave_parameter_dict[
           'health-check-http-path'].strip('/'),
         headers={})
-      self.assertEqual(result.status_code, httplib.CREATED)
+      self.assertEqual(result.status_code, http.client.CREATED)
     self.addCleanup(restoreBackend)
     time.sleep(3)  # > health-check-timeout + health-check-interval
...
...
@@ -7011,7 +7028,7 @@ backend _health-check-default-http
       self.backend_url + slave_parameter_dict[
         'health-check-http-path'].strip('/'),
       headers={'X-Reply-Status-Code': '502'})
-    self.assertEqual(result.status_code, httplib.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
     time.sleep(3)  # > health-check-timeout + health-check-interval
...
...
@@ -7044,7 +7061,7 @@ backend _health-check-default-http
       self.backend_url + slave_parameter_dict[
         'health-check-http-path'].strip('/'),
       headers={'X-Reply-Status-Code': '502'})
-    self.assertEqual(result.status_code, httplib.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
     time.sleep(3)  # > health-check-timeout + health-check-interval
...
...
@@ -7073,7 +7090,7 @@ backend _health-check-default-http
       self.backend_url + slave_parameter_dict[
         'health-check-http-path'].strip('/'),
       headers={'X-Reply-Status-Code': '502'})
-    self.assertEqual(result.status_code, httplib.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
     time.sleep(3)  # > health-check-timeout + health-check-interval
...
...
@@ -7085,7 +7102,7 @@ backend _health-check-default-http
       der2pem(result.peercert))
     # as ssl proxy verification failed, service is unavailable
-    self.assertEqual(result.status_code, httplib.SERVICE_UNAVAILABLE)
+    self.assertEqual(result.status_code, http.client.SERVICE_UNAVAILABLE)
 
   def test_health_check_failover_url_ssl_proxy_missing(self):
     parameter_dict = self.assertSlaveBase(
...
...
@@ -7103,7 +7120,7 @@ backend _health-check-default-http
       self.backend_url + slave_parameter_dict[
         'health-check-http-path'].strip('/'),
       headers={'X-Reply-Status-Code': '502'})
-    self.assertEqual(result.status_code, httplib.CREATED)
+    self.assertEqual(result.status_code, http.client.CREATED)
     time.sleep(3)  # > health-check-timeout + health-check-interval
...
...
@@ -7115,7 +7132,7 @@ backend _health-check-default-http
       der2pem(result.peercert))
     # as ssl proxy verification failed, service is unavailable
-    self.assertEqual(result.status_code, httplib.SERVICE_UNAVAILABLE)
+    self.assertEqual(result.status_code, http.client.SERVICE_UNAVAILABLE)
 
 if __name__ == '__main__':
...
...
@@ -7130,5 +7147,5 @@ if __name__ == '__main__':
   url_template = 'http://%s:%s/'
   server = klass((ip, port), TestHandler)
-  print url_template % server.server_address[:2]
+  print((url_template % server.server_address[:2]))
   server.serve_forever()
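The last hunk in test.py converts the Python 2 print statement in the standalone test-server helper to the print() function, the only form Python 3 accepts; the extra parentheses keep it valid on both interpreters:

server_address = ('127.0.0.1', 8080)        # placeholder address
url_template = 'http://%s:%s/'

# Python 2 statement form was:  print url_template % server_address[:2]
print((url_template % server_address[:2]))  # works on Python 2 and 3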
software/slapos-sr-testing/software-py3.cfg
...
...
@@ -13,6 +13,7 @@ extra-eggs +=
 [template]
 extra =
 # The following list is for SR whose buildout runs only with Python 3.
+  caddy-frontend ${slapos.test.caddy-frontend-setup:setup}
   erp5testnode ${slapos.test.erp5testnode-setup:setup}
   galene ${slapos.test.galene-setup:setup}
   headless-chromium ${slapos.test.headless-chromium-setup:setup}
...
...
software/slapos-sr-testing/software.cfg
...
...
@@ -359,7 +359,6 @@ extra =
 # You should not add more lines here.
   backupserver ${slapos.test.backupserver-setup:setup}
   beremiz-ide ${slapos.test.beremiz-ide-setup:setup}
-  caddy-frontend ${slapos.test.caddy-frontend-setup:setup}
   caucase ${slapos.test.caucase-setup:setup}
   cloudooo ${slapos.test.cloudooo-setup:setup}
   dream ${slapos.test.dream-setup:setup}
...
...
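Taken together, the two buildout hunks move the caddy-frontend test software release out of the Python 2 list in software/slapos-sr-testing/software.cfg and into the Python 3-only list in software-py3.cfg, matching the Python 3 port of software/caddy-frontend/test/test.py above.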