Xavier Thompson / slapos.buildout / Commits / 3b16f5af

Commit 3b16f5af, authored May 07, 2024 by Xavier Thompson

zc.buildout 3.0.1+slapos001: Rebase on zc.buildout 3.0.1

See merge request !30
Parents: ac3f5e4c, d8f72f75

Showing 29 changed files with 2516 additions and 832 deletions (+2516 -832)
doc/getting-started.rst                              +34   -3
doc/reference.rst                                    +13   -0
setup.py                                             +2    -2
src/zc/buildout/__init__.py                          +13   -0
src/zc/buildout/buildout.py                          +745  -373
src/zc/buildout/configparser.py                      +12   -3
src/zc/buildout/download.py                          +154  -72
src/zc/buildout/easy_install.py                      +321  -255
src/zc/buildout/rmtree.py                            +79   -13
src/zc/buildout/testing.py                           +23   -0
src/zc/buildout/tests/__init__.py                    +39   -0
src/zc/buildout/tests/allow-unknown-extras.txt       +1    -0
src/zc/buildout/tests/allowhosts.txt                 +4    -0
src/zc/buildout/tests/buildout.txt                   +117  -6
src/zc/buildout/tests/dependencylinks.txt            +2    -0
src/zc/buildout/tests/download.txt                   +76   -15
src/zc/buildout/tests/downloadcache.txt              +9    -6
src/zc/buildout/tests/easy_install.txt               +22   -16
src/zc/buildout/tests/extends-cache.txt.disabled     +2    -2
src/zc/buildout/tests/repeatable.txt                 +1    -0
src/zc/buildout/tests/test_all.py                    +533  -21
zc.recipe.egg_/setup.py                              +1    -1
zc.recipe.egg_/src/zc/recipe/egg/README.rst          +13   -0
zc.recipe.egg_/src/zc/recipe/egg/api.rst             +2    -2
zc.recipe.egg_/src/zc/recipe/egg/custom.py           +89   -39
zc.recipe.egg_/src/zc/recipe/egg/custom.rst          +26   -1
zc.recipe.egg_/src/zc/recipe/egg/egg.py              +39   -2
zc.recipe.egg_/src/zc/recipe/egg/patches.rst         +124  -0
zc.recipe.egg_/src/zc/recipe/egg/tests.py            +20   -0
doc/getting-started.rst

@@ -28,11 +28,15 @@ The recommended way to install Buildout is to use pip within a virtual environment

 .. code-block:: console

-   virtualenv mybuildout
-   cd mybuildout
-   bin/pip install zc.buildout
+   python3 -m venv myenv
+   source myenv/bin/activate
+   pip install zc.buildout
+
+Or for the code from the master branch:
+
+.. code-block:: console
+
+   pip install https://lab.nexedi.com/nexedi/slapos.buildout/-/archive/master/slapos.buildout-master.tar.gz

 To use Buildout, you need to provide a Buildout configuration. Here is
 a minimal configuration:

@@ -98,6 +102,33 @@ specified using *parts*. The parts to be built are listed in the
 name that specifies the software to build the part and provides
 parameters to control how the part is built.

+Bootstrapping an isolated environment
+=====================================
+
+Sometimes it is useful to install ``zc.buildout`` and its dependencies
+directly in the ``eggs`` directory and to generate a ``buildout`` script
+in the ``bin`` directory that uses the version in the ``eggs`` directory,
+instead of relying on the package available in the environment.
+
+One way to achieve this uses the ``extra-paths`` option of the
+``buildout`` section: by setting it to an empty value, packages outside
+of the ``eggs`` or ``develop-eggs`` directories will not be considered
+when looking for already installed eggs. Then the ``bootstrap`` command
+will install ``zc.buildout`` and its dependencies from scratch in
+``eggs``.
+
+.. code-block:: console
+
+   buildout buildout:extra-paths= bootstrap
+
+After this, the generated ``bin/buildout`` script will use the packages
+installed in the ``eggs`` directory instead of those in the environment
+and preserve the isolation from the environment, even without setting
+``extra-paths``. That is because the default value for ``extra-paths``
+only considers the paths where ``zc.buildout`` and its dependencies are
+found, and in this case that is only the ``eggs`` directory.
+
 Installing software
 ===================
doc/reference.rst

@@ -358,6 +358,19 @@ extends-cache
 substitutions, and the result is a relative path, then it will be
 interpreted relative to the buildout directory.)

+.. _extra-paths-buildout-option:
+
+extra-paths, default: 'zc.buildout'
+    Extra paths to scan for already installed distributions.
+    Setting this to an empty value enables isolation of buildout.
+    Setting this to 'legacy' enables the legacy behavior of
+    scanning the paths of the distributions of zc.buildout itself
+    and its dependencies, which may contain site-packages or not.
+    Setting this to 'zc.buildout' also scans the paths of the
+    current zc.buildout and dependencies, but respects the order
+    they appear in sys.path, avoiding unexpected results.
+
 .. _find-links-option:

 find-links, default: ''
setup.py

@@ -12,7 +12,7 @@
 #
 ##############################################################################

 name = "zc.buildout"
-version = '3.0.1'
+version = '3.0.1+slapos001'

 import os
 from setuptools import setup

@@ -47,7 +47,7 @@ setup(
     python_requires = '>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',
     namespace_packages = ['zc'],
     install_requires = [
-        'setuptools>=8.0',
+        'setuptools>=38.2.3',
         'pip',
         'wheel',
     ],
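The version bump above attaches a PEP 440 local version label (`+slapos001`) to the upstream version, marking a downstream patch set while the public part stays comparable to upstream `3.0.1`. A minimal sketch of how such a label splits off; the helper `split_local_version` is hypothetical, not part of this codebase:

```python
def split_local_version(version):
    """Split a PEP 440 version string into (public, local) parts.

    The local part is everything after the first '+', or None when
    the version carries no local label.
    """
    public, _, local = version.partition('+')
    return public, local or None

print(split_local_version('3.0.1+slapos001'))  # ('3.0.1', 'slapos001')
print(split_local_version('3.0.1'))            # ('3.0.1', None)
```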
src/zc/buildout/__init__.py

@@ -49,3 +49,16 @@ class UserError(Exception):

     def __str__(self):
         return " ".join(map(str, self.args))
+
+# Used for Python 2-3 compatibility
+if str is bytes:  # BBB Py2
+    bytes2str = str2bytes = lambda s: s
+    def unicode2str(s):
+        return s.encode('utf-8')
+else:
+    def bytes2str(s):
+        return s.decode()
+    def str2bytes(s):
+        return s.encode()
+    def unicode2str(s):
+        return s
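The Python 3 branch of the new compatibility helpers behaves as below (on Python 2, where `str is bytes`, `bytes2str` and `str2bytes` are the identity function). This is a standalone restatement for illustration, not the module itself:

```python
# Python 3 behavior of the compatibility helpers added in this hunk.
def bytes2str(s):
    # e.g. turning subprocess output into text
    return s.decode()

def str2bytes(s):
    # e.g. preparing text for a binary file handle
    return s.encode()

def unicode2str(s):
    # on Python 3, unicode text already is str
    return s

print(bytes2str(b'buildout'))   # 'buildout'
print(str2bytes('buildout'))    # b'buildout'
```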
src/zc/buildout/buildout.py

@@ -29,10 +29,21 @@ try:
 except ImportError:
     from UserDict import DictMixin

+try:
+    from cStringIO import StringIO
+except ImportError:
+    from io import StringIO
+
+try:
+    from urllib.parse import urljoin
+except ImportError:  # BBB Py2
+    from urlparse import urljoin
+
 import zc.buildout.configparser
 import copy
 import datetime
 import distutils.errors
+import errno
 import glob
 import importlib
 import inspect
@@ -45,8 +56,10 @@ import shutil
 import subprocess
 import sys
 import tempfile
+import pprint
 import zc.buildout
 import zc.buildout.download
+from functools import partial

 PY3 = sys.version_info[0] == 3
 if PY3:
@@ -82,6 +95,66 @@ def print_(*args, **kw):
         file = sys.stdout
     file.write(sep.join(map(str, args)) + end)

+_MARKER = []
+
+class BuildoutSerialiser(object):
+    # XXX: I would like to access pprint._safe_repr, but it's not
+    # officially available. PrettyPrinter class has a functionally-speaking
+    # static method "format" which just calls _safe_repr, but it is not
+    # declared as static... So I must create an instance of it.
+    _format = pprint.PrettyPrinter().format
+    _dollar = '\\x%02x' % ord('$')
+    _semicolon = '\\x%02x' % ord(';')
+    _safe_globals = {'__builtins__': {
+        # Types which are represented as calls to their constructor.
+        'bytearray': bytearray,
+        'complex': complex,
+        'frozenset': frozenset,
+        'set': set,
+        # Those builtins are available through keywords, which allow creating
+        # instances which in turn give back access to classes. So no point in
+        # hiding them.
+        'dict': dict,
+        'list': list,
+        'str': str,
+        'tuple': tuple,
+        'False': False,
+        'True': True,
+        'None': None,
+    }}
+
+    def loads(self, value):
+        return eval(value, self._safe_globals)
+
+    def dumps(self, value):
+        value, isreadable, _ = self._format(value, {}, 0, 0)
+        if not isreadable:
+            raise ValueError('Value cannot be serialised: %s' % (value, ))
+        return value.replace('$', self._dollar).replace(';', self._semicolon)
+
+SERIALISED_VALUE_MAGIC = '!py'
+SERIALISED = re.compile(SERIALISED_VALUE_MAGIC + '([^!]*)!(.*)')
+SERIALISER_REGISTRY = {
+    '': BuildoutSerialiser(),
+}
+SERIALISER_VERSION = ''
+SERIALISER = SERIALISER_REGISTRY[SERIALISER_VERSION]
+# Used only to compose data
+SERIALISER_PREFIX = SERIALISED_VALUE_MAGIC + SERIALISER_VERSION + '!'
+assert SERIALISED.match(SERIALISER_PREFIX).groups() == (
+    SERIALISER_VERSION, ''), SERIALISED.match(SERIALISER_PREFIX).groups()
+
+def dumps(value):
+    orig_value = value
+    value = SERIALISER.dumps(value)
+    assert SERIALISER.loads(value) == orig_value, (repr(value), orig_value)
+    return SERIALISER_PREFIX + value
+
+def loads(value):
+    assert value.startswith(SERIALISED_VALUE_MAGIC), repr(value)
+    version, data = SERIALISED.match(value).groups()
+    return SERIALISER_REGISTRY[version].loads(data)
+
 realpath = zc.buildout.easy_install.realpath

 _isurl = re.compile('([a-zA-Z0-9+.-]+)://').match
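A condensed, standalone sketch of the round-trip idea in the serialiser hunk: the value is rendered with `pprint`, unreadable values are rejected, `$` and `;` (meaningful to buildout option parsing) are escaped, and a versioned `!py...!` magic prefix is attached so `loads` can dispatch and `eval` the payload under restricted globals. Names are simplified from the hunk; this is an illustration, not the module code:

```python
import pprint
import re

MAGIC = '!py'
PATTERN = re.compile(MAGIC + '([^!]*)!(.*)')

def dumps(value):
    # pprint's format() returns (text, isreadable, isrecursive)
    text, isreadable, _ = pprint.PrettyPrinter().format(value, {}, 0, 0)
    if not isreadable:
        raise ValueError('Value cannot be serialised: %s' % (text,))
    # escape characters meaningful to buildout option parsing
    text = text.replace('$', '\\x%02x' % ord('$'))
    text = text.replace(';', '\\x%02x' % ord(';'))
    return MAGIC + '!' + text  # empty serialiser version between '!py' and '!'

def loads(value):
    version, data = PATTERN.match(value).groups()
    assert version == ''  # only one serialiser version registered
    # eval under restricted globals, exposing only literal-building names
    return eval(data, {'__builtins__': {
        'set': set, 'dict': dict, 'list': list, 'tuple': tuple,
        'True': True, 'False': False, 'None': None}})

print(loads(dumps([1, 'two', {'three': 3}])))  # [1, 'two', {'three': 3}]
```

The escape step is safe because `pprint` emits string contents inside quoted literals, so replacing `$` with the `\x24` escape sequence keeps the literal valid and evaluates back to the original character.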
@@ -259,13 +332,22 @@ def _print_annotate(data, verbose, chosen_sections, basedir):
             sectionkey = data[section][key]
             sectionkey.printAll(key, basedir, verbose)

+def _remove_ignore_missing(path):
+    try:
+        os.remove(path)
+    except OSError as e:
+        if e.errno != errno.ENOENT:
+            raise
+
 def _unannotate_section(section):
-    for key in section:
-        section[key] = section[key].value
-    return section
+    return {key: entry.value for key, entry in section.items()}

 def _unannotate(data):
-    for key in data:
-        _unannotate_section(data[key])
-    return data
+    return {key: _unannotate_section(section)
+            for key, section in data.items()}

 def _format_picked_versions(picked_versions, required_by):
@@ -292,6 +374,7 @@ _buildout_default_options = _annotate_section({
     'develop-eggs-directory': 'develop-eggs',
     'eggs-directory': 'eggs',
     'executable': sys.executable,
+    'extra-paths': 'zc.buildout',
     'find-links': '',
     'install-from-cache': 'false',
     'installed': '.installed.cfg',
@@ -299,6 +382,7 @@ _buildout_default_options = _annotate_section({
     'log-level': 'INFO',
     'newest': 'true',
     'offline': 'false',
+    'dry-run': 'false',
     'parts-directory': 'parts',
     'prefer-final': 'true',
     'python': 'buildout',
@@ -316,11 +400,16 @@ def _get_user_config():
     return os.path.join(buildout_home, 'default.cfg')

+networkcache_client = None
+
 @commands
 class Buildout(DictMixin):

     COMMANDS = set()
+    installed_part_options = None

     def __init__(self, config_file, cloptions,
                  use_user_defaults=True,
                  command=None, args=()):
@@ -355,54 +444,24 @@ class Buildout(DictMixin):
         data['buildout']['directory'] = SectionKey(
             os.path.dirname(config_file), 'COMPUTED_VALUE')

-        cloptions = dict(
-            (section, dict((option, SectionKey(value, 'COMMAND_LINE_VALUE'))
-                           for (_, option, value) in v))
-            for (section, v) in itertools.groupby(sorted(cloptions),
-                                                  lambda v: v[0])
-        )
-        override = copy.deepcopy(cloptions.get('buildout', {}))
-
-        # load user defaults, which override defaults
-        user_config = _get_user_config()
-        if use_user_defaults and os.path.exists(user_config):
-            download_options = data['buildout']
-            user_defaults, _ = _open(
-                os.path.dirname(user_config), user_config, [],
-                download_options, override, set(), {}
-            )
-            for_download_options = _update(data, user_defaults)
-        else:
-            user_defaults = {}
-            for_download_options = copy.deepcopy(data)
-
-        # load configuration files
-        if config_file:
-            download_options = for_download_options['buildout']
-            cfg_data, _ = _open(
-                os.path.dirname(config_file), config_file, [],
-                download_options, override, set(), user_defaults
-            )
-            data = _update(data, cfg_data)
-
-        # extends from command-line
-        if 'buildout' in cloptions:
-            cl_extends = cloptions['buildout'].pop('extends', None)
-            if cl_extends:
-                for extends in cl_extends.value.split():
-                    download_options = for_download_options['buildout']
-                    cfg_data, _ = _open(
-                        os.path.dirname(extends), os.path.basename(extends),
-                        [], download_options, override, set(), user_defaults
-                    )
-                    data = _update(data, cfg_data)
-
-        # apply command-line options
-        data = _update(data, cloptions)
+        result = {}
+        for section, option, value in cloptions:
+            result.setdefault(section, {})[option] = value
+        options = result.setdefault('buildout', {})
+        extends = []
+        user_config = _get_user_config()
+        if use_user_defaults and os.path.exists(user_config):
+            extends.append(user_config)
+        if config_file:
+            extends.append(config_file)
+        clextends = options.get('extends')
+        if clextends:
+            extends.append(clextends)
+        options['extends'] = '\n'.join(extends)
+        data = _extends(data, result, os.getcwd(), 'COMMAND_LINE_VALUE')

         # Set up versions section, if necessary
         if 'versions' not in data['buildout']:
@@ -418,15 +477,17 @@ class Buildout(DictMixin):
         else:
             versions = {}
         versions.update(
-            dict((k, SectionKey(v, 'DEFAULT_VALUE'))
+            dict((k, SectionKey(v(), 'DEFAULT_VALUE'))
+                # Use lambdas to compute values only if needed
                 for (k, v) in (
                     # Prevent downgrading due to prefer-final:
                     ('zc.buildout',
-                     '>=' + pkg_resources.working_set.find(
+                     # Skip local part because ">=x.y.z+abc" is invalid
+                     lambda: '>=' + pkg_resources.working_set.find(
                          pkg_resources.Requirement.parse('zc.buildout')
-                     ).version),
+                     ).parsed_version.public),
                     # Use 2, even though not final
-                    ('zc.recipe.egg', '>=2.0.6'),
+                    ('zc.recipe.egg', lambda: '>=2.0.6'),
                 )
                 if k not in versions
             ))
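The lambdas defer computing the default pins until they are actually needed, and the zc.buildout pin now strips the PEP 440 local part, since a specifier like `>=3.0.1+slapos001` is invalid. A sketch with plain strings; `public_version` is a hypothetical helper standing in for `parsed_version.public`:

```python
def public_version(version):
    """Drop the PEP 440 local label, e.g. '3.0.1+slapos001' -> '3.0.1'."""
    return version.partition('+')[0]

pins = {}
defaults = (
    # lambdas: values are only computed when the pin is actually missing
    ('zc.buildout', lambda: '>=' + public_version('3.0.1+slapos001')),
    ('zc.recipe.egg', lambda: '>=2.0.6'),
)
pins.update((k, v()) for (k, v) in defaults if k not in pins)
print(pins['zc.buildout'])  # '>=3.0.1'
```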
@@ -440,6 +501,9 @@ class Buildout(DictMixin):
             sectionkey = data['buildout'][name]
             origdir = sectionkey.value
             src = sectionkey.source
+            if not origdir:
+                del data['buildout'][name]
+                continue
             if '${' in origdir:
                 continue
             if not os.path.isabs(origdir):
@@ -468,6 +532,9 @@ class Buildout(DictMixin):
         self._raw = _unannotate(data)
         self._data = {}
         self._parts = []
+        self._initializing = []
+        self._signature_cache = {}
+        self._default_requirement = None

         # provide some defaults before options are parsed
         # because while parsing options those attributes might be
@@ -502,6 +569,7 @@ class Buildout(DictMixin):
         self.newest = ((not self.offline) and
                        bool_option(buildout_section, 'newest')
                        )
+        self.dry_run = (buildout_section['dry-run'] == 'true')

         ##################################################################
         ## WARNING!!!
@@ -533,6 +601,42 @@ class Buildout(DictMixin):
         options['installed'] = os.path.join(options['directory'],
                                             options['installed'])

+        # Extra paths to scan for already installed distributions.
+        extra_paths = options['extra-paths']
+        if extra_paths == 'sys.path':
+            # special case: sys.path
+            extra_paths = sys.path
+            options['extra-paths'] = ' '.join(extra_paths)
+        elif extra_paths == 'legacy':
+            # special case: legacy behavior
+            # this case is why this is done before setting easy_install
+            # versions and other options, to get the legacy behavior.
+            # XXX: These 'sorted' calls correspond to the original behavior,
+            # but they are quite problematic, as other distributions for
+            # zc.buildout, pip, wheel and setuptools may take precedence
+            # over the ones currently running.
+            old_extra_paths = zc.buildout.easy_install.extra_paths(
+                sorted({d.location for d in pkg_resources.working_set}))
+            try:
+                buildout_and_setuptools_dists = list(
+                    zc.buildout.easy_install.install(
+                        ['zc.buildout'], None, check_picked=False))
+            finally:
+                zc.buildout.easy_install.extra_paths(old_extra_paths)
+            extra_paths = sorted(
+                {d.location for d in buildout_and_setuptools_dists})
+            options['extra-paths'] = ' '.join(extra_paths)
+        elif extra_paths == 'zc.buildout':
+            # special case: only zc.buildout and its dependencies
+            # but in the order they appear in sys.path, unlike legacy
+            buildout_dists = pkg_resources.require('zc.buildout')
+            buildout_paths = {d.location for d in buildout_dists}
+            extra_paths = [p for p in sys.path if p in buildout_paths]
+            options['extra-paths'] = ' '.join(extra_paths)
+        else:
+            extra_paths = extra_paths.split()
+        zc.buildout.easy_install.extra_paths(extra_paths)
+
         self._setup_logging()
         self._setup_socket_timeout()
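The `'zc.buildout'` branch above keeps only the locations of zc.buildout and its dependencies, but preserves their `sys.path` order instead of sorting them as the legacy branch does. A self-contained sketch of that filtering step, with plain lists standing in for `sys.path` and dist locations (all paths here are made up):

```python
# Hypothetical paths: what sys.path might look like in a virtualenv where
# zc.buildout and setuptools live under an eggs directory.
sys_path = ['/venv/bin', '/eggs/zc.buildout', '/eggs/setuptools',
            '/venv/lib/site-packages']

# Locations of zc.buildout and its dependencies (a set, so order is lost) -
# the list comprehension restores sys.path order.
buildout_paths = {'/eggs/setuptools', '/eggs/zc.buildout'}

extra_paths = [p for p in sys_path if p in buildout_paths]
print(extra_paths)  # ['/eggs/zc.buildout', '/eggs/setuptools']
```

Keeping `sys.path` order matters because earlier entries shadow later ones when distributions are looked up, which is exactly the "avoiding unexpected results" point made in the reference documentation.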
@@ -598,6 +702,19 @@ class Buildout(DictMixin):
         os.chdir(options['directory'])

+        networkcache_section_name = options.get('networkcache-section')
+        if networkcache_section_name:
+            networkcache_section = self[networkcache_section_name]
+            try:
+                from slapos.libnetworkcache import NetworkcacheClient
+                global networkcache_client
+                networkcache_client = NetworkcacheClient(networkcache_section)
+            except ImportError:
+                pass
+            except Exception:
+                self._logger.exception(
+                    "Failed to setup Networkcache. Continue without.")
+
     def _buildout_path(self, name):
         if '${' in name:
             return name
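The networkcache hookup follows an optional-dependency pattern: the feature activates only if `slapos.libnetworkcache` is importable, and any other setup failure is logged and ignored so buildout continues without the cache. A generic sketch of that shape, with a hypothetical `client_factory` standing in for the optional import:

```python
import logging

logger = logging.getLogger('buildout-sketch')
networkcache_client = None  # module-level singleton, as in the hunk

def setup_networkcache(section, client_factory=None):
    """Enable the cache only when the optional dependency is available."""
    global networkcache_client
    if client_factory is None:
        # stand-in for ImportError: dependency absent, silently continue
        return
    try:
        networkcache_client = client_factory(section)
    except Exception:
        # any other failure: log and continue without the cache
        logger.exception("Failed to setup Networkcache. Continue without.")

# dependency "available": the factory builds a client from the section
setup_networkcache({'url': 'http://example'},
                   client_factory=lambda section: dict(section))
print(networkcache_client)  # {'url': 'http://example'}
```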
@@ -607,44 +724,86 @@ class Buildout(DictMixin):
     def bootstrap(self, args):
         __doing__ = 'Bootstrapping.'
-        if os.path.exists(self['buildout']['develop-eggs-directory']):
+        if os.path.isdir(self['buildout']['develop-eggs-directory']):
             rmtree(self['buildout']['develop-eggs-directory'])
+            self._logger.debug("Removed existing develop-eggs directory")

         self._setup_directories()

-        # Now copy buildout and setuptools eggs, and record destination eggs:
-        entries = []
-        for dist in zc.buildout.easy_install.buildout_and_setuptools_dists:
+        # Hack: propagate libnetworkcache soft dependency
+        specs = ['zc.buildout']
+        try:
+            import slapos.libnetworkcache
+            specs.append('slapos.libnetworkcache')
+        except ImportError:
+            pass
+
+        # Install buildout and dependent eggs following pinned versions.
+        dest = self['buildout']['eggs-directory']
+        path = [self['buildout']['develop-eggs-directory']]
+        if self.offline:
+            # Cannot install: just check requirements are already met
+            path.append(dest)
+            dest = None
+        ws = zc.buildout.easy_install.install(
+            specs, dest,
+            links=self._links,
+            index=self['buildout'].get('index'),
+            path=path,
+            newest=self.newest,
+            allow_hosts=self._allow_hosts,
+        )
+        # If versions aren't pinned or if current modules match,
+        # nothing will be installed, but then we'll copy them to
+        # the local eggs or develop-eggs folder just after this.
+        # XXX Note: except if the current modules are not eggs, in which case
+        # we'll create .egg-link to them. This applies to packages installed
+        # in site-packages by pip (.dist-info, not .egg), which in turn would
+        # cause site-packages to be in the sys.path of the generated script.
+
+        # Sort the working set to keep entries with single dists first.
+        options = self['buildout']
+        buildout_dir = options['directory']
+        eggs_dir = options['eggs-directory']
+        develop_eggs_dir = options['develop-eggs-directory']
+        ws = zc.buildout.easy_install.sort_working_set(
+            ws,
+            buildout_dir=buildout_dir,
+            eggs_dir=eggs_dir,
+            develop_eggs_dir=develop_eggs_dir,
+        )
+
+        # Now copy buildout and setuptools eggs, and record destination eggs.
+        # XXX Note: dists using .dist-info format - e.g. packages installed by
+        # pip in site-packages - will be seen as develop dists and not copied.
+        egg_entries = []
+        link_dists = []
+        for dist in ws:
             if dist.precedence == pkg_resources.DEVELOP_DIST:
                 dest = os.path.join(self['buildout']['develop-eggs-directory'],
                                     dist.key + '.egg-link')
                 with open(dest, 'w') as fh:
                     fh.write(dist.location)
-                entries.append(dist.location)
+                link_dists.append(dist)
             else:
                 dest = os.path.join(self['buildout']['eggs-directory'],
                                     os.path.basename(dist.location))
-                entries.append(dest)
+                egg_entries.append(dest)
                 if not os.path.exists(dest):
                     if os.path.isdir(dist.location):
                         shutil.copytree(dist.location, dest)
                     else:
                         shutil.copy2(dist.location, dest)

-        # Create buildout script
-        ws = pkg_resources.WorkingSet(entries)
+        # Recreate a working set with the potentially-new paths after copying.
+        # We keep the eggs dists first since we know their locations contain a
+        # single dist. We add the other dists manually to avoid activating any
+        # unneeded dists at the same location, and we can because these are
+        # the same dists as before as they were not copied.
+        ws = pkg_resources.WorkingSet(egg_entries)
+        for dist in link_dists:
+            ws.add(dist)
         ws.require('zc.buildout')
-        options = self['buildout']
-        eggs_dir = options['eggs-directory']
-        develop_eggs_dir = options['develop-eggs-directory']
-        ws = zc.buildout.easy_install.sort_working_set(
-            ws, eggs_dir=eggs_dir, develop_eggs_dir=develop_eggs_dir)
+        # Create buildout script
         zc.buildout.easy_install.scripts(
             ['zc.buildout'], ws, sys.executable,
             options['bin-directory'],
@@ -692,6 +851,19 @@ class Buildout(DictMixin):
     @command
     def install(self, install_args):
+        try:
+            self._install_parts(install_args)
+        finally:
+            if self.installed_part_options is not None:
+                try:
+                    self._save_installed_options()
+                finally:
+                    del self.installed_part_options
+        if self.show_picked_versions or self.update_versions_file:
+            self._print_picked_versions()
+        self._unload_extensions()
+
+    def _install_parts(self, install_args):
         __doing__ = 'Installing.'

         self._load_extensions()
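The new `install()` wrapper guarantees that the recorded part options are saved exactly once and the attribute is cleared, whatever `_install_parts` does. A minimal stand-alone model of that nested try/finally shape; the `Runner` class and `events` list are illustrative only:

```python
events = []

class Runner(object):
    installed_part_options = None  # class-level default, as in the hunk

    def _install_parts(self):
        self.installed_part_options = {'buildout': {}}
        raise RuntimeError('install failed')  # simulate a failing part

    def _save_installed_options(self):
        events.append('saved')

    def install(self):
        try:
            self._install_parts()
        finally:
            # save once, then clear, even when installation raised
            if self.installed_part_options is not None:
                try:
                    self._save_installed_options()
                finally:
                    del self.installed_part_options

try:
    Runner().install()
except RuntimeError:
    events.append('raised')

print(events)  # ['saved', 'raised']
```

The point of the shape: `.installed.cfg` is written even on failure, so the parts installed before the error are not forgotten on the next run.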
@@ -705,8 +877,8 @@ class Buildout(DictMixin):
         self._maybe_upgrade()

         # load installed data
-        (installed_part_options, installed_exists
-         ) = self._read_installed_part_options()
+        installed_part_options = self._read_installed_part_options()
+        installed_parts = installed_part_options['buildout']['parts'].split()

         # Remove old develop eggs
         self._uninstall(
@@ -719,21 +891,15 @@ class Buildout(DictMixin):
         installed_part_options['buildout']['installed_develop_eggs'
                                            ] = installed_develop_eggs

-        if installed_exists:
-            self._update_installed(
-                installed_develop_eggs=installed_develop_eggs)
-
-        # get configured and installed part lists
-        conf_parts = self['buildout']['parts']
-        conf_parts = conf_parts and conf_parts.split() or []
-        installed_parts = installed_part_options['buildout']['parts']
-        installed_parts = installed_parts and installed_parts.split() or []
+        # From now, the caller will update the .installed.cfg at return.
+        self.installed_part_options = installed_part_options

+        install_parts = self['buildout']['parts']
         if install_args:
             install_parts = install_args
             uninstall_missing = False
         else:
-            install_parts = conf_parts
+            install_parts = install_parts.split()
             uninstall_missing = True

         # load and initialize recipes
...
@@ -750,68 +916,79 @@ class Buildout(DictMixin):
...
@@ -750,68 +916,79 @@ class Buildout(DictMixin):
_save_options
(
section
,
self
[
section
],
sys
.
stdout
)
_save_options
(
section
,
self
[
section
],
sys
.
stdout
)
print_
()
print_
()
del
self
.
_signature_cache
# compute new part recipe signatures
self
.
_compute_part_signatures
(
install_parts
)
# uninstall parts that are no-longer used or who's configs
# uninstall parts that are no-longer used or who's configs
# have changed
# have changed
if
self
.
_logger
.
getEffectiveLevel
()
<
logging
.
DEBUG
:
reinstall_reason_score
=
-
1
elif
int
(
os
.
getenv
(
'BUILDOUT_INFO_REINSTALL_REASON'
)
or
1
):
# We rely on the fact that installed_parts is sorted according to
            # dependencies (unless install_args). This is not the case of
            # installed_parts.
            reinstall_reason_score = len(installed_parts)
        else:
            # Provide a way to disable in tests
            # or we'd have to update all recipe eggs.
            reinstall_reason_score = 0
        reinstall_reason = None
        for part in reversed(installed_parts):
-           if part in install_parts:
+           try:
+               part_index = install_parts.index(part)
+           except ValueError:
+               if not uninstall_missing:
+                   continue
+           else:
                old_options = installed_part_options[part].copy()
                installed_files = old_options.pop('__buildout_installed__')
-               new_options = self.get(part)
+               new_options = self.get(part).copy()
                if old_options == new_options:
                    # The options are the same, but are all of the
                    # installed files still there? If not, we should
                    # reinstall.
                    if not installed_files:
                        continue
-                   for f in installed_files.split('\n'):
-                       if not os.path.exists(self._buildout_path(f)):
+                   for installed_path in installed_files.split('\n'):
+                       if not os.path.exists(
+                               self._buildout_path(installed_path)):
                            break
                    else:
                        continue
+               else:
+                   installed_path = None
-               # output debugging info
-               if self._logger.getEffectiveLevel() < logging.DEBUG:
-                   for k in old_options:
-                       if k not in new_options:
-                           self._logger.debug(
-                               "Part %s, dropped option %s.", part, k)
-                       elif old_options[k] != new_options[k]:
-                           self._logger.debug(
-                               "Part %s, option %s changed:\n%r != %r",
-                               part, k, new_options[k], old_options[k],
-                               )
-                   for k in new_options:
-                       if k not in old_options:
-                           self._logger.debug(
-                               "Part %s, new option %s.", part, k)
+               if part_index < reinstall_reason_score:
+                   reinstall_reason_score = part_index
+                   reinstall_reason = (
+                       part, old_options, new_options, installed_path)
+               elif reinstall_reason_score < 0:
+                   self._log_reinstall_reason(
+                       logging.DEBUG,
+                       part, old_options, new_options, installed_path)
-           elif not uninstall_missing:
-               continue
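The unchanged-options branch above relies on Python's for/else: the loop scans installed files and `break`s out as soon as one is missing; the `else` clause only runs when no `break` occurred, i.e. when everything is still in place. A minimal standalone sketch of that pattern (function name and `exists` parameter are hypothetical, not from the source):

```python
import os

def first_missing(paths, exists=os.path.exists):
    """Return the first path that no longer exists, else None.

    Mirrors the for/else scan buildout uses to decide whether a part
    whose options are unchanged must still be reinstalled because one
    of its installed files disappeared.
    """
    for path in paths:
        if not exists(path):
            # 'break' (here: return) skips the else clause.
            return path
    else:
        # The else of a for loop runs only when no break occurred.
        return None
```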
            self._uninstall_part(part, installed_part_options)
            installed_parts = [p for p in installed_parts if p != part]
-           if installed_exists:
-               installed_part_options['buildout']['parts'] = (
-                   ' '.join(installed_parts))
+           self._update_installed(parts=' '.join(installed_parts))
+       if reinstall_reason:
+           self._log_reinstall_reason(logging.INFO, *reinstall_reason)
        # Check for unused buildout options:
        _check_for_unused_options_in_section(self, 'buildout')

        # install new parts
+       all_installed_paths = {}
        for part in install_parts:
            signature = self[part].pop('__buildout_signature__')
            saved_options = self[part].copy()
            recipe = self[part].recipe
            if part in installed_parts: # update
-               need_to_save_installed = False
                __doing__ = 'Updating %s.', part
                self._logger.info(*__doing__)
+               if self.dry_run:
+                   continue
                old_options = installed_part_options[part]
-               old_installed_files = old_options['__buildout_installed__']
+               installed_files = old_options['__buildout_installed__']
                try:
                    update = recipe.update
...
@@ -823,88 +1000,84 @@ class Buildout(DictMixin):
                        part)
                try:
-                   installed_files = self[part]._call(update)
+                   updated_files = self[part]._call(update)
-               except:
+               except Exception:
                    installed_parts.remove(part)
-                   self._uninstall(old_installed_files)
-                   if installed_exists:
-                       installed_part_options['buildout']['parts'] = (
-                           ' '.join(installed_parts))
+                   self._uninstall(installed_files)
+                   self._update_installed(
+                       parts=' '.join(installed_parts))
                    raise
-               old_installed_files = old_installed_files.split('\n')
-               if installed_files is None:
-                   installed_files = old_installed_files
-               else:
-                   if isinstance(installed_files, str):
-                       installed_files = [installed_files]
-                   else:
-                       installed_files = list(installed_files)
-                   need_to_save_installed = [
-                       p for p in installed_files
-                       if p not in old_installed_files]
-                   if need_to_save_installed:
-                       installed_files = (
-                           old_installed_files + need_to_save_installed)
+               installed_files = set(installed_files.split('\n')) \
+                   if installed_files else set()
+               if updated_files:
+                   (installed_files.add
+                    if isinstance(updated_files, str) else
+                    installed_files.update)(updated_files)
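The new branch above merges a recipe's `update()` result into a set of already-installed paths, dispatching between `set.add` (single string path) and `set.update` (iterable of paths) with a conditional expression. A small self-contained sketch of that merge (function name is hypothetical):

```python
def merge_updated(installed_files, updated_files):
    """Merge a recipe update() result into the set of installed paths.

    A str result is a single path (set.add); any other iterable is a
    collection of paths (set.update) - the same dispatch as the
    (add if isinstance(...) else update)(...) expression above.
    """
    files = set(installed_files.split('\n')) if installed_files else set()
    if updated_files:
        (files.add
         if isinstance(updated_files, str) else
         files.update)(updated_files)
    return files
```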
            else: # install
-               need_to_save_installed = True
                __doing__ = 'Installing %s.', part
                self._logger.info(*__doing__)
+               if self.dry_run:
+                   continue
                installed_files = self[part]._call(recipe.install)
                if installed_files is None:
                    self._logger.warning(
                        "The %s install returned None. A path or "
                        "iterable os paths should be returned.",
                        part)
-                   installed_files = ()
-               elif isinstance(installed_files, str):
-                   installed_files = [installed_files]
-               else:
-                   installed_files = list(installed_files)
+               elif installed_files:
+                   installed_files = ({installed_files}
+                                      if isinstance(installed_files, str)
+                                      else set(installed_files))

+           if installed_files:
+               conflicts = installed_files.intersection(all_installed_paths)
+               if conflicts:
+                   self._error(
+                       "The following paths are already"
+                       " installed by other sections: %r",
+                       {x: all_installed_paths[x] for x in conflicts})
+               all_installed_paths.update(
+                   dict.fromkeys(installed_files, part))
+               installed_files = '\n'.join(sorted(installed_files))
+           else:
+               installed_files = ''
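The conflict check above records, for every installed path, which part owns it (`dict.fromkeys(installed_files, part)`), and flags any intersection with previously recorded paths. A standalone sketch of that bookkeeping, with a hypothetical `check_conflicts` helper raising `ValueError` where buildout calls `self._error`:

```python
def check_conflicts(all_installed_paths, part, installed_files):
    """Record which part owns each installed path and report clashes.

    all_installed_paths maps path -> owning part; an intersection with
    previously recorded paths means two sections install the same file.
    """
    conflicts = set(installed_files).intersection(all_installed_paths)
    if conflicts:
        raise ValueError(
            "The following paths are already installed"
            " by other sections: %r"
            % {x: all_installed_paths[x] for x in conflicts})
    # dict.fromkeys maps every new path to the current part.
    all_installed_paths.update(dict.fromkeys(installed_files, part))
```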
+           installed_part_options[part] = saved_options
+           saved_options['__buildout_installed__'] = installed_files
-           saved_options['__buildout_installed__'] = (
-               '\n'.join(installed_files))
            saved_options['__buildout_signature__'] = signature
-           installed_part_options[part] = saved_options
-           installed_parts = [p for p in installed_parts if p != part]
+           if part not in installed_parts:
                installed_parts.append(part)
+           _check_for_unused_options_in_section(self, part)
-           if need_to_save_installed:
                installed_part_options['buildout']['parts'] = (
                    ' '.join(installed_parts))
-               self._save_installed_options(installed_part_options)
-               installed_exists = True
-           else:
-               assert installed_exists
-               self._update_installed(parts=' '.join(installed_parts))
-           _check_for_unused_options_in_section(self, part)

-       if installed_develop_eggs:
-           if not installed_exists:
-               self._save_installed_options(installed_part_options)
-       elif (not installed_parts) and installed_exists:
-           os.remove(self['buildout']['installed'])

-       if self.show_picked_versions or self.update_versions_file:
+       if self._log_level < logging.INFO:
            self._print_picked_versions()
+       self._save_installed_options()
        self._unload_extensions()
    def _update_installed(self, **buildout_options):
        installed = self['buildout']['installed']
        f = open(installed, 'a')
        f.write('\n[buildout]\n')
        for option, value in list(buildout_options.items()):
            _save_option(option, value, f)
        f.close()

    def _log_reinstall_reason(self, level, part,
                              old_options, new_options, missing):
        log = self._logger.log
        if missing:
            log(level, "Part %s, missing path: %s", part, missing)
            return
        for k in old_options:
            if k not in new_options:
                log(level, "Part %s, dropped option %s.", part, k)
            elif old_options[k] != new_options[k]:
                log(level, "Part %s, option %s changed: %r != %r",
                    part, k, new_options[k], old_options[k])
        for k in new_options:
            if k not in old_options:
                log(level, "Part %s, new option %s.", part, k)
    def _uninstall_part(self, part, installed_part_options):
        # uninstall part
        __doing__ = 'Uninstalling %s.', part
        self._logger.info(*__doing__)
+       if self.dry_run:
+           return
        # run uninstall recipe
        recipe, entry = _recipe(installed_part_options[part])
...
@@ -990,17 +1163,6 @@ class Buildout(DictMixin):
            self._logger.warning(
                "Unexpected entry, %r, in develop-eggs directory.", f)

-   def _compute_part_signatures(self, parts):
-       # Compute recipe signature and add to options
-       for part in parts:
-           options = self.get(part)
-           if options is None:
-               options = self[part] = {}
-           recipe, entry = _recipe(options)
-           req = pkg_resources.Requirement.parse(recipe)
-           sig = _dists_sig(pkg_resources.working_set.resolve([req]))
-           options['__buildout_signature__'] = ' '.join(sig)
    def _read_installed_part_options(self):
        old = self['buildout']['installed']
        if old and os.path.isfile(old):
...
@@ -1016,11 +1178,9 @@ class Buildout(DictMixin):
                    options[option] = value
                result[section] = self.Options(self, section, options)
-           return result
+           return result, True
        else:
-           return {'buildout':
-                   self.Options(self, 'buildout', {'parts': ''})}
+           return ({'buildout':
+                    self.Options(self, 'buildout', {'parts': ''})},
+                   False,
+                   )
    def _uninstall(self, installed):
        for f in installed.split('\n'):
...
@@ -1061,16 +1221,41 @@ class Buildout(DictMixin):
        return ' '.join(installed)

-   def _save_installed_options(self, installed_options):
-       installed = self['buildout']['installed']
-       if not installed:
-           return
-       f = open(installed, 'w')
-       _save_options('buildout', installed_options['buildout'], f)
-       for part in installed_options['buildout']['parts'].split():
-           print_(file=f)
-           _save_options(part, installed_options[part], f)
-       f.close()
+   def _save_installed_options(self):
+       if self.dry_run:
+           return
+       installed_path = self['buildout']['installed']
+       if not installed_path:
+           return
+       installed_part_options = self.installed_part_options
+       buildout = installed_part_options['buildout']
+       installed_parts = buildout['parts']
+       if installed_parts or buildout['installed_develop_eggs']:
+           new = StringIO()
+           _save_options('buildout', buildout, new)
+           for part in installed_parts.split():
+               new.write('\n')
+               _save_options(part, installed_part_options[part], new)
+           new = new.getvalue()
+           try:
+               with open(installed_path) as f:
+                   save = f.read(1 + len(new)) != new
+           except IOError as e:
+               if e.errno != errno.ENOENT:
+                   raise
+               save = True
+           if save:
+               installed_tmp = installed_path + ".tmp"
+               try:
+                   with open(installed_tmp, "w") as f:
+                       f.write(new)
+                       f.flush()
+                       os.fsync(f.fileno())
+                   os.rename(installed_tmp, installed_path)
+               finally:
+                   _remove_ignore_missing(installed_tmp)
+       else:
+           _remove_ignore_missing(installed_path)
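The new `_save_installed_options` rewrites `.installed.cfg` atomically: it compares the current file content first (reading one byte past the new length so a longer old file is detected), then writes a sibling `.tmp` file, flushes and fsyncs it, and renames it over the destination so readers never observe a half-written file. A self-contained sketch of that pattern (function name is hypothetical):

```python
import errno
import os

def save_atomically(path, new_content):
    """Rewrite `path` only if its content changed, via tmp-file + rename.

    Returns True if the file was (re)written, False if it was already
    up to date. os.rename on the same filesystem is atomic on POSIX.
    """
    try:
        with open(path) as f:
            # Read one extra byte so a longer existing file with the
            # same prefix is not mistaken for identical content.
            if f.read(1 + len(new_content)) == new_content:
                return False
    except IOError as e:
        if e.errno != errno.ENOENT:
            raise
    tmp = path + ".tmp"
    try:
        with open(tmp, "w") as f:
            f.write(new_content)
            f.flush()
            os.fsync(f.fileno())
        os.rename(tmp, path)
    finally:
        # Clean up the temp file if the rename did not happen.
        if os.path.exists(tmp):
            os.remove(tmp)
    return True
```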
    def _error(self, message, *args):
        raise zc.buildout.UserError(message % args)
...
@@ -1135,8 +1320,17 @@ class Buildout(DictMixin):
        if not self.newest:
            return
+       # Hack: propagate libnetworkcache soft dependency
+       # XXX just zc.buildout should suffice, then iter over projects in ws
+       specs = ['zc.buildout', 'setuptools', 'pip', 'wheel']
+       try:
+           import slapos.libnetworkcache
+           specs.append('slapos.libnetworkcache')
+       except ImportError:
+           pass
        ws = zc.buildout.easy_install.install(
-           ('zc.buildout', 'setuptools', 'pip', 'wheel'),
+           specs,
            self['buildout']['eggs-directory'],
            links=self['buildout'].get('find-links', '').split(),
            index=self['buildout'].get('index'),
...
@@ -1146,7 +1340,7 @@ class Buildout(DictMixin):
        upgraded = []
-       for project in 'zc.buildout', 'setuptools', 'pip', 'wheel':
+       for project in specs:
            req = pkg_resources.Requirement.parse(project)
            dist = ws.find(req)
            importlib.import_module(project)
...
@@ -1184,10 +1378,12 @@ class Buildout(DictMixin):
            # the new dist is different, so we've upgraded.
            # Update the scripts and return True
            options = self['buildout']
+           buildout_dir = options['directory']
            eggs_dir = options['eggs-directory']
            develop_eggs_dir = options['develop-eggs-directory']
            ws = zc.buildout.easy_install.sort_working_set(
                ws,
+               buildout_dir=buildout_dir,
                eggs_dir=eggs_dir,
                develop_eggs_dir=develop_eggs_dir
            )
...
@@ -1297,6 +1493,7 @@ class Buildout(DictMixin):
        os.write(fd, (zc.buildout.easy_install.runsetup_template % dict(
            setupdir=os.path.dirname(setup),
            setup=setup,
+           path_list=[],
            __file__=setup,
            )).encode())
        args = [sys.executable, tsetup] + args
...
@@ -1356,21 +1553,46 @@ class Buildout(DictMixin):
                v = v.replace(os.getcwd(), base_path)
            print_("%s =%s" % (k, v))

+   def initialize(self, options, reqs, entry):
+       recipe_class = _install_and_load(reqs, 'zc.buildout', entry, self)
+       try:
+           sig = self._signature_cache[reqs]
+       except KeyError:
+           req = pkg_resources.Requirement.parse(reqs)
+           sig = self._signature_cache[reqs] = sorted(set(
+               _dists_sig(pkg_resources.working_set.resolve([req]))))
+       self._initializing.append((options, sig))
+       try:
+           recipe = recipe_class(self, options.name, options)
+           options['__buildout_signature__']
+       finally:
+           del self._initializing[-1]
+       return recipe
    def __getitem__(self, section):
        __doing__ = 'Getting section %s.', section
        try:
-           return self._data[section]
+           options = self._data[section]
        except KeyError:
-           pass
            try:
                data = self._raw[section]
            except KeyError:
                raise MissingSection(section)
+           e = data.get('__unsupported_conditional_expression__')
+           if e:
+               raise e
            options = self.Options(self, section, data)
            self._data[section] = options
            options._initialize()
+       if self._initializing:
+           caller = self._initializing[-1][0]
+           if 'buildout' != section and not (
+                   section in caller.depends or
+                   # Do not only check the caller,
+                   # because of circular dependencies during substitutions.
+                   section in (x[0].name for x in self._initializing)):
+               caller.depends.add(section)
        return options
    def __setitem__(self, name, data):
...
@@ -1380,10 +1602,6 @@ class Buildout(DictMixin):
        self[name] # Add to parts

    def parse(self, data):
-       try:
-           from cStringIO import StringIO
-       except ImportError:
-           from io import StringIO
        import textwrap
        sections = zc.buildout.configparser.parse(
...
@@ -1409,9 +1627,16 @@ class Buildout(DictMixin):
    def __len__(self):
        return len(self._raw)

+_install_and_load_cache = {}

def _install_and_load(spec, group, entry, buildout):
    __doing__ = 'Loading recipe %r.', spec
+   key = spec, group, entry
+   try:
+       return _install_and_load_cache[key]
+   except KeyError:
+       pass
    try:
        req = pkg_resources.Requirement.parse(spec)
...
@@ -1438,8 +1663,9 @@ def _install_and_load(spec, group, entry, buildout):
            )
        __doing__ = 'Loading %s recipe entry %s:%s.', group, spec, entry
-       return pkg_resources.load_entry_point(
-           req.project_name, group, entry)
+       result = _install_and_load_cache[key] = \
+           pkg_resources.load_entry_point(
+               req.project_name, group, entry)
+       return result
    except Exception:
        v = sys.exc_info()[1]
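The change above memoizes `_install_and_load` by the `(spec, group, entry)` tuple, using the try/except-KeyError lookup idiom so repeated loads of the same recipe entry point hit the cache. A minimal standalone sketch (function name and `loader` parameter are hypothetical stand-ins for `pkg_resources.load_entry_point`):

```python
_cache = {}

def load_entry_point_cached(spec, group, entry, loader):
    """Cache loads by (spec, group, entry), as _install_and_load does
    above with _install_and_load_cache."""
    key = spec, group, entry
    try:
        return _cache[key]
    except KeyError:
        pass
    # Chained assignment stores the result in the cache and returns it.
    result = _cache[key] = loader(spec, group, entry)
    return result
```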
...
@@ -1457,6 +1683,7 @@ class Options(DictMixin):
        self._raw = data
        self._cooked = {}
        self._data = {}
+       self.depends = set()

    def _initialize(self):
        name = self.name
...
@@ -1465,6 +1692,8 @@ class Options(DictMixin):
        if '<' in self._raw:
            self._raw = self._do_extend_raw(name, self._raw, [])

+       default = self.buildout._default_requirement
        # force substitutions
        for k, v in sorted(self._raw.items()):
            if '${' in v:
...
@@ -1478,16 +1707,24 @@ class Options(DictMixin):
                self.buildout[dname]

        if self.get('recipe'):
-           self.initialize()
+           if default:
+               self.depends.add(default)
+           self.recipe = self.buildout.initialize(
+               self, *_recipe(self._data))
            self.buildout._parts.append(name)

    def initialize(self):
-       reqs, entry = _recipe(self._data)
-       buildout = self.buildout
-       recipe_class = _install_and_load(reqs, 'zc.buildout', entry, buildout)
-       name = self.name
-       self.recipe = recipe_class(buildout, name, self)
+       m = md5()
+       # _profile_base_location_ is ignored in signatures, so that two
+       # sections at different URLs can have same signature
+       _profile_base_location_ = self.get('_profile_base_location_')
+       # access values through .get() instead of .items() to detect
+       # unused keys
+       name = self.name
+       for key in sorted(self.keys()):
+           if key == '_profile_base_location_':
+               continue
+           value = self._data.get(key,
+               self._cooked.get(key, self._raw.get(key)))
+           if _profile_base_location_:
+               value = value.replace(
+                   _profile_base_location_, '${:_profile_base_location_}')
+           m.update(('%r\0%r\0' % (key, value)).encode())
+       self.items_signature = '%s:%s' % (name, m.hexdigest())
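The new `Options.initialize` hashes a section's options into a stable `name:hexdigest` signature: keys are visited in sorted order, `_profile_base_location_` itself is skipped, and occurrences of the base location inside values are replaced by a placeholder, so the same profile served from two different URLs hashes identically. A self-contained sketch of that logic (function name and parameters are hypothetical):

```python
from hashlib import md5

def items_signature(name, options, base_location=None):
    """Hash a section's options into 'name:hexdigest', ignoring the
    profile base location so mirrored profiles compare equal."""
    m = md5()
    for key in sorted(options):
        if key == '_profile_base_location_':
            continue
        value = options[key]
        if base_location:
            # Normalize absolute references back to the placeholder.
            value = value.replace(
                base_location, '${:_profile_base_location_}')
        m.update(('%r\0%r\0' % (key, value)).encode())
    return '%s:%s' % (name, m.hexdigest())
```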
    def _do_extend_raw(self, name, data, doing):
        if name == 'buildout':
...
@@ -1511,10 +1748,10 @@ class Options(DictMixin):
                raise zc.buildout.UserError("No section named %r" % iname)
                result.update(self._do_extend_raw(iname, raw, doing))
-           result = _annotate_section(result, "")
+           _annotate_section(result, "")
            data = _annotate_section(copy.deepcopy(data), "")
-           result = _update_section(result, data)
-           result = _unannotate_section(result)
+           _update_section(result, data)
+           _unannotate_section(result)
            result.pop('<', None)
            return result
        finally:
...
@@ -1523,11 +1760,25 @@ class Options(DictMixin):
    def _dosub(self, option, v):
        __doing__ = 'Getting option %s:%s.', self.name, option
        seen = [(self.name, option)]
-       v = '$$'.join([self._sub(s, seen) for s in v.split('$$')])
+       v = '$$'.join([self._sub(s, seen, last=False)
+                      for s in v.split('$$')])
        self._cooked[option] = v

-   def get(self, option, default=None, seen=None):
+   def get(self, *args, **kw):
+       v = self._get(*args, **kw)
+       if hasattr(v, 'startswith') and v.startswith(SERIALISED_VALUE_MAGIC):
+           v = loads(v)
+       return v

+   def _get(self, option, default=None, seen=None, last=True):
+       # TODO: raise instead of handling a default parameter,
+       # so that get() never tries to deserialize a default value
+       # (and then: move deserialization to __getitem__
+       # and make get() use __getitem__)
        try:
-           return self._data[option]
+           if last:
+               return self._data[option].replace('$${', '${')
+           else:
+               return self._data[option]
        except KeyError:
            pass
...
@@ -1536,6 +1787,16 @@ class Options(DictMixin):
        if v is None:
            v = self._raw.get(option)
            if v is None:
+               if option == '__buildout_signature__':
+                   buildout = self.buildout
+                   options, sig = buildout._initializing[-1]
+                   if options is self:
+                       self.depends = frozenset(self.depends)
+                       v = self._data[option] = ' '.join(sig + [
+                           buildout[dependency].items_signature
+                           for dependency in sorted(self.depends)])
+                       return v
+                   raise zc.buildout.UserError(
+                       "premature access to " + option)
                return default
        __doing__ = 'Getting option %s:%s.', self.name, option
...
@@ -1550,16 +1811,20 @@ class Options(DictMixin):
                    )
            else:
                seen.append(key)
-               v = '$$'.join([self._sub(s, seen) for s in v.split('$$')])
+               v = '$$'.join([self._sub(s, seen, last=False)
+                              for s in v.split('$$')])
                seen.pop()
        self._data[option] = v
+       if last:
+           return v.replace('$${', '${')
        return v
    _template_split = re.compile('([$]{[^}]*})').split
    _simple = re.compile('[-a-zA-Z0-9 ._]+$').match
    _valid = re.compile(r'\${[-a-zA-Z0-9 ._]*:[-a-zA-Z0-9 ._]+}$').match

-   def _sub(self, template, seen):
+   def _sub(self, template, seen, last=True):
        value = self._template_split(template)
        subs = []
        for ref in value[1::2]:
...
@@ -1587,7 +1852,14 @@ class Options(DictMixin):
            section, option = s
            if not section:
                section = self.name
-           v = self.buildout[section].get(option, None, seen)
+               options = self
+           else:
+               self.buildout._initializing.append((self,))
+               try:
+                   options = self.buildout[section]
+               finally:
+                   del self.buildout._initializing[-1]
+           v = options._get(option, None, seen, last=last)
            if v is None:
                if option == '_buildout_section_name_':
                    v = self.name
...
@@ -1600,20 +1872,17 @@ class Options(DictMixin):
        return ''.join([''.join(v) for v in zip(value[::2], subs)])

    def __getitem__(self, key):
-       try:
-           return self._data[key]
-       except KeyError:
-           pass
-       v = self.get(key)
-       if v is None:
+       v = self.get(key, _MARKER)
+       if v is _MARKER:
            raise MissingOption("Missing option: %s:%s" % (self.name, key))
        return v
    def __setitem__(self, option, value):
+       if not re.match(
+               zc.buildout.configparser.option_name_re + '$', option):
+           raise zc.buildout.UserError(
+               "Invalid option name %r" % (option,))
        if not isinstance(value, str):
-           raise TypeError('Option values must be strings', value)
-       self._data[option] = value
+           value = dumps(value)
+       self._data[option] = value.replace('${', '$${')

    def __delitem__(self, key):
        if key in self._raw:
...
@@ -1641,6 +1910,9 @@ class Options(DictMixin):
        result = copy.deepcopy(self._raw)
        result.update(self._cooked)
        result.update(self._data)
+       for key, value in result.items():
+           if value.startswith(SERIALISED_VALUE_MAGIC):
+               result[key] = loads(value)
        return result

    def _call(self, f):
...
@@ -1672,9 +1944,28 @@ class Options(DictMixin):
                self.name)
        return self._created

+   def barrier(self):
+       """Set self as a default requirement for not-yet processed parts
+
+       This method must be called if this part may alter the processing
+       of other parts in any way, like modifying environment variables.
+       In other words, it sets an implicit dependency for these parts.
+       """
+       buildout = self.buildout
+       if not buildout._initializing:
+           raise zc.buildout.UserError(
+               "Options.barrier() shall only be used during initialization")
+       buildout._default_requirement = self.name

    def __repr__(self):
        return repr(dict(self))

+   def __eq__(self, other):
+       try:
+           return sorted(self.items()) == sorted(other.items())
+       except Exception:
+           return super(Options, self).__eq__(other)

Buildout.Options = Options
_spacey_nl = re.compile('[ \t\r\f\v]*\n[ \t\r\f\v\n]*'
...
@@ -1707,6 +1998,8 @@ def _quote_spacey_nl(match):
    return result

def _save_option(option, value, f):
+   if not isinstance(value, str):
+       value = dumps(value)
    value = _spacey_nl.sub(_quote_spacey_nl, value)
    if value.startswith('\n\t'):
        value = '%(__buildout_space_n__)s' + value[2:]
...
@@ -1716,10 +2009,12 @@ def _save_option(option, value, f):

def _save_options(section, options, f):
    print_('[%s]' % section, file=f)
-   items = list(options.items())
-   items.sort()
-   for option, value in items:
-       _save_option(option, value, f)
+   try:
+       get_option = partial(options._get, last=False)
+   except AttributeError:
+       get_option = options.get
+   for option in sorted(options):
+       _save_option(option, get_option(option), f)

def _default_globals():
    """Return a mapping of default and precomputed expressions.
...
@@ -1801,103 +2096,172 @@ def _default_globals():
    return globals_defs
-variable_template_split = re.compile('([$]{[^}]*})').split

-def _open(base, filename, seen, download_options,
-          override, downloaded, user_defaults
-          ):
-   """Open a configuration file and return the result as a dictionary,
-
-   Recursively open other files based on buildout options found.
-   """
-   download_options = _update_section(download_options, override)
-   raw_download_options = _unannotate_section(download_options)
-   newest = bool_option(raw_download_options, 'newest', 'false')
-   fallback = newest and not (filename in downloaded)
-   extends_cache = raw_download_options.get('extends-cache')
-   if extends_cache and variable_template_split(extends_cache)[1::2]:
-       raise ValueError(
-           "extends-cache '%s' may not contain ${section:variable} to expand."
-           % extends_cache
-           )
-   download = zc.buildout.download.Download(
-       raw_download_options, cache=extends_cache,
-       fallback=fallback, hash_name=True)
-   is_temp = False
-   downloaded_filename = None
-   if _isurl(filename):
-       downloaded_filename, is_temp = download(filename)
-       fp = open(downloaded_filename)
-       base = filename[:filename.rfind('/')]
-   elif _isurl(base):
-       if os.path.isabs(filename):
-           fp = open(filename)
-           base = os.path.dirname(filename)
-       else:
-           filename = base + '/' + filename
-           downloaded_filename, is_temp = download(filename)
-           fp = open(downloaded_filename)
-           base = filename[:filename.rfind('/')]
-   else:
-       filename = os.path.join(base, filename)
-       fp = open(filename)
-       base = os.path.dirname(filename)
-   downloaded.add(filename)
-   if filename in seen:
-       if is_temp:
-           fp.close()
-           os.remove(downloaded_filename)
-       raise zc.buildout.UserError("Recursive file include", seen, filename)
-   root_config_file = not seen
-   seen.append(filename)
-   filename_for_logging = filename
-   if downloaded_filename:
-       filename_for_logging = '%s (downloaded as %s)' % (
-           filename, downloaded_filename)
-   result = zc.buildout.configparser.parse(
-       fp, filename_for_logging, _default_globals)
-   fp.close()
-   if is_temp:
-       os.remove(downloaded_filename)

+class _default_globals(dict):
+   """
+   Make sure parser context is computed at most once,
+   even if several files are parsed.
+   And compute some values only if accessed.
+
+   If __getitem__ raises, _doing() calls .get('__doing__'),
+   but that's not the only reason to subclass dict:
+   CPython requests it (for performance reasons?). PyPy does not care.
+   """
+   # XXX: The following line is only to keep access to the overridden global.
+   # If pushed upstream, proper naming would avoid such hack.
+   # Meanwhile, the patch consists only in this drop-in class
+   # and that's easier to maintain.
+   _default_globals = staticmethod(_default_globals)
+
+   def __getitem__(self, key):
+       cls = self.__class__
+       try:
+           context = self.context
+       except AttributeError:
+           context = self.context = cls._default_globals()
+           context['sys'] = _sysproxy(self)
+       try:
+           return context[key]
+       except KeyError as e:
+           try:
+               value = getattr(self, key)
+           except AttributeError:
+               pass
+           else:
+               value = context[key] = value()
+               return value
+           raise e # BBB: On Python 3, a bare 'raise' is enough.
+
+   def multiarch(self):
+       args = os.getenv('CC') or 'gcc', '-dumpmachine'
+       self['__doing__'] = '%r', args
+       m = subprocess.check_output(args, universal_newlines=True).rstrip()
+       del self['__doing__']
+       return m
+
+class _sysproxy(object):
+   # BBB: alternate/temporary way to get multiarch value
+
+   def __init__(self, default_globals):
+       self.__default_globals = default_globals
+
+   def __getattr__(self, name):
+       if name == '_multiarch':
+           default_globals = self.__default_globals
+           setattr(sys, name, getattr(default_globals, name[1:])())
+           default_globals.context['sys'] = sys
+       return getattr(sys, name)
variable_template_split
=
re
.
compile
(
'([$]{[^}]*})'
).
split
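The compute-once-then-cache pattern the new `_default_globals` code relies on (look the value up in a context dict, fall back to a same-named method, store the result) can be sketched standalone. This is a hypothetical `LazyContext` class for illustration only; `multiarch` here returns a canned string instead of shelling out to `gcc -dumpmachine`:

```python
class LazyContext(object):
    """Minimal sketch of lazy, cached value computation via a context dict."""

    def __init__(self):
        self.context = {}
        self.calls = 0  # counts how often the expensive computation runs

    def multiarch(self):
        # Stand-in for `subprocess.check_output(('gcc', '-dumpmachine'))`.
        self.calls += 1
        return 'x86_64-linux-gnu'

    def __getitem__(self, key):
        try:
            # Served from cache on every access after the first.
            return self.context[key]
        except KeyError:
            # Compute via the same-named method, then cache.
            value = self.context[key] = getattr(self, key)()
            return value

ctx = LazyContext()
assert ctx['multiarch'] == 'x86_64-linux-gnu'
assert ctx['multiarch'] == 'x86_64-linux-gnu'
assert ctx.calls == 1  # computed once, cached afterwards
```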
-    options = result.get('buildout', {})
-    extends = options.pop('extends', None)
-    if 'extended-by' in options:
-        raise zc.buildout.UserError(
-            'No-longer supported "extended-by" option found in %s.' %
-            filename)
-    result = _annotate(result, filename)
-    if root_config_file and 'buildout' in result:
-        download_options = _update_section(
-            download_options, result['buildout']
-        )
-    if extends:
-        extends = extends.split()
-        eresult, user_defaults = _open(
-            base, extends.pop(0), seen, download_options, override,
-            downloaded, user_defaults
-        )
-        for fname in extends:
-            next_extend, user_defaults = _open(
-                base, fname, seen, download_options, override,
-                downloaded, user_defaults
-            )
-            eresult = _update(eresult, next_extend)
-        result = _update(eresult, result)
-    else:
-        if user_defaults:
-            result = _update(user_defaults, result)
-            user_defaults = {}
-    seen.pop()
-    return result, user_defaults
+class _extends(object):
+
+    def __new__(cls, defaults, *args):
+        self = super(_extends, cls).__new__(cls)
+        self.seen = set()
+        self.processing = []
+        self.extends = [defaults]
+        self._download_options = []
+        self.collect(*args)
+        return self.merge()
+
+    def merge(self):
+        result = {}
+        for d in self.extends:
+            _update(result, d)
+        return result
+
+    def __getattr__(self, attr):
+        if attr == 'download_options':
+            # Compute processed options
+            result_so_far = self.merge()
+            self.extends[:] = [result_so_far]
+            value = copy.deepcopy(result_so_far.get('buildout')) or {}
+            # Update with currently-being-processed options
+            for options in reversed(self._download_options):
+                _update_section(value, options)
+            value = _unannotate_section(value)
+            setattr(self, attr, value)
+            return value
+        return self.__getattribute__(attr)
+
+    def collect(self, result, base, filename):
+        options = result.get('buildout', {})
+        extends = options.pop('extends', '')
+        # Sanitize buildout options
+        if 'extended-by' in options:
+            raise zc.buildout.UserError(
+                'No-longer supported "extended-by" option found in %s.' %
+                filename)
+        # Find and expose _profile_base_location_
+        for section in result.values():
+            for value in section.values():
+                if '${:_profile_base_location_}' in value:
+                    section['_profile_base_location_'] = base
+                    break
+        _annotate(result, filename)
+        # Collect extends and unprocessed download options
+        self.processing.append(filename)
+        self._download_options.append(options)
+        for fextends in extends.split():
+            self.open(base, fextends)
+        self.extends.append(result)
+        del self.processing[-1], self._download_options[-1]
+
+    def open(self, base, filename):
+        # Determine file location
+        if _isurl(filename):
+            download = True
+        elif _isurl(base):
+            download = True
+            filename = urljoin(base + '/', filename)
+        else:
+            download = False
+            filename = os.path.realpath(
+                os.path.join(base, os.path.expanduser(filename)))
+        # Detect repetitions and loops
+        if filename in self.seen:
+            if filename in self.processing:
+                raise zc.buildout.UserError("circular extends: %s" % filename)
+            return
+        self.seen.add(filename)
+        # Fetch file
+        is_temp = False
+        try:
+            if download:
+                download_options = self.download_options
+                extends_cache = download_options.get('extends-cache')
+                if extends_cache and variable_template_split(extends_cache)[1::2]:
+                    raise ValueError(
+                        "extends-cache '%s' may not contain"
+                        " ${section:variable} to expand." % extends_cache)
+                downloaded_filename, is_temp = zc.buildout.download.Download(
+                    download_options,
+                    cache=extends_cache,
+                    fallback=bool_option(download_options, 'newest'),
+                    hash_name=True)(filename)
+                filename_for_logging = '%s (downloaded as %s)' % (
+                    filename, downloaded_filename)
+                base = filename[:filename.rfind('/')]
+            else:
+                downloaded_filename = filename_for_logging = filename
+                base = os.path.dirname(filename)
+            with open(downloaded_filename) as fp:
+                result = zc.buildout.configparser.parse(
+                    fp, filename_for_logging, _default_globals)
+        finally:
+            if is_temp:
+                os.remove(downloaded_filename)
+        return self.collect(result, base, filename)

 ignore_directories = '.svn', 'CVS', '__pycache__', '.git'
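The repetition/loop detection in `_extends.open` above boils down to two containers: `seen` holds every file already opened, and `processing` holds the chain of files currently being expanded, so a file found in both is a cycle. A standalone sketch (hypothetical `check_extend` helper, not the buildout API):

```python
def check_extend(filename, seen, processing):
    """Return True if filename should be processed, False if already merged.

    Raise ValueError when filename is still on the current extends chain,
    i.e. the extends graph contains a cycle.
    """
    if filename in seen:
        if filename in processing:
            raise ValueError("circular extends: %s" % filename)
        return False  # repeated extend: merge only once
    seen.add(filename)
    return True

seen = set()
processing = ['root.cfg']
assert check_extend('software.cfg', seen, processing) is True
assert check_extend('software.cfg', seen, processing) is False  # skipped
try:
    # root.cfg extends (indirectly) itself while still being processed:
    check_extend('root.cfg', {'root.cfg'}, processing)
    raise AssertionError('expected a cycle error')
except ValueError as e:
    assert 'circular extends' in str(e)
```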
...
@@ -1946,58 +2310,57 @@ def _dists_sig(dists):
             continue
         seen.add(dist)
         location = dist.location
-        if dist.precedence == pkg_resources.DEVELOP_DIST:
+        if (dist.precedence == pkg_resources.DEVELOP_DIST
+                and location != zc.buildout.easy_install.python_lib
+                and not isinstance(dist, pkg_resources.DistInfoDistribution)):
             result.append(dist.project_name + '-' + _dir_hash(location))
         else:
-            result.append(os.path.basename(location))
+            result.append(dist.project_name + '-' + dist.version)
     return result
-def _update_section(in1, s2):
-    s1 = copy.deepcopy(in1)
-    # Base section 2 on section 1; section 1 is copied, with key-value pairs
-    # in section 2 overriding those in section 1. If there are += or -=
-    # operators in section 2, process these to add or subtract items (delimited
-    # by newlines) from the preexisting values.
-    # Sort on key, then on the addition or subtraction operator (+ comes first)
-    for k, v in sorted(s2.items(), key=lambda x: (x[0].rstrip(' +'), x[0][-1])):
-        if k.endswith('+'):
-            key = k.rstrip(' +')
-            value = s1.get(key, SectionKey("", "IMPLICIT_VALUE"))
-            value.addToValue(v.value, v.source)
-            s1[key] = value
-            del s2[k]
-        elif k.endswith('-'):
-            key = k.rstrip(' -')
-            value = s1.get(key, SectionKey("", "IMPLICIT_VALUE"))
-            value.removeFromValue(v.value, v.source)
-            s1[key] = value
-            del s2[k]
-    _update_verbose(s1, s2)
-    return s1
-
-
-def _update_verbose(s1, s2):
-    for key, v2 in s2.items():
-        if key in s1:
-            v1 = s1[key]
-            v1.overrideValue(v2)
-        else:
-            s1[key] = copy.deepcopy(v2)
-    return s1
+def _update_section(s1, s2):
+    # Base section 2 on section 1; key-value pairs in s2 override those in s1.
+    # If there are += or -= operators in s2, process these to add or subtract
+    # items (delimited by newlines) from the preexisting values.
+    # Sort on key, then on + and - operators, so that KEY < KEY + < KEY -, to
+    # process them in this order if several are defined in the same section.
+    # Section s1 is modified in place.
+    s2 = copy.deepcopy(s2) # avoid mutating the second argument, which is unexpected
+    keysort = lambda x: (x[0].rstrip(' +'), x[0].endswith('+'))
+    for k, v in sorted(s2.items(), key=keysort):
+        if k.endswith('+'):
+            key = k.rstrip(' +')
+            implicit_value = SectionKey("", "IMPLICIT_VALUE")
+            # Find v1 in s2 first; it may have been defined locally too.
+            section_key = s2.get(key, s1.get(key, implicit_value))
+            section_key = copy.deepcopy(section_key)
+            section_key.addToValue(v.value, v.source)
+            s2[key] = section_key
+            del s2[k]
+        elif k.endswith('-'):
+            key = k.rstrip(' -')
+            implicit_value = SectionKey("", "IMPLICIT_VALUE")
+            # Find v1 in s2 first; it may have been set by a += operation first
+            section_key = s2.get(key, s1.get(key, implicit_value))
+            section_key = copy.deepcopy(section_key)
+            section_key.removeFromValue(v.value, v.source)
+            s2[key] = section_key
+            del s2[k]
+        else:
+            if k in s1:
+                v1 = s1[k]
+                v1.overrideValue(v)
+            else:
+                s1[k] = v
+    return s1
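The `+=`/`-=` semantics that `_update_section` implements can be illustrated with plain newline-delimited strings instead of buildout's `SectionKey` objects. The `merge` and `rank` helpers below are hypothetical simplifications (the real code also tracks each value's source annotation); they only show the append/subtract behaviour and the plain < `+` < `-` processing order:

```python
def rank(k):
    # Process plain assignments first, then +=, then -=.
    return 2 if k.endswith('-') else 1 if k.endswith('+') else 0

def merge(base, override):
    """Merge override into a copy of base, honouring 'key +' / 'key -'."""
    result = dict(base)
    for k, v in sorted(override.items(),
                       key=lambda kv: (kv[0].rstrip(' +-'), rank(kv[0]))):
        if k.endswith('+'):
            key = k.rstrip(' +')
            items = result.get(key, '').split('\n') + v.split('\n')
            result[key] = '\n'.join(i for i in items if i)
        elif k.endswith('-'):
            key = k.rstrip(' -')
            removed = set(v.split('\n'))
            result[key] = '\n'.join(
                i for i in result.get(key, '').split('\n') if i not in removed)
        else:
            result[k] = v
    return result

base = {'eggs': 'zc.buildout\nsetuptools'}
override = {'eggs +': 'requests', 'eggs -': 'setuptools'}
assert merge(base, override)['eggs'] == 'zc.buildout\nrequests'
```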
-def _update(in1, d2):
-    d1 = copy.deepcopy(in1)
+def _update(d1, d2):
     for section in d2:
         if section in d1:
-            d1[section] = _update_section(d1[section], d2[section])
+            _update_section(d1[section], d2[section])
         else:
-            d1[section] = copy.deepcopy(d2[section])
+            # XXX: In order to process += (and -=) correctly when
+            # <key> = <value> and <key> += <value> are in the same section
+            # _update_section should be called even when section is not in d1
+            # Hack: When <= is used in the section, _update_section will be
+            # called later anyway, so we can avoid calling it now which will
+            # enable brittle and partial support for += (or -=) with keys that
+            # come from the <= sections.
+            # TODO: Either implement += and -= support with <= fully or call
+            # _update_section systematically and give up <= compatibility.
+            s2 = d2[section]
+            d1[section] = s2 if '<' in s2 else _update_section({}, s2)
     return d1
 def _recipe(options):
...
@@ -2095,6 +2458,12 @@ Options:

   Print buildout version number and exit.

+--dry-run
+
+  Dry-run mode.  With this setting, buildout will display what will
+  be uninstalled and what will be installed without doing anything
+  in reality.
+
 Assignments are of the form: section:option=value and are used to
 provide configuration options that override those given in the
 configuration file.  For example, to run the buildout in offline mode,
...
@@ -2214,13 +2583,16 @@ def main(args=None):
                         _error("No timeout value specified for option", orig_op)
                     except ValueError:
                         _error("Timeout value must be numeric", orig_op)
+                elif orig_op == '--dry-run':
+                    options.append(('buildout', 'dry-run', 'true'))
                 elif op:
                     if orig_op == '--help':
                         _help()
                     elif orig_op == '--version':
                         _version()
                     else:
-                        _error("Invalid option", '-' + op[0])
+                        _error("Invalid option", orig_op)
         elif '=' in args[0]:
             option, value = args.pop(0).split('=', 1)
             option = option.split(':')
...
src/zc/buildout/configparser.py
...
@@ -113,10 +113,12 @@ section_header = re.compile(
     r'([#;].*)?$)'
     ).match

+option_name_re = r'[^\s{}[\]=:]+'
 option_start = re.compile(
-    r'(?P<name>[^\s{}[\]=:]+\s*[-+]?)'
+    r'(?P<name>%s\s*[-+]?)'
     r'='
-    r'(?P<value>.*)$').match
+    r'(?P<value>.*)$' % option_name_re).match

 leading_blank_lines = re.compile(r"^(\s*\n)+")
...
@@ -201,7 +203,12 @@ def parse(fp, fpname, exp_globals=dict):
                     if not context:
                         context = exp_globals()
                     # evaluated expression is in list: get first element
-                    section_condition = eval(expr, context)[0]
+                    try:
+                        section_condition = eval(expr, context)[0]
+                    except NameError as x:
+                        sections.setdefault(sectname, {})[
+                            '__unsupported_conditional_expression__'] = x
+                        continue
                     # finally, ignore section when an expression
                     # evaluates to false
                     if not section_condition:
...
@@ -255,6 +262,8 @@ def parse(fp, fpname, exp_globals=dict):
             section = sections[sectname]
             for name in section:
                 value = section[name]
+                if isinstance(value, NameError):
+                    continue
                 if value[:1].isspace():
                     section[name] = leading_blank_lines.sub(
                         '', textwrap.dedent(value.rstrip()))
...
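The factored-out `option_name_re` pattern can be exercised directly. This check simply rebuilds the `option_start` matcher exactly as shown in the hunk above and confirms that it captures plain, `+=` and `-=` option names:

```python
import re

# Same pattern as in src/zc/buildout/configparser.py after this change:
option_name_re = r'[^\s{}[\]=:]+'
option_start = re.compile(
    r'(?P<name>%s\s*[-+]?)'
    r'='
    r'(?P<value>.*)$' % option_name_re).match

m = option_start('parts = app')
assert m.group('name').strip() == 'parts'
assert m.group('value').strip() == 'app'

m = option_start('eggs += requests')
assert m.group('name').strip() == 'eggs +'   # '+' kept with the name
assert m.group('value').strip() == 'requests'

m = option_start('eggs -= setuptools')
assert m.group('name').strip() == 'eggs -'
assert option_start('[section]') is None     # not an option line
```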
...
src/zc/buildout/download.py
...
@@ -20,44 +20,47 @@
 try:
     # Python 3
-    from urllib.request import urlretrieve
-    from urllib.parse import urlparse
+    from urllib.error import HTTPError
+    from urllib.request import Request, urlopen
+    from urllib.parse import urlparse, urlunparse
 except ImportError:
     # Python 2
-    import base64
     from urlparse import urlparse
     from urlparse import urlunparse
-    import urllib2
-
-    def urlretrieve(url, tmp_path):
-        """Work around Python issue 24599 including basic auth support
-        """
-        scheme, netloc, path, params, query, frag = urlparse(url)
-        auth, host = urllib2.splituser(netloc)
-        if auth:
-            url = urlunparse((scheme, host, path, params, query, frag))
-            req = urllib2.Request(url)
-            base64string = base64.encodestring(auth)[:-1]
-            basic = "Basic " + base64string
-            req.add_header("Authorization", basic)
-        else:
-            req = urllib2.Request(url)
-        url_obj = urllib2.urlopen(req)
-        with open(tmp_path, 'wb') as fp:
-            fp.write(url_obj.read())
-        return tmp_path, url_obj.info()
+    from urllib2 import HTTPError, Request, urlopen

 from zc.buildout.easy_install import realpath
+from base64 import b64encode
+from contextlib import closing
+import errno
 import logging
+import netrc
 import os
 import os.path
 import re
 import shutil
-import sys
 import tempfile
 import zc.buildout
+from . import bytes2str, str2bytes
+from .rmtree import rmtree
+
+
+class netrc(netrc.netrc):
+
+    def __init__(*args):
+        pass
+
+    def authenticators(self, host):
+        self.__class__, = self.__class__.__bases__
+        try:
+            self.__init__()
+        except IOError as e:
+            if e.errno != errno.ENOENT:
+                raise
+            self.__init__(os.devnull)
+        return self.authenticators(host)
+
+netrc = netrc()


 class ChecksumError(zc.buildout.UserError):
     pass
...
@@ -74,7 +77,8 @@ class Download(object):
     cache: path to the download cache (excluding namespaces)
     namespace: namespace directory to use inside the cache
     offline: whether to operate in offline mode
-    fallback: whether to use the cache as a fallback (try downloading first)
+    fallback: whether to use the cache as a fallback (try downloading first),
+              when an MD5 checksum is not given
     hash_name: whether to use a hash of the URL as cache file name
     logger: an optional logger to receive download-related log messages
...
@@ -107,7 +111,8 @@ class Download(object):
         if self.download_cache is not None:
             return os.path.join(self.download_cache, self.namespace or '')

-    def __call__(self, url, md5sum=None, path=None):
+    @property
+    def __call__(self):
         """Download a file according to the utility's configuration.

         url: URL to download
...
@@ -117,19 +122,14 @@ class Download(object):
         Returns the path to the downloaded file.
         """
-        if self.cache:
-            local_path, is_temp = self.download_cached(url, md5sum)
-        else:
-            local_path, is_temp = self.download(url, md5sum, path)
-        return locate_at(local_path, path), is_temp
+        return self.download_cached if self.cache else self.download

-    def download_cached(self, url, md5sum=None):
+    def download_cached(self, url, md5sum=None, path=None, alternate_url=None):
         """Download a file from a URL using the cache.

-        This method assumes that the cache has been configured. Optionally, it
-        raises a ChecksumError if a cached copy of a file has an MD5 mismatch,
-        but will not remove the copy in that case.
+        This method assumes that the cache has been configured.
+        If a cached copy of a file has an MD5 mismatch, download
+        and update the cache on success.
         """
         if not os.path.exists(self.download_cache):
...
@@ -146,28 +146,45 @@ class Download(object):
         self.logger.debug('Searching cache at %s' % cache_dir)
         if os.path.exists(cached_path):
-            is_temp = False
-            if self.fallback:
-                try:
-                    _, is_temp = self.download(url, md5sum, cached_path)
-                except ChecksumError:
-                    raise
-                except Exception:
-                    pass
-            if not check_md5sum(cached_path, md5sum):
-                raise ChecksumError(
-                    'MD5 checksum mismatch for cached download '
-                    'from %r at %r' % (url, cached_path))
-            self.logger.debug('Using cache file %s' % cached_path)
+            if check_md5sum(cached_path, md5sum):
+                if md5sum or not self.fallback:
+                    self.logger.debug('Using cache file %s', cached_path)
+                    return locate_at(cached_path, path), False
+            else:
+                self.logger.warning(
+                    'MD5 checksum mismatch for cached download from %r at %r',
+                    url, cached_path)
+            # Don't download directly to cached_path to minimize
+            # the probability to alter old data if download fails.
+            try:
+                path, is_temp = self.download(url, md5sum, path, alternate_url)
+            except ChecksumError:
+                raise
+            except Exception:
+                if md5sum:
+                    raise
+                self.logger.debug("Fallback to cache using %s",
+                                  cached_path, exception=1)
+            else:
+                samefile = getattr(os.path, 'samefile', None)
+                if not (samefile and samefile(path, cached_path)):
+                    # update cache
+                    try:
+                        os.remove(cached_path)
+                    except OSError as e:
+                        if e.errno != errno.EISDIR:
+                            raise
+                        rmtree(cached_path)
+                    locate_at(path, cached_path)
+                return path, is_temp
         else:
             self.logger.debug('Cache miss; will cache %s as %s' %
                               (url, cached_path))
-            _, is_temp = self.download(url, md5sum, cached_path)
-        return cached_path, is_temp
+            self.download(url, md5sum, cached_path, alternate_url)
+        return locate_at(cached_path, path), False

-    def download(self, url, md5sum=None, path=None):
+    def download(self, url, md5sum=None, path=None, alternate_url=None):
         """Download a file from a URL to a given or temporary path.

         An online resource is always downloaded to a temporary file and moved
...
@@ -196,27 +213,38 @@ class Download(object):
                 "Couldn't download %r in offline mode." % url)

         self.logger.info('Downloading %s' % url)
-        download_url = url
-        if not path:
-            handle, tmp_path = tempfile.mkstemp(prefix='buildout-')
-            os.close(handle)
+        tmp_path = path
+        cleanup = True
         try:
-            try:
-                tmp_path, headers = urlretrieve(url, tmp_path)
-                if not check_md5sum(tmp_path, md5sum):
-                    raise ChecksumError(
-                        'MD5 checksum mismatch downloading %r' % url)
-            except IOError:
-                e = sys.exc_info()[1]
-                os.remove(tmp_path)
-                raise zc.buildout.UserError("Error downloading extends for URL "
-                                            "%s: %s" % (url, e))
-            except Exception:
-                os.remove(tmp_path)
-                raise
-        if path:
-            shutil.move(tmp_path, path)
-            return path, False
-        else:
-            return tmp_path, True
+            if not path:
+                handle, tmp_path = tempfile.mkstemp(prefix='buildout-')
+                os.close(handle)
+            self._download(url, tmp_path, md5sum, alternate_url)
+            cleanup = False
+        finally:
+            if cleanup and tmp_path:
+                remove(tmp_path)
+        return tmp_path, not path
+
+    def _download(self, url, path, md5sum=None, alternate_url=None):
+        download_url = url
+        try:
+            try:
+                self.urlretrieve(url, path)
+            except HTTPError:
+                if not alternate_url:
+                    raise
+                self.logger.info('using alternate URL: %s', alternate_url)
+                download_url = alternate_url
+                self.urlretrieve(alternate_url, path)
+            if not check_md5sum(path, md5sum):
+                raise ChecksumError('MD5 checksum mismatch downloading %r'
+                                    % download_url)
+        except IOError as e:
+            raise zc.buildout.UserError("Error downloading %s: %s"
+                                        % (download_url, e))

     def filename(self, url):
         """Determine a file name from a URL according to the configuration.
...
@@ -245,6 +273,60 @@ class Download(object):
             url_host, url_port = parsed[-2:]
             return '%s:%s' % (url_host, url_port)

+    def _auth(self, url):
+        parsed_url = urlparse(url)
+        if parsed_url.scheme in ('http', 'https'):
+            auth_host = parsed_url.netloc.rsplit('@', 1)
+            if len(auth_host) > 1:
+                return (auth_host[0],
+                        parsed_url._replace(netloc=auth_host[1]).geturl())
+            auth = netrc.authenticators(parsed_url.hostname)
+            if auth:
+                return '{0}:{2}'.format(*auth), url
+
+    def urlretrieve(self, url, tmp_path):
+        auth = self._auth(url)
+        if auth:
+            req = Request(auth[1])
+            req.add_header("Authorization",
+                           "Basic " + bytes2str(b64encode(str2bytes(auth[0]))))
+        else:
+            req = url
+        with closing(urlopen(req)) as src:
+            with open(tmp_path, 'wb') as dst:
+                shutil.copyfileobj(src, dst)
+        return tmp_path, src.info()
+
+
+class Download(Download):
+
+    def _download(self, url, path, md5sum=None, alternate_url=None):
+        from .buildout import networkcache_client as nc
+        while nc: # not a loop
+            if self._auth(url):
+                # do not cache restricted data
+                nc = None
+                break
+            key = 'file-urlmd5:' + md5(url.encode()).hexdigest()
+            if not nc.tryDownload(key):
+                break
+            with nc:
+                entry = next(nc.select(key, {'url': url}), None)
+                if entry is None:
+                    err = 'no matching entry'
+                else:
+                    with closing(nc.download(entry['sha512'])) as src, \
+                         open(path, 'wb') as dst:
+                        shutil.copyfileobj(src, dst)
+                    if check_md5sum(path, md5sum):
+                        return
+                    err = 'MD5 checksum mismatch'
+                self.logger.info('Cannot download from network cache: %s', err)
+            break
+        super(Download, self)._download(url, path, md5sum, alternate_url)
+        if nc and nc.tryUpload(key):
+            with nc, open(path, 'rb') as f:
+                nc.upload(f, key, url=url)
+

 def check_md5sum(path, md5sum):
     """Tell whether the MD5 checksum of the file at path matches.
...
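The credential handling added in `_auth`/`urlretrieve` above, moving a `user:password` pair out of the URL netloc and into a Basic `Authorization` header, can be sketched standalone (hypothetical `auth_from_url` helper; the real code also falls back to `~/.netrc` lookups):

```python
from base64 import b64encode
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse      # Python 2

def auth_from_url(url):
    """Split credentials out of a URL.

    Returns (clean_url, authorization_header) where the header is None
    when the URL carries no credentials.
    """
    parsed = urlparse(url)
    auth_host = parsed.netloc.rsplit('@', 1)
    if len(auth_host) > 1:
        cred, host = auth_host
        clean = parsed._replace(netloc=host).geturl()
        header = "Basic " + b64encode(cred.encode()).decode()
        return clean, header
    return url, None

clean, header = auth_from_url('https://user:secret@example.com/buildout.cfg')
assert clean == 'https://example.com/buildout.cfg'
assert header == 'Basic ' + b64encode(b'user:secret').decode()
assert auth_from_url('https://example.com/x')[1] is None
```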
src/zc/buildout/easy_install.py
View file @
3b16f5af
...
@@ -18,8 +18,10 @@ It doesn't install scripts. It uses setuptools and requires it to be
...
@@ -18,8 +18,10 @@ It doesn't install scripts. It uses setuptools and requires it to be
installed.
installed.
"""
"""
import
atexit
import
copy
import
copy
import
distutils.errors
import
distutils.errors
import
distutils.sysconfig
import
errno
import
errno
import
glob
import
glob
import
logging
import
logging
...
@@ -33,6 +35,7 @@ import setuptools.command.easy_install
...
@@ -33,6 +35,7 @@ import setuptools.command.easy_install
import
setuptools.command.setopt
import
setuptools.command.setopt
import
setuptools.package_index
import
setuptools.package_index
import
shutil
import
shutil
import
stat
import
subprocess
import
subprocess
import
sys
import
sys
import
tempfile
import
tempfile
...
@@ -41,6 +44,8 @@ import zc.buildout.rmtree
...
@@ -41,6 +44,8 @@ import zc.buildout.rmtree
from
zc.buildout
import
WINDOWS
from
zc.buildout
import
WINDOWS
from
zc.buildout
import
PY3
from
zc.buildout
import
PY3
import
warnings
import
warnings
from
contextlib
import
closing
from
setuptools.package_index
import
distros_for_location
,
URL_SCHEME
import
csv
import
csv
try
:
try
:
...
@@ -56,6 +61,8 @@ except ImportError:
...
@@ -56,6 +61,8 @@ except ImportError:
BIN_SCRIPTS
=
'Scripts'
if
WINDOWS
else
'bin'
BIN_SCRIPTS
=
'Scripts'
if
WINDOWS
else
'bin'
WHL_DIST
=
pkg_resources
.
EGG_DIST
+
1
warnings
.
filterwarnings
(
warnings
.
filterwarnings
(
'ignore'
,
'.+is being parsed as a legacy, non PEP 440, version'
)
'ignore'
,
'.+is being parsed as a legacy, non PEP 440, version'
)
...
@@ -77,6 +84,9 @@ is_source_encoding_line = re.compile(r'coding[:=]\s*([-\w.]+)').search
...
@@ -77,6 +84,9 @@ is_source_encoding_line = re.compile(r'coding[:=]\s*([-\w.]+)').search
is_win32
=
sys
.
platform
==
'win32'
is_win32
=
sys
.
platform
==
'win32'
is_jython
=
sys
.
platform
.
startswith
(
'java'
)
is_jython
=
sys
.
platform
.
startswith
(
'java'
)
PATCH_MARKER
=
'SlapOSPatched'
orig_versions_re
=
re
.
compile
(
r'[+\
-]%s
\d+'
%
PATCH_MARKER
)
if
is_jython
:
if
is_jython
:
import
java.lang.System
import
java.lang.System
jython_os_name
=
(
java
.
lang
.
System
.
getProperties
()[
'os.name'
]).
lower
()
jython_os_name
=
(
java
.
lang
.
System
.
getProperties
()[
'os.name'
]).
lower
()
...
@@ -91,17 +101,24 @@ if has_distribute and not has_setuptools:
...
@@ -91,17 +101,24 @@ if has_distribute and not has_setuptools:
sys
.
exit
(
"zc.buildout 3 needs setuptools, not distribute."
sys
.
exit
(
"zc.buildout 3 needs setuptools, not distribute."
"Did you properly install with pip in a virtualenv ?"
)
"Did you properly install with pip in a virtualenv ?"
)
# Include buildout and setuptools eggs in paths. We get this
# XXX Take care to respect the sys.path order, as otherwise other
# initially from the entire working set. Later, we'll use the install
# distributions for pip, wheel and setuptools may take precedence
# function to narrow to just the buildout and setuptools paths.
# over the ones currently running.
buildout_and_setuptools_path
=
sorted
({
d
.
location
for
d
in
pkg_resources
.
working_set
})
pip_path
=
setuptools_path
=
[
setuptools_path
=
buildout_and_setuptools_path
dist
.
location
pip_path
=
buildout_and_setuptools_path
for
dist
in
pkg_resources
.
working_set
logger
.
debug
(
'before restricting versions: pip_path %r'
,
pip_path
)
if
dist
.
project_name
in
(
'pip'
,
'wheel'
,
'setuptools'
)
]
pip_pythonpath
=
setuptools_pythonpath
=
os
.
pathsep
.
join
(
pip_path
)
python_lib
=
distutils
.
sysconfig
.
get_python_lib
()
FILE_SCHEME
=
re
.
compile
(
'file://'
,
re
.
I
).
match
FILE_SCHEME
=
re
.
compile
(
'file://'
,
re
.
I
).
match
DUNDER_FILE_PATTERN
=
re
.
compile
(
r"__file__ = '(?P<filename>.+)'$"
)
DUNDER_FILE_PATTERN
=
re
.
compile
(
r"__file__ = '(?P<filename>.+)'$"
)
networkcache_key
=
'pypi:{}={}'
.
format
class
_Monkey
(
object
):
class
_Monkey
(
object
):
def
__init__
(
self
,
module
,
**
kw
):
def
__init__
(
self
,
module
,
**
kw
):
mdict
=
self
.
_mdict
=
module
.
__dict__
mdict
=
self
.
_mdict
=
module
.
__dict__
...
@@ -140,7 +157,7 @@ class AllowHostsPackageIndex(setuptools.package_index.PackageIndex):
...
@@ -140,7 +157,7 @@ class AllowHostsPackageIndex(setuptools.package_index.PackageIndex):
_indexes
=
{}
_indexes
=
{}
def
_get_index
(
index_url
,
find_links
,
allow_hosts
=
(
'*'
,)):
def
_get_index
(
index_url
,
find_links
,
allow_hosts
=
(
'*'
,)):
key
=
index_url
,
tuple
(
find_links
)
key
=
index_url
,
tuple
(
find_links
)
,
allow_hosts
index
=
_indexes
.
get
(
key
)
index
=
_indexes
.
get
(
key
)
if
index
is
not
None
:
if
index
is
not
None
:
return
index
return
index
...
@@ -165,7 +182,12 @@ if is_win32:
...
@@ -165,7 +182,12 @@ if is_win32:
def
_safe_arg
(
arg
):
def
_safe_arg
(
arg
):
return
'"%s"'
%
arg
return
'"%s"'
%
arg
else
:
else
:
_safe_arg
=
str
def
_safe_arg
(
arg
):
if
len
(
arg
)
<
126
:
return
arg
else
:
# Workaround for the shebang line length limitation.
return
'/bin/sh
\
n
"exec" "%s" "$0" "$@"'
%
arg
def
call_subprocess
(
args
,
**
kw
):
def
call_subprocess
(
args
,
**
kw
):
if
subprocess
.
call
(
args
,
**
kw
)
!=
0
:
if
subprocess
.
call
(
args
,
**
kw
)
!=
0
:
...
@@ -229,6 +251,10 @@ def dist_needs_pkg_resources(dist):
...
@@ -229,6 +251,10 @@ def dist_needs_pkg_resources(dist):
)
)
_doing_list
=
type
(
''
,
(),
{
'__mod__'
:
staticmethod
(
lambda
x
:
'
\
n
'
.
join
(
*
x
))})()
class
Installer
(
object
):
class
Installer
(
object
):
_versions
=
{}
_versions
=
{}
...
@@ -241,6 +267,7 @@ class Installer(object):
...
@@ -241,6 +267,7 @@ class Installer(object):
_allow_picked_versions
=
True
_allow_picked_versions
=
True
_store_required_by
=
False
_store_required_by
=
False
_allow_unknown_extras
=
False
_allow_unknown_extras
=
False
_extra_paths
=
[]
def
__init__
(
self
,
def
__init__
(
self
,
dest
=
None
,
dest
=
None
,
...
@@ -275,7 +302,7 @@ class Installer(object):
...
@@ -275,7 +302,7 @@ class Installer(object):
links
.
insert
(
0
,
self
.
_download_cache
)
links
.
insert
(
0
,
self
.
_download_cache
)
self
.
_index_url
=
index
self
.
_index_url
=
index
path
=
(
path
and
path
[:]
or
[])
+
buildout_and_setuptools_path
path
=
(
path
and
path
[:]
or
[])
+
self
.
_extra_paths
self
.
_path
=
path
self
.
_path
=
path
if
self
.
_dest
is
None
:
if
self
.
_dest
is
None
:
newest
=
False
newest
=
False
...
@@ -289,36 +316,15 @@ class Installer(object):
...
@@ -289,36 +316,15 @@ class Installer(object):
self
.
_versions
=
normalize_versions
(
versions
)
self
.
_versions
=
normalize_versions
(
versions
)
def
_make_env
(
self
):
def
_make_env
(
self
):
full_path
=
self
.
_get_dest_dist_paths
()
+
self
.
_path
dest
=
self
.
_dest
env
=
pkg_resources
.
Environment
(
full_path
)
full_path
=
[]
if
dest
is
None
else
[
dest
]
# this needs to be called whenever self._env is modified (or we could
full_path
.
extend
(
self
.
_path
)
# make an Environment subclass):
return
pkg_resources
.
Environment
(
full_path
)
self
.
_eggify_env_dest_dists
(
env
,
self
.
_dest
)
return
env
def
_env_rescan_dest
(
self
):
def
_env_rescan_dest
(
self
):
self
.
_env
.
scan
(
self
.
_get_dest_dist_paths
())
self
.
_eggify_env_dest_dists
(
self
.
_env
,
self
.
_dest
)
def
_get_dest_dist_paths
(
self
):
dest
=
self
.
_dest
dest
=
self
.
_dest
if
dest
is
None
:
if
dest
is
not
None
:
return
[]
self
.
_env
.
scan
([
dest
])
eggs
=
glob
.
glob
(
os
.
path
.
join
(
dest
,
'*.egg'
))
dists
=
[
os
.
path
.
dirname
(
dist_info
)
for
dist_info
in
glob
.
glob
(
os
.
path
.
join
(
dest
,
'*'
,
'*.dist-info'
))]
return
list
(
set
(
eggs
+
dists
))
@
staticmethod
def
_eggify_env_dest_dists
(
env
,
dest
):
"""
Make sure everything found under `dest` is seen as an egg, even if it's
some other kind of dist.
"""
for
project_name
in
env
:
for
dist
in
env
[
project_name
]:
if
os
.
path
.
dirname
(
dist
.
location
)
==
dest
:
dist
.
precedence
=
pkg_resources
.
EGG_DIST
def
_version_conflict_information
(
self
,
name
):
def
_version_conflict_information
(
self
,
name
):
"""Return textual requirements/constraint information for debug purposes
"""Return textual requirements/constraint information for debug purposes
@@ -423,11 +429,11 @@ class Installer(object):
                 str(req))
         return best_we_have, None

-    def _call_pip_install(self, spec, dest, dist):
+    def _call_pip_wheel(self, spec, dest, dist):
         tmp = tempfile.mkdtemp(dir=dest)
         try:
-            paths = call_pip_install(spec, tmp)
+            paths = call_pip_wheel(spec, tmp, self)

             dists = []
             env = pkg_resources.Environment(paths)
@@ -459,30 +465,78 @@ class Installer(object):
             result = []
             for d in dists:
-                result.append(_move_to_eggs_dir_and_compile(d, dest))
+                result.append(_move_to_eggs_dir_and_compile(d, dest, self))
             return result
         finally:
             zc.buildout.rmtree.rmtree(tmp)
-    def _obtain(self, requirement, source=None):
-        # initialize out index for this project:
+    def _obtain(self, requirement, source=None, networkcache_failed=False):
+        # get the non-patched version
+        req = str(requirement)
+        if PATCH_MARKER in req:
+            requirement = pkg_resources.Requirement.parse(
+                re.sub(orig_versions_re, '', req))
+        wheel = getattr(requirement, 'wheel', False)
+        def filter_precedence(dist):
+            return (dist.precedence == WHL_DIST) == wheel and (
+                dist.precedence == pkg_resources.SOURCE_DIST if source else
+                not (dist.precedence == pkg_resources.DEVELOP_DIST and
+                     {'setup.py', 'pyproject.toml'}.isdisjoint(
+                         os.listdir(dist.location))))
         index = self._index
+        if not networkcache_failed:
+            try:
+                (operator, version,), = requirement.specs
+            except ValueError:
+                pass
+            else:
+                # Network cache is not expected to contain all versions so it
+                # couldn't tell whether a found version is the best existing
+                # one. Therefore, it's only accessed when we have a
+                # specification for a single version, which is anyway enough
+                # for our usage (picked versions not allowed).
+                if operator == '==':
+                    # But first, avoid any network access by checking local
+                    # urls. PackageIndex.add_find_links scans them immediately.
+                    dists = [dist for dist in index[requirement.project_name]
+                             if dist in requirement
+                                and filter_precedence(dist)
+                                and (FILE_SCHEME(dist.location) or
+                                     not URL_SCHEME(dist.location))]
+                    if dists:
+                        return max(dists)
+                    from .buildout import networkcache_client as nc
+                    if nc:
+                        key = networkcache_key(requirement.key, version)
+                        if nc.tryDownload(key):
+                            with nc:
+                                for entry in nc.select(key):
+                                    basename = entry['basename']
+                                    for dist in distros_for_location(
+                                            entry['sha512'], basename):
+                                        # The version comparison is to keep
+                                        # the one that's correctly parsed by
+                                        # distros_for_location.
+                                        if (dist.version == version
+                                                and self._env.can_add(dist)
+                                                and filter_precedence(dist)):
+                                            dist.networkcache = (
+                                                basename, requirement, source)
+                                            dists.append(dist)
+                            if dists:
+                                return max(dists)
+        # initialize out index for this project:
         if index.obtain(requirement) is None:
             # Nothing is available.
             return None

         # Filter the available dists for the requirement and source flag
         dists = [dist for dist in index[requirement.project_name]
-                 if ((dist in requirement)
-                     and
-                     ((not source) or
-                      (dist.precedence == pkg_resources.SOURCE_DIST)
-                      )
-                     )
-                 ]
+                 if dist in requirement and filter_precedence(dist)]

         # If we prefer final dists, filter for final and use the
         # result if it is non empty.
@@ -519,10 +573,26 @@ class Installer(object):
                 ):
                     return dist
-        best.sort()
-        return best[-1]
+        return max(best)
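A reviewer's note on the `best.sort(); return best[-1]` → `return max(best)` change above: for any totally ordered collection the two are equivalent, but `max` avoids mutating the list. A minimal sketch with plain tuples standing in for parsed versions (illustration only, not the real dist objects):

```python
# Toy stand-ins for comparable dist objects: max() picks the same
# element that sort-then-take-last would, without reordering the list.
best = [(1, 2), (2, 0), (1, 9)]
assert max(best) == sorted(best)[-1] == (2, 0)
assert best == [(1, 2), (2, 0), (1, 9)]  # left untouched
```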
     def _fetch(self, dist, tmp, download_cache):
+        from .buildout import networkcache_client as nc
+        while hasattr(dist, 'networkcache'):
+            basename, requirement, source = dist.networkcache
+            new_location = os.path.join(tmp, basename)
+            with nc, closing(nc.download(dist.location)) as src, \
+                 open(new_location, 'wb') as dst:
+                shutil.copyfileobj(src, dst)
+                break
+            # Downloading content from network cache failed: let's resume index
+            # lookup to get a fallback url. This will respect _satisfied()
+            # decision because the specification is for a single version.
+            dist = self._obtain(requirement, source, networkcache_failed=True)
+            if dist is None:
+                raise zc.buildout.UserError(
+                    "Couldn't find a distribution for %r." % str(requirement))
+        else:
             if (download_cache
                 and (realpath(os.path.dirname(dist.location)) == download_cache)
             ):
@@ -531,6 +601,13 @@ class Installer(object):
                 logger.debug("Fetching %s from: %s", dist, dist.location)
                 new_location = self._index.download(dist.location, tmp)
+                if nc:
+                    key = networkcache_key(dist.key, dist.version)
+                    if nc.tryUpload(key):
+                        with nc, open(new_location, 'rb') as f:
+                            nc.upload(f, key,
+                                      basename=os.path.basename(new_location))
                 if (download_cache
                     and (realpath(new_location) == realpath(dist.location))
                     and os.path.isfile(new_location)
@@ -578,7 +655,7 @@ class Installer(object):
             raise zc.buildout.UserError(
                 "Couldn't download distribution %s." % avail)

-        dists = [_move_to_eggs_dir_and_compile(dist, self._dest)]
+        dists = [_move_to_eggs_dir_and_compile(dist, self._dest, self)]
         for _d in dists:
             if _d not in ws:
                 ws.add(_d, replace=True)
@@ -660,6 +737,9 @@ class Installer(object):
         """Return requirement with optional [versions] constraint added."""
         constraint = self._versions.get(requirement.project_name.lower())
         if constraint:
+            wheel = constraint.endswith(':whl')
+            if wheel:
+                constraint = constraint[:-4]
             try:
                 requirement = _constrained_requirement(constraint,
                                                        requirement)
@@ -667,12 +747,15 @@ class Installer(object):
                 logger.info(self._version_conflict_information(
                     requirement.project_name.lower()))
                 raise
+            if wheel:
+                requirement.wheel = True
         return requirement
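The `_constrain` hunk above lets a pinned version carry a `:whl` suffix to request a wheel. The parsing step can be sketched as a small standalone helper (`split_whl_marker` is a hypothetical name for illustration; in the patch the logic is inlined):

```python
def split_whl_marker(constraint):
    # A version pin such as '1.2.3:whl' requests the wheel distribution;
    # strip the marker before the pin is used as a version constraint.
    wheel = constraint.endswith(':whl')
    if wheel:
        constraint = constraint[:-4]
    return constraint, wheel

assert split_whl_marker('1.2.3:whl') == ('1.2.3', True)
assert split_whl_marker('1.2.3') == ('1.2.3', False)
```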
-    def install(self, specs, working_set=None):
+    def install(self, specs, working_set=None, patch_dict=None):
         logger.debug('Installing %s.', repr(specs)[1:-1])
         __doing__ = _doing_list, self._requirements_and_constraints
         self._requirements_and_constraints.append(
             "Base installation request: %s" % repr(specs)[1:-1])
@@ -693,6 +776,9 @@ class Installer(object):
             ws = working_set

         for requirement in requirements:
+            if patch_dict and requirement.project_name in patch_dict:
+                self._env.scan(
+                    self.build(str(requirement), {}, patch_dict=patch_dict))
             for dist in self._get_dist(requirement, ws):
                 self._maybe_add_setuptools(ws, dist)
@@ -706,10 +792,6 @@ class Installer(object):
         requirements.reverse()  # Set up the stack.
         processed = {}  # This is a set of processed requirements.
         best = {}  # This is a mapping of package name -> dist.
-        # Note that we don't use the existing environment, because we want
-        # to look for new eggs unless what we have is the best that
-        # matches the requirement.
-        env = pkg_resources.Environment(ws.entries)

         while requirements:
             # Process dependencies breadth-first.
@@ -721,7 +803,15 @@ class Installer(object):
             dist = best.get(req.key)
             if dist is None:
                 try:
-                    dist = env.best_match(req, ws)
+                    # Note that we first attempt to find an already active dist
+                    # in the working set. This will detect version conflicts.
+                    # XXX We expressly avoid activating dists in the entries of
+                    # the current working set: they might not reflect the order
+                    # of the environment. This is not so bad when the versions
+                    # are pinned, but when calling install(['zc.buildout']), it
+                    # can come up with completely different dists than the ones
+                    # currently running.
+                    dist = ws.find(req)
                 except pkg_resources.VersionConflict as err:
                     logger.debug(
                         "Version conflict while processing requirement %s "
@@ -741,6 +831,9 @@ class Installer(object):
                 else:
                     logger.debug('Adding required %r', str(req))
                     self._log_requirement(ws, req)
+                if patch_dict and req.project_name in patch_dict:
+                    self._env.scan(
+                        self.build(str(req), {}, patch_dict=patch_dict))
                 for dist in self._get_dist(req, ws):
                     self._maybe_add_setuptools(ws, dist)
                     if dist not in req:
@@ -787,7 +880,7 @@ class Installer(object):
             processed[req] = True
         return ws
-    def build(self, spec, build_ext):
+    def build(self, spec, build_ext, patch_dict=None):

         requirement = self._constrain(pkg_resources.Requirement.parse(spec))
@@ -838,14 +931,33 @@ class Installer(object):
                     )
             base = os.path.dirname(setups[0])

+            setup_cfg_dict = {'build_ext': build_ext}
+            patch_dict = (patch_dict or {}).get(re.sub('[<>=].*', '', spec))
+            if patch_dict:
+                setup_cfg_dict.update(
+                    {'egg_info': {'tag_build': '+%s%03d' % (
+                        PATCH_MARKER, patch_dict['patch_revision'])}})
+                for i, patch in enumerate(patch_dict['patches']):
+                    url, md5sum = (patch.strip().split('#', 1) + [''])[:2]
+                    download = zc.buildout.download.Download()
+                    path, is_temp = download(
+                        url, md5sum=md5sum or None,
+                        path=os.path.join(tmp, 'patch.%s' % i))
+                    args = [patch_dict['patch_binary']] \
+                        + patch_dict['patch_options']
+                    kwargs = {'cwd': base, 'stdin': open(path)}
+                    popen = subprocess.Popen(args, **kwargs)
+                    popen.communicate()
+                    if popen.returncode != 0:
+                        raise subprocess.CalledProcessError(
+                            popen.returncode, ' '.join(args))
             setup_cfg = os.path.join(base, 'setup.cfg')
             if not os.path.exists(setup_cfg):
                 f = open(setup_cfg, 'w')
                 f.close()
             setuptools.command.setopt.edit_config(
-                setup_cfg, dict(build_ext=build_ext))
+                setup_cfg, setup_cfg_dict)
-            dists = self._call_pip_install(base, self._dest, dist)
+            dists = self._call_pip_wheel(base, self._dest, dist)
             return [dist.location for dist in dists]
         finally:
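In the `build()` hunk above, each entry of `patch_dict['patches']` is a `<url>#<md5sum>` string with the checksum optional. The one-liner that splits it can be checked in isolation (`parse_patch` is a hypothetical name; the patch inlines this expression):

```python
def parse_patch(patch):
    # Split "<url>#<md5sum>"; a missing checksum yields an empty string,
    # which the caller turns into md5sum=None (checksum not verified).
    url, md5sum = (patch.strip().split('#', 1) + [''])[:2]
    return url, md5sum

assert parse_patch('https://example.org/fix.patch#0123abcd') == \
    ('https://example.org/fix.patch', '0123abcd')
assert parse_patch(' local/fix.patch ') == ('local/fix.patch', '')
```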
@@ -946,6 +1058,12 @@ def get_picked_versions():
     required_by = Installer._required_by
     return (picked_versions, required_by)

+def extra_paths(setting=None):
+    old = Installer._extra_paths
+    if setting is not None:
+        Installer._extra_paths = setting
+    return old
+
 def install(specs, dest,
             links=(), index=None,
@@ -957,6 +1075,7 @@ def install(specs, dest,
             allowed_eggs_from_site_packages=None,
             check_picked=True,
             allow_unknown_extras=False,
+            patch_dict=None,
             ):
     assert executable == sys.executable, (executable, sys.executable)
     assert include_site_packages is None
@@ -968,31 +1087,19 @@ def install(specs, dest,
                           allow_hosts=allow_hosts,
                           check_picked=check_picked,
                           allow_unknown_extras=allow_unknown_extras)
-    return installer.install(specs, working_set)
+    return installer.install(specs, working_set, patch_dict=patch_dict)

-buildout_and_setuptools_dists = list(install(['zc.buildout'], None,
-                                             check_picked=False))
-buildout_and_setuptools_path = sorted({d.location
-                                       for d in buildout_and_setuptools_dists})
-pip_dists = [d for d in buildout_and_setuptools_dists
-             if d.project_name != 'zc.buildout']
-pip_path = sorted({d.location for d in pip_dists})
-logger.debug('after restricting versions: pip_path %r', pip_path)
-pip_pythonpath = os.pathsep.join(pip_path)
-setuptools_path = pip_path
-setuptools_pythonpath = pip_pythonpath
 def build(spec, dest, build_ext,
           links=(), index=None,
           executable=sys.executable,
-          path=None, newest=True, versions=None, allow_hosts=('*',)):
+          path=None, newest=True, versions=None, allow_hosts=('*',),
+          patch_dict=None):
     assert executable == sys.executable, (executable, sys.executable)
     installer = Installer(dest, links, index, executable,
                           True, path, newest,
                           versions, allow_hosts=allow_hosts)
-    return installer.build(spec, build_ext)
+    return installer.build(spec, build_ext, patch_dict=patch_dict)

 def _rm(*paths):
@@ -1097,9 +1204,15 @@ def develop(setup, dest,
         undo.append(lambda: os.remove(tsetup))
         undo.append(lambda: os.close(fd))

+        extra_path = os.environ.get('PYTHONEXTRAPATH')
+        extra_path_list = []
+        if extra_path:
+            extra_path_list = extra_path.split(os.pathsep)
+
         os.write(fd, (runsetup_template % dict(
             setupdir=directory,
             setup=setup,
+            path_list=extra_path_list,
             __file__ = setup,
             )).encode())
@@ -1159,6 +1272,10 @@ def scripts(reqs, working_set, executable, dest=None,
         if p not in unique_path:
             unique_path.append(p)
     path = [realpath(p) for p in unique_path]
+    try:
+        path.remove(python_lib)
+    except ValueError:
+        pass

     generated = []
@@ -1176,10 +1293,12 @@ def scripts(reqs, working_set, executable, dest=None,
             req = pkg_resources.Requirement.parse(req)
             if req.marker and not req.marker.evaluate():
                 continue
+            has_extras = set(req.extras).issuperset
             dist = working_set.find(req)
             # regular console_scripts entry points
             for name in pkg_resources.get_entry_map(dist, 'console_scripts'):
                 entry_point = dist.get_entry_info('console_scripts', name)
+                if has_extras(entry_point.extras):
                     entry_points.append(
                         (name, entry_point.module_name,
                          '.'.join(entry_point.attrs))
@@ -1329,6 +1448,12 @@ join = os.path.join
 base = os.path.dirname(os.path.abspath(os.path.realpath(__file__)))
 """

+def _initialization(path, initialization):
+    return """sys.path[0:0] = [
+  %s,
+  ]
+""" % path + initialization if path else initialization
+
 def _script(module_name, attrs, path, dest, arguments, initialization, rsetup):
     if is_win32:
         dest += '-script.py'
@@ -1337,11 +1462,10 @@ def _script(module_name, attrs, path, dest, arguments, initialization, rsetup):
     contents = script_template % dict(
         python=python,
-        path=path,
         module_name=module_name,
         attrs=attrs,
         arguments=arguments,
-        initialization=initialization,
+        initialization=_initialization(path, initialization),
         relative_paths_setup=rsetup,
         )
     return _create_script(contents, dest)
@@ -1374,8 +1498,7 @@ def _distutils_script(path, dest, script_content, initialization, rsetup):
     contents = distutils_script_template % dict(
         python=python,
-        path=path,
-        initialization=initialization,
+        initialization=_initialization(path, initialization),
         relative_paths_setup=rsetup,
         before=before,
         after=after
@@ -1443,9 +1566,6 @@ script_template = script_header + '''\
 %(relative_paths_setup)s
 import sys
-sys.path[0:0] = [
-  %(path)s,
-  ]
 %(initialization)s
 import %(module_name)s
@@ -1457,9 +1577,6 @@ distutils_script_template = script_header + '''
 %(before)s
 %(relative_paths_setup)s
 import sys
-sys.path[0:0] = [
-  %(path)s,
-  ]
 %(initialization)s
 %(after)s'''
@@ -1472,14 +1589,12 @@ def _pyscript(path, dest, rsetup, initialization=''):
         dest += '-script.py'
     python = _safe_arg(sys.executable)
-    if path:
-        path += ','  # Courtesy comma at the end of the list.

     contents = py_script_template % dict(
         python=python,
         path=path,
         relative_paths_setup=rsetup,
-        initialization=initialization,
+        initialization=_initialization(path, initialization),
         )
     changed = _file_changed(dest, contents)
@@ -1514,9 +1629,6 @@ py_script_template = script_header + '''\
 %%(relative_paths_setup)s
 import sys
-sys.path[0:0] = [
-  %%(path)s
-  ]
 %%(initialization)s

 _interactive = True
@@ -1551,8 +1663,14 @@ import sys
 sys.path.insert(0, %%(setupdir)r)
 sys.path[0:0] = %r
+for extra_path in %%(path_list)r:
+    sys.path.insert(0, extra_path)
+
 import os, setuptools
+os.environ['PYTHONPATH'] = (os.pathsep).join(sys.path[:])

 __file__ = %%(__file__)r
 os.chdir(%%(setupdir)r)
@@ -1595,25 +1713,49 @@ class MissingDistribution(zc.buildout.UserError):
         req, ws = self.data
         return "Couldn't find a distribution for %r." % str(req)

+def chmod(path):
+    mode = os.lstat(path).st_mode
+    if stat.S_ISLNK(mode):
+        return
+    # give the same permission but write as owner to group and other.
+    mode = stat.S_IMODE(mode)
+    urx = (mode >> 6) & 5
+    new_mode = mode & ~0o77 | urx << 3 | urx
+    if new_mode != mode:
+        os.chmod(path, new_mode)
+
 def redo_pyc(egg):
     if not os.path.isdir(egg):
         return
     for dirpath, dirnames, filenames in os.walk(egg):
+        chmod(dirpath)
         for filename in filenames:
+            filepath = os.path.join(dirpath, filename)
+            try:
+                chmod(filepath)
+            except OSError as e:
+                if e.errno != errno.ENOENT:
+                    raise
+                continue
             if not filename.endswith('.py'):
                 continue
-            filepath = os.path.join(dirpath, filename)
-            if not (os.path.exists(filepath + 'c')
-                    or os.path.exists(filepath + 'o')):
+            old = []
+            pycache = os.path.join(
+                dirpath, '__pycache__', filename[:-3] + '.*.py')
+            for suffix in 'co':
+                if os.path.exists(filepath + suffix):
+                    old.append(filepath + suffix)
+                old += glob.glob(pycache + suffix)
+            if not old:
                 # If it wasn't compiled, it may not be compilable
                 continue

             # OK, it looks like we should try to compile.

             # Remove old files.
-            for suffix in 'co':
-                if os.path.exists(filepath + suffix):
-                    os.remove(filepath + suffix)
+            for old in old:
+                os.remove(old)

             # Compile under current optimization
             try:
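The bit arithmetic in the new `chmod()` above copies the owner's read/execute bits to group and other while dropping any group/other write bits. A pure-function sketch of the same computation (`sanitized_mode` is a hypothetical name for illustration):

```python
def sanitized_mode(mode):
    # Same arithmetic as chmod() above: take the owner's r-x bits
    # (mask 0o5 drops the write bit) and replicate them to group/other.
    urx = (mode >> 6) & 5
    return mode & ~0o77 | urx << 3 | urx

assert sanitized_mode(0o700) == 0o755  # owner rwx -> group/other r-x
assert sanitized_mode(0o600) == 0o644  # owner rw- -> group/other r--
assert sanitized_mode(0o755) == 0o755  # already sanitized, unchanged
assert sanitized_mode(0o666) == 0o644  # group/other write stripped
```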
@@ -1657,19 +1799,34 @@ class IncompatibleConstraintError(zc.buildout.UserError):
 IncompatibleVersionError = IncompatibleConstraintError # Backward compatibility

-def call_pip_install(spec, dest):
+# Temporary HOME with .pydistutils.cfg to disable setup_requires
+pip_pydistutils_home = tempfile.mkdtemp('pip-pydistutils-home')
+with open(os.path.join(pip_pydistutils_home, '.pydistutils.cfg'), 'w') as f:
+    f.write("[easy_install]\n"
+            "index-url = file:///dev/null")
+atexit.register(zc.buildout.rmtree.rmtree, pip_pydistutils_home)
+
+def call_pip_wheel(spec, dest, options):
     """
-    Call `pip install` from a subprocess to install a
+    Call `pip wheel` from a subprocess to install a
     distribution specified by `spec` into `dest`.

     Returns all the paths inside `dest` created by the above.
     """
-    args = [sys.executable, '-m', 'pip', 'install', '--no-deps', '-t', dest]
+    args = [sys.executable, '-m', 'pip', 'wheel', '--no-deps', '-w', dest]
     level = logger.getEffectiveLevel()
     if level >= logging.INFO:
         args.append('-q')
     else:
         args.append('-v')
+    # Prevent pip from installing build dependencies on the fly
+    # without respecting pinned versions. This only works for
+    # PEP 517 specifications using pyproject.toml and not for
+    # dependencies in setup_requires option in legacy setup.py
+    if not options._allow_picked_versions:
+        args.append('--no-index')
+        args.append('--no-build-isolation')

     args.append(spec)
     try:
@@ -1678,14 +1835,19 @@ def call_pip_install(spec, dest):
     except ImportError:
         HAS_WARNING_OPTION = False
     if HAS_WARNING_OPTION:
-        if not hasattr(call_pip_install, 'displayed'):
-            call_pip_install.displayed = True
+        if not hasattr(call_pip_wheel, 'displayed'):
+            call_pip_wheel.displayed = True
         else:
             args.append('--no-python-version-warning')

-    env = copy.copy(os.environ)
-    python_path = copy.copy(pip_path)
-    python_path.append(env.get('PYTHONPATH', ''))
+    env = os.environ.copy()
+    python_path = pip_path[:]
+    env_paths = env.get('PYTHONPATH')
+    if env_paths:
+        python_path.append(env_paths)
+    extra_env_path = env.get('PYTHONEXTRAPATH')
+    if extra_env_path:
+        python_path.append(extra_env_path)
     env['PYTHONPATH'] = os.pathsep.join(python_path)

     if level <= logging.DEBUG:
@@ -1694,138 +1856,33 @@ def call_pip_install(spec, dest):
     sys.stdout.flush() # We want any pending output first
-    exit_code = subprocess.call(list(args), env=env)
-    if exit_code:
-        logger.error(
-            "An error occurred when trying to install %s. "
-            "Look above this message for any errors that "
-            "were output by pip install.",
-            spec)
-        sys.exit(1)
+    # Prevent setuptools from downloading and thus installing
+    # build dependencies specified in setup_requires option of
+    # legacy setup.py by providing a crafted .pydistutils.cfg.
+    # This is used in complement to --no-build-isolation.
+    if not options._allow_picked_versions:
+        env['HOME'] = pip_pydistutils_home
+    subprocess.check_call(args, env=env)

-    split_entries = [os.path.splitext(entry) for entry in os.listdir(dest)]
+    entries = os.listdir(dest)
     try:
-        distinfo_dir = [
-            base + ext for base, ext in split_entries
-            if ext == ".dist-info"
-        ][0]
+        assert len(entries) == 1, "Got multiple entries after pip wheel"
+        wheel = entries[0]
+        assert os.path.splitext(wheel)[1] == '.whl', "Expected a .whl"
-    except IndexError:
+    except AssertionError:
         logger.error(
-            "No .dist-info directory after successful pip install of %s",
+            "No .whl after successful pip wheel of %s",
             spec)
         raise

-    return make_egg_after_pip_install(dest, distinfo_dir)
+    return make_egg_after_pip_wheel(dest, wheel)
def
make_egg_after_pip_install
(
dest
,
distinfo_dir
):
def
make_egg_after_pip_wheel
(
dest
,
wheel
):
"""build properly named egg directory"""
unpack_wheel
(
os
.
path
.
join
(
dest
,
wheel
),
dest
)
assert
len
(
os
.
listdir
(
dest
))
==
2
# `pip install` does not build the namespace aware __init__.py files
return
glob
.
glob
(
os
.
path
.
join
(
dest
,
'*.egg'
))
# but they are needed in egg directories.
# Add them before moving files setup by pip
namespace_packages_file
=
os
.
path
.
join
(
dest
,
distinfo_dir
,
'namespace_packages.txt'
)
if
os
.
path
.
isfile
(
namespace_packages_file
):
with
open
(
namespace_packages_file
)
as
f
:
namespace_packages
=
[
line
.
strip
().
replace
(
'.'
,
os
.
path
.
sep
)
for
line
in
f
.
readlines
()
]
for
namespace_package
in
namespace_packages
:
namespace_package_dir
=
os
.
path
.
join
(
dest
,
namespace_package
)
if
os
.
path
.
isdir
(
namespace_package_dir
):
init_py_file
=
os
.
path
.
join
(
namespace_package_dir
,
'__init__.py'
)
with
open
(
init_py_file
,
'w'
)
as
f
:
f
.
write
(
"__import__('pkg_resources')."
"declare_namespace(__name__)"
)
# Remove `bin` directory if needed
# as there is no way to avoid script installation
# when running `pip install`
entry_points_file
=
os
.
path
.
join
(
dest
,
distinfo_dir
,
'entry_points.txt'
)
if
os
.
path
.
isfile
(
entry_points_file
):
with
open
(
entry_points_file
)
as
f
:
content
=
f
.
read
()
if
"console_scripts"
in
content
or
"gui_scripts"
in
content
:
bin_dir
=
os
.
path
.
join
(
dest
,
BIN_SCRIPTS
)
if
os
.
path
.
exists
(
bin_dir
):
shutil
.
rmtree
(
bin_dir
)
# Make properly named new egg dir
distro
=
list
(
pkg_resources
.
find_distributions
(
dest
))[
0
]
base
=
"{}-{}"
.
format
(
distro
.
egg_name
(),
pkg_resources
.
get_supported_platform
()
)
egg_name
=
base
+
'.egg'
new_distinfo_dir
=
base
+
'.dist-info'
egg_dir
=
os
.
path
.
join
(
dest
,
egg_name
)
os
.
mkdir
(
egg_dir
)
# Move ".dist-info" dir into new egg dir
os
.
rename
(
os
.
path
.
join
(
dest
,
distinfo_dir
),
os
.
path
.
join
(
egg_dir
,
new_distinfo_dir
)
)
top_level_file
=
os
.
path
.
join
(
egg_dir
,
new_distinfo_dir
,
'top_level.txt'
)
if
os
.
path
.
isfile
(
top_level_file
):
with
open
(
top_level_file
)
as
f
:
top_levels
=
filter
(
(
lambda
x
:
len
(
x
)
!=
0
),
[
line
.
strip
()
for
line
in
f
.
readlines
()]
)
else
:
top_levels
=
()
# Move all top_level modules or packages
for
top_level
in
top_levels
:
# as package
top_level_dir
=
os
.
path
.
join
(
dest
,
top_level
)
if
os
.
path
.
exists
(
top_level_dir
):
shutil
.
move
(
top_level_dir
,
egg_dir
)
continue
# as module
top_level_py
=
top_level_dir
+
'.py'
if
os
.
path
.
exists
(
top_level_py
):
shutil
.
move
(
top_level_py
,
egg_dir
)
top_level_pyc
=
top_level_dir
+
'.pyc'
if
os
.
path
.
exists
(
top_level_pyc
):
shutil
.
move
(
top_level_pyc
,
egg_dir
)
continue
record_file
=
os
.
path
.
join
(
egg_dir
,
new_distinfo_dir
,
'RECORD'
)
if
os
.
path
.
isfile
(
record_file
):
if
PY3
:
with
open
(
record_file
,
newline
=
''
)
as
f
:
all_files
=
[
row
[
0
]
for
row
in
csv
.
reader
(
f
)]
else
:
with
open
(
record_file
,
'rb'
)
as
f
:
all_files
=
[
row
[
0
]
for
row
in
csv
.
reader
(
f
)]
# There might be some c extensions left over
for
entry
in
all_files
:
if
entry
.
endswith
((
'.pyc'
,
'.pyo'
)):
continue
dest_entry
=
os
.
path
.
join
(
dest
,
entry
)
# work around pip install -t bug that leaves entries in RECORD
# that starts with '../../'
if
not
os
.
path
.
abspath
(
dest_entry
).
startswith
(
dest
):
continue
egg_entry
=
os
.
path
.
join
(
egg_dir
,
entry
)
if
os
.
path
.
exists
(
dest_entry
)
and
not
os
.
path
.
exists
(
egg_entry
):
egg_entry_dir
=
os
.
path
.
dirname
(
egg_entry
)
if
not
os
.
path
.
exists
(
egg_entry_dir
):
os
.
makedirs
(
egg_entry_dir
)
os
.
rename
(
dest_entry
,
egg_entry
)
return
[
egg_dir
]
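The RECORD-based move above can be exercised in isolation. Below is a minimal, hypothetical sketch (names like `safe_record_entries` are not from the source) of the same guard against pip's "install -t" bug, where a wheel's RECORD file may contain `../../` entries that must not escape the target directory:

```python
import csv
import io
import os


def safe_record_entries(record_text, dest):
    # Parse a dist-info RECORD file (CSV rows: path,hash,size) and keep only
    # entries that resolve inside `dest`, skipping compiled files, mirroring
    # the workaround in the code above.
    entries = [row[0] for row in csv.reader(io.StringIO(record_text)) if row]
    kept = []
    for entry in entries:
        if entry.endswith(('.pyc', '.pyo')):
            continue
        dest_entry = os.path.join(dest, entry)
        # Reject entries escaping dest, e.g. '../../outside.py'.
        if os.path.abspath(dest_entry).startswith(os.path.abspath(dest)):
            kept.append(entry)
    return kept


record = "pkg/__init__.py,sha256=abc,10\r\n../../outside.py,,\r\npkg/x.pyc,,\r\n"
print(safe_record_entries(record, '/tmp/target'))  # ['pkg/__init__.py']
```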
 def unpack_egg(location, dest):
...
@@ -1872,7 +1929,7 @@ def _get_matching_dist_in_location(dist, location):
     if dist_infos == [(dist.project_name.lower(), dist.parsed_version)]:
         return dists.pop()
-def _move_to_eggs_dir_and_compile(dist, dest):
+def _move_to_eggs_dir_and_compile(dist, dest, options):
     """Move distribution to the eggs destination directory.

     And compile the py files, if we have actually moved the dist.
...
@@ -1913,7 +1970,7 @@ def _move_to_eggs_dir_and_compile(dist, dest):
             unpacker(dist.location, tmp_dest)
             [tmp_loc] = glob.glob(os.path.join(tmp_dest, '*'))
         else:
-            [tmp_loc] = call_pip_install(dist.location, tmp_dest)
+            [tmp_loc] = call_pip_wheel(dist.location, tmp_dest, options)
             installed_with_pip = True
     # We have installed the dist. Now try to rename/move it.
...
@@ -1954,7 +2011,7 @@ def _move_to_eggs_dir_and_compile(dist, dest):
     return newdist
-def sort_working_set(ws, eggs_dir, develop_eggs_dir):
+def get_develop_paths(develop_eggs_dir):
     develop_paths = set()
     pattern = os.path.join(develop_eggs_dir, '*.egg-link')
     for egg_link in glob.glob(pattern):
...
@@ -1962,21 +2019,30 @@ def sort_working_set(ws, eggs_dir, develop_eggs_dir):
             path = f.readline().strip()
             if path:
                 develop_paths.add(path)
+    return develop_paths

-    sorted_paths = []
-    egg_paths = []
-    other_paths = []
+
+def sort_working_set(ws, buildout_dir, eggs_dir, develop_eggs_dir):
+    develop_paths = get_develop_paths(develop_eggs_dir)
+    dists_priorities = tuple([] for i in range(5))
     for dist in ws:
         path = dist.location
-        if path in develop_paths:
-            sorted_paths.append(path)
-        elif os.path.commonprefix([path, eggs_dir]) == eggs_dir:
-            egg_paths.append(path)
+        # Dists from eggs first because we know they contain a single dist.
+        if os.path.commonprefix([path, eggs_dir]) == eggs_dir:
+            priority = 0
+        elif os.path.commonprefix([path, buildout_dir]) == buildout_dir:
+            # We assume internal locations contain a single dist too.
+            priority = 1 + 2 * (path not in develop_paths) # 1 or 3
         else:
-            other_paths.append(path)
-    sorted_paths.extend(egg_paths)
-    sorted_paths.extend(other_paths)
-    return pkg_resources.WorkingSet(sorted_paths)
+            priority = 2 + 2 * (path not in develop_paths) # 2 or 4
+        dists_priorities[priority].append(dist)
+    # We add dists to an empty working set manually instead of adding the paths
+    # to avoid activating other dists at the same locations.
+    ws = pkg_resources.WorkingSet([])
+    for dists in dists_priorities:
+        for dist in dists:
+            ws.add(dist)
+    return ws


 NOT_PICKED_AND_NOT_ALLOWED = """\
...
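The five-bucket ordering introduced above (shared eggs first, then develop and internal locations, external locations last) can be illustrated with a standalone sketch. This uses plain lists and `os.path.commonprefix` exactly as the new `sort_working_set` does; the paths and the helper name `bucket_priority` are hypothetical:

```python
import os


def bucket_priority(path, buildout_dir, eggs_dir, develop_paths):
    # Same ordering idea as the new sort_working_set: shared eggs (0),
    # develop+internal (1), develop+external (2),
    # non-develop+internal (3), non-develop+external (4).
    if os.path.commonprefix([path, eggs_dir]) == eggs_dir:
        return 0
    if os.path.commonprefix([path, buildout_dir]) == buildout_dir:
        return 1 + 2 * (path not in develop_paths)
    return 2 + 2 * (path not in develop_paths)


paths = ['/elsewhere/pkg', '/bo/develop/pkg', '/bo/eggs/pkg', '/bo/parts/pkg']
develop = {'/bo/develop/pkg'}
ordered = sorted(
    paths, key=lambda p: bucket_priority(p, '/bo', '/bo/eggs', develop))
print(ordered)
```

Sorting by these priorities is stable, so dists with equal priority keep their original working-set order, just as the bucket lists in the real code do.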
...
src/zc/buildout/rmtree.py
View file @ 3b16f5af
...
@@ -16,7 +16,8 @@
 import shutil
 import os
 import doctest
-import time
+import errno
+import sys

 def rmtree (path):
     """
...
@@ -26,6 +27,10 @@ def rmtree (path):
     process (e.g. antivirus scanner). This tries to chmod the
     file to writeable and retries 10 times before giving up.

+    Also it tries to remove symlink itself if a symlink as passed as
+    path argument.
+
+    Finally, it tries to make parent directory writable.
+
     >>> from tempfile import mkdtemp

     Let's make a directory ...
...
@@ -41,10 +46,16 @@ def rmtree (path):
     >>> foo = os.path.join (d, 'foo')
     >>> with open (foo, 'w') as f: _ = f.write ('huhu')
+    >>> bar = os.path.join (d, 'bar')
+    >>> os.symlink(bar, bar)

     and make it unwriteable

-    >>> os.chmod (foo, 256) # 0400
+    >>> os.chmod (foo, 0o400)
+
+    and make parent dir unwritable
+
+    >>> os.chmod (d, 0o400)

     rmtree should be able to remove it:
...
@@ -54,21 +65,76 @@ def rmtree (path):
     >>> os.path.isdir (d)
     0

+    Let's make a directory ...
+
+    >>> d = mkdtemp()
+
+    and make sure it is actually there
+
+    >>> os.path.isdir (d)
+    1
+
+    Now create a broken symlink ...
+
+    >>> foo = os.path.join (d, 'foo')
+    >>> os.symlink(foo + '.not_exist', foo)
+
+    rmtree should be able to remove it:
+
+    >>> rmtree (foo)
+
+    and now the directory is gone
+
+    >>> os.path.isdir (foo)
+    0
+
+    cleanup directory
+
+    >>> rmtree (d)
+
+    and now the directory is gone
+
+    >>> os.path.isdir (d)
+    0
     """
-    def retry_writeable (func, path, exc):
-        os.chmod (path, 384) # 0600
-        for i in range(10):
-            try:
-                func (path)
-                break
-            except OSError:
-                time.sleep(0.1)
-        else:
-            # tried 10 times without success, thus
-            # finally rethrow the last exception
-            raise
+    def chmod_retry(func, failed_path, exc_info):
+        """Make sure the directories are executable and writable.
+        """
+        if func is os.path.islink:
+            os.unlink(path)
+        elif func is os.lstat or func is os.open:
+            if not os.path.islink(path):
+                raise
+            os.unlink(path)
+        else:
+            # Depending on the Python version, the following items differ.
+            if sys.version_info >= (3, ):
+                expected_error_type = PermissionError
+                expected_func_tuple = (os.lstat, os.open)
+            else:
+                expected_error_type = OSError
+                expected_func_tuple = (os.listdir, )
+            e = exc_info[1]
+            if isinstance(e, expected_error_type):
+                if e.errno == errno.ENOENT:
+                    # because we are calling again rmtree on listdir errors, this path might
+                    # have been already deleted by the recursive call to rmtree.
+                    return
+                if e.errno == errno.EACCES:
+                    if func in expected_func_tuple:
+                        os.chmod(failed_path, 0o700)
+                        # corner case to handle errors in listing directories.
+                        # https://bugs.python.org/issue8523
+                        return shutil.rmtree(failed_path, onerror=chmod_retry)
+                    # If parent directory is not writable, we still cannot delete the file.
+                    # But make sure not to change the parent of the folder we are deleting.
+                    if failed_path != path:
+                        os.chmod(os.path.dirname(failed_path), 0o700)
+                    return func(failed_path)
+            raise

-    shutil.rmtree (path, onerror=retry_writeable)
+    shutil.rmtree(path, onerror=chmod_retry)

 def test_suite():
     return doctest.DocTestSuite()
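The onerror-based recovery used by this `rmtree` can be tried standalone. Here is a minimal sketch of the same chmod-and-retry idea (deliberately simpler than the version-dependent logic in the diff above; `force_rmtree` is a hypothetical name):

```python
import os
import shutil
import stat
import tempfile


def force_rmtree(path):
    # On failure, make the offending path and its parent writable/executable
    # and retry the failed operation once, like the chmod_retry handler above.
    def onerror(func, failed_path, exc_info):
        os.chmod(os.path.dirname(failed_path), 0o700)
        os.chmod(failed_path, 0o700)
        func(failed_path)
    shutil.rmtree(path, onerror=onerror)


d = tempfile.mkdtemp()
f = os.path.join(d, 'f')
open(f, 'w').close()
os.chmod(f, stat.S_IREAD)  # make the file read-only
force_rmtree(d)
print(os.path.exists(d))  # False
```

Note that `shutil.rmtree`'s `onerror` callback is deprecated in favour of `onexc` since Python 3.12, though it still works.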
...
...
src/zc/buildout/testing.py
View file @ 3b16f5af
...
@@ -23,6 +23,7 @@ except ImportError:
     from BaseHTTPServer import HTTPServer, BaseHTTPRequestHandler
     from urllib2 import urlopen

+import base64
 import errno
 import logging
 import multiprocessing
...
@@ -222,6 +223,9 @@ class Buildout(zc.buildout.buildout.Buildout):

     Options = TestOptions

+    def initialize(self, *args):
+        pass
+
 def buildoutSetUp(test):
     test.globs['__tear_downs'] = __tear_downs = []
...
@@ -412,6 +416,23 @@ class Handler(BaseHTTPRequestHandler):
             self.__server.__log = False
             return k()
+        if self.path.startswith('/private/'):
+            auth = self.headers.get('Authorization')
+            if auth and auth.startswith('Basic ') and \
+               self.path[9:].encode() == base64.b64decode(
+                   self.headers.get('Authorization')[6:]):
+                return k()
+            # But not returning 401+WWW-Authenticate, we check that the client
+            # skips auth challenge, which is not free (in terms of performance)
+            # and useless for what we support.
+            self.send_response(403, 'Forbidden')
+            out = '<html><body>Forbidden</body></html>'.encode()
+            self.send_header('Content-Length', str(len(out)))
+            self.send_header('Content-Type', 'text/html')
+            self.end_headers()
+            self.wfile.write(out)
+            return
         path = os.path.abspath(os.path.join(self.tree, *self.path.split('/')))
         if not (
             ((path == self.tree) or path.startswith(self.tree + os.path.sep))
...
@@ -622,6 +643,8 @@ ignore_not_upgrading = (
     'Not upgrading because not running a local buildout command.\n'
 ), '')

+os.environ['BUILDOUT_INFO_REINSTALL_REASON'] = '0'
+
 def run_buildout(command):
     # Make sure we don't get .buildout
     os.environ['HOME'] = os.path.join(os.getcwd(), 'home')
...
src/zc/buildout/tests/__init__.py
View file @ 3b16f5af
...
@@ -123,6 +123,45 @@ def create_sample_eggs(test, executable=sys.executable):
             )
         zc.buildout.testing.bdist_egg(tmp, sys.executable, dest)
+        write(tmp, 'builddep.py', '')
+        write(
+            tmp, 'setup.py',
+            "from setuptools import setup\n"
+            "setup(name='builddep', "
+            " py_modules=['builddep'], "
+            " zip_safe=True, version='0.1')\n"
+            )
+        zc.buildout.testing.sdist(tmp, dest)
+
+        write(tmp, 'withsetuprequires.py', '')
+        write(
+            tmp, 'setup.py',
+            "from setuptools import setup\n"
+            "setup(name='withsetuprequires', "
+            " setup_requires = 'builddep', "
+            " py_modules=['withsetuprequires'], "
+            " zip_safe=True, version='0.1')\n"
+            "import builddep"
+            )
+        zc.buildout.testing.sdist(tmp, dest)
+
+        write(tmp, 'withbuildsystemrequires.py', '')
+        write(
+            tmp, 'pyproject.toml',
+            '[build-system]\n'
+            'requires = ["builddep"]'
+            )
+        write(
+            tmp, 'setup.py',
+            "from setuptools import setup\n"
+            "setup(name='withbuildsystemrequires', "
+            " setup_requires = 'builddep', "
+            " py_modules=['withbuildsystemrequires'], "
+            " package_data={'withbuildsystemrequires': ['pyproject.toml']}, "
+            " zip_safe=True, version='0.1')\n"
+            "import builddep"
+            )
+        zc.buildout.testing.sdist(tmp, dest)
     finally:
         shutil.rmtree(tmp)
...
...
src/zc/buildout/tests/allow-unknown-extras.txt
View file @ 3b16f5af
...
@@ -40,6 +40,7 @@ Now we can run the buildout and see that it fails:
     ...
     While:
       Installing eggs.
+      Base installation request: 'allowdemo[bad_extra]'
     Error: Couldn't find the required extra...

 If we flip the option on, the buildout succeeds
...
...
src/zc/buildout/tests/allowhosts.txt
View file @ 3b16f5af
...
@@ -61,6 +61,8 @@ Now we can run the buildout and make sure all attempts to dist.plone.org fails::
     ...
     While:
       Installing eggs.
+      Base installation request: 'allowdemo'
+      Requirement of allowdemo: kss.core
       Getting distribution for 'kss.core'.
     Error: Couldn't find a distribution for 'kss.core'.
...
@@ -92,6 +94,8 @@ Now we can run the buildout and make sure all attempts to dist.plone.org fails::
     ...
     While:
       Installing eggs.
+      Base installation request: 'allowdemo'
+      Requirement of allowdemo: kss.core
       Getting distribution for 'kss.core'.
     Error: Couldn't find a distribution for 'kss.core'.
...
...
src/zc/buildout/tests/buildout.txt
View file @ 3b16f5af
...
@@ -337,6 +337,10 @@ we'll see that the directory gets removed and recreated::
     ... path = mydata
     ... """)

+    >>> print_(system(buildout+' --dry-run'), end='')
+    Develop: '/sample-buildout/recipes'
+    Uninstalling data-dir.
+    Installing data-dir.
+
     >>> print_(system(buildout), end='')
     Develop: '/sample-buildout/recipes'
     Uninstalling data-dir.
...
@@ -357,6 +361,10 @@ If any of the files or directories created by a recipe are removed,
 the part will be reinstalled::

     >>> rmdir(sample_buildout, 'mydata')
+    >>> print_(system(buildout+' --dry-run'), end='')
+    Develop: '/sample-buildout/recipes'
+    Uninstalling data-dir.
+    Installing data-dir.
     >>> print_(system(buildout), end='')
     Develop: '/sample-buildout/recipes'
     Uninstalling data-dir.
...
@@ -816,6 +824,8 @@ the origin of the value (file name or ``COMPUTED_VALUE``, ``DEFAULT_VALUE``,
       DEFAULT_VALUE
     directory= /sample-buildout
       COMPUTED_VALUE
+    dry-run= false
+      DEFAULT_VALUE
     eggs-directory= /sample-buildout/eggs
       DEFAULT_VALUE
     executable= ...
...
@@ -911,6 +921,11 @@ You get more information about the way values are computed::
       AS COMPUTED_VALUE
       SET VALUE = /sample-buildout
     <BLANKLINE>
+    dry-run= false
+    <BLANKLINE>
+      AS DEFAULT_VALUE
+      SET VALUE = false
+    <BLANKLINE>
     eggs-directory= /sample-buildout/eggs
     <BLANKLINE>
       AS DEFAULT_VALUE
...
@@ -1269,6 +1284,102 @@ the current section. We can also use the special option,
     my_name debug
     recipe recipes:debug

+It is possible to have access to profile base url from section by
+using ${:_profile_base_location_}:
+
+    >>> write(sample_buildout, 'buildout.cfg',
+    ... """
+    ... [buildout]
+    ... develop = recipes
+    ... parts = data-dir debug
+    ... log-level = INFO
+    ...
+    ... [debug]
+    ... recipe = recipes:debug
+    ... profile_base_location = ${:_profile_base_location_}
+    ...
+    ... [data-dir]
+    ... recipe = recipes:mkdir
+    ... path = mydata
+    ... """)
+
+    >>> print_(system(buildout), end='')
+    Develop: '/sample-buildout/recipes'
+    Uninstalling debug.
+    Updating data-dir.
+    Installing debug.
+    _profile_base_location_ /sample-buildout
+    profile_base_location /sample-buildout
+    recipe recipes:debug
+
+Keep in mind that in case of sections spaning across multiple profiles,
+the topmost value will be presented:
+
+    >>> write(sample_buildout, 'extended.cfg',
+    ... """
+    ... [debug]
+    ... profile_base_location = ${:_profile_base_location_}
+    ... """)
+
+    >>> write(sample_buildout, 'buildout.cfg',
+    ... """
+    ... [buildout]
+    ... extends = extended.cfg
+    ... develop = recipes
+    ... parts = data-dir debug
+    ... log-level = INFO
+    ...
+    ... [debug]
+    ... recipe = recipes:debug
+    ... profile_base_location = ${:_profile_base_location_}
+    ...
+    ... [data-dir]
+    ... recipe = recipes:mkdir
+    ... path = mydata
+    ... """)
+
+    >>> print_(system(buildout), end='')
+    Develop: '/sample-buildout/recipes'
+    Updating data-dir.
+    Updating debug.
+    _profile_base_location_ /sample-buildout
+    profile_base_location /sample-buildout
+    recipe recipes:debug
+
+But of course, in case if accessing happens in extended profile's section,
+this profile's location will be exposed:
+
+    >>> write(sample_buildout, 'extended.cfg',
+    ... """
+    ... [debug]
+    ... profile_base_location = ${:_profile_base_location_}
+    ... """)
+
+    >>> write(sample_buildout, 'buildout.cfg',
+    ... """
+    ... [buildout]
+    ... extends = extended.cfg
+    ... develop = recipes
+    ... parts = data-dir debug
+    ... log-level = INFO
+    ...
+    ... [debug]
+    ... recipe = recipes:debug
+    ...
+    ... [data-dir]
+    ... recipe = recipes:mkdir
+    ... path = mydata
+    ... """)
+
+    >>> print_(system(buildout), end='')
+    Develop: '/sample-buildout/recipes'
+    Updating data-dir.
+    Updating debug.
+    _profile_base_location_ /sample-buildout
+    profile_base_location /sample-buildout
+    recipe recipes:debug
+
+    >>> remove(sample_buildout, 'extended.cfg')
+
 Automatic part selection and ordering
 -------------------------------------
...
@@ -2700,7 +2811,7 @@ were created.
 The ``.installed.cfg`` is only updated for the recipes that ran::

     >>> cat(sample_buildout, '.installed.cfg')
-    ... # doctest: +NORMALIZE_WHITESPACE
+    ... # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
     [buildout]
     installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
     parts = debug d1 d2 d3 d4
...
@@ -2730,7 +2841,7 @@ The ``.installed.cfg`` is only updated for the recipes that ran::
     <BLANKLINE>
     [d4]
     __buildout_installed__ = /sample-buildout/data2-extra
-    __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
+    __buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg== d2:...
     path = /sample-buildout/data2-extra
     recipe = recipes:mkdir
...
@@ -2804,10 +2915,10 @@ provide alternate locations, and even names for these directories::
     Creating directory '/sample-alt/work'.
     Creating directory '/sample-alt/developbasket'.
     Develop: '/sample-buildout/recipes'
+    Uninstalling d4.
+    Uninstalling d3.
     Uninstalling d2.
     Uninstalling debug.
-    Uninstalling d4.
-    Uninstalling d3.

     >>> ls(alt)
     d  basket
...
@@ -2915,8 +3026,10 @@ database is shown::
     bin-directory = /sample-buildout/bin
     develop-eggs-directory = /sample-buildout/develop-eggs
     directory = /sample-buildout
+    dry-run = false
     eggs-directory = /sample-buildout/eggs
     executable = python
+    extra-paths = ...
     find-links =
     install-from-cache = false
     installed = /sample-buildout/.installed.cfg
...
@@ -3234,7 +3347,6 @@ or paths to use::
     >>> remove('setup.cfg')
     >>> print_(system(buildout + ' -csetup.cfg init demo other ./src'), end='')
     Creating '/sample-bootstrapped/setup.cfg'.
-    Creating directory '/sample-bootstrapped/develop-eggs'.
     Getting distribution for 'zc.recipe.egg>=2.0.6'.
     Got zc.recipe.egg
     Installing py.
...
@@ -3293,7 +3405,6 @@ for us::
     >>> remove('setup.cfg')
     >>> print_(system(buildout + ' -csetup.cfg init demo other ./src'), end='')
     Creating '/sample-bootstrapped/setup.cfg'.
-    Creating directory '/sample-bootstrapped/develop-eggs'.
     Installing py.
     Generated script '/sample-bootstrapped/bin/demo'.
     Generated script '/sample-bootstrapped/bin/distutilsscript'.
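The `${section:option}` substitution exercised by these profiles (with an empty section name meaning "the current section", as in `${:_profile_base_location_}`) can be mimicked with a small resolver. This is a toy sketch, not buildout's actual implementation; `interpolate` and the sample option mapping are hypothetical:

```python
import re


def interpolate(value, section, options):
    # Replace ${:name} with options[section][name] and ${other:name} with
    # options[other][name], a toy version of buildout value substitution.
    def repl(match):
        sect = match.group(1) or section  # empty section name = current one
        return options[sect][match.group(2)]
    return re.sub(r'\$\{([^:}]*):([^}]+)\}', repl, value)


options = {
    'buildout': {'directory': '/sample-buildout'},
    'debug': {'_profile_base_location_': '/sample-buildout'},
}
print(interpolate('${:_profile_base_location_}/etc', 'debug', options))
```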
...
...
src/zc/buildout/tests/dependencylinks.txt
View file @ 3b16f5af
...
@@ -87,6 +87,8 @@ buildout to see where the egg comes from this time.
     ...
     While:
       Updating eggs.
+      Base installation request: 'depdemo'
+      Requirement of depdemo: demoneeded
       Getting distribution for 'demoneeded'.
     Error: Couldn't find a distribution for 'demoneeded'.
...
...
...
src/zc/buildout/tests/download.txt
View file @ 3b16f5af
...
@@ -63,6 +63,32 @@ When trying to access a file that doesn't exist, we'll get an exception:
     ... else: print_('woops')
     download error

+An alternate URL can be used in case of HTTPError with the main one.
+Useful when a version of a resource can only be downloaded with a temporary
+URL as long as it's the last version, and this version is then moved to a
+permanent place when a newer version is released. In such case, when using
+a cache (in particular networkcache), it's important that the main URL (`url`)
+is always used as cache key. And `alternate_url` shall be the temporary URL.
+
+    >>> path, is_temp = download(server_url+'not-there',
+    ...                          alternate_url=server_url+'foo.txt')
+    >>> cat(path)
+    This is a foo text.
+    >>> is_temp
+    True
+    >>> remove(path)
+
+The main URL is tried first:
+
+    >>> write(server_data, 'other.txt', 'This is some other text.')
+    >>> path, is_temp = download(server_url+'other.txt',
+    ...                          alternate_url=server_url+'foo.txt')
+    >>> cat(path)
+    This is some other text.
+    >>> is_temp
+    True
+    >>> remove(path)
+
 Downloading a local file doesn't produce a temporary file but simply returns
 the local file itself:
...
@@ -126,6 +152,37 @@ This is a foo text.

     >>> remove(path)

+HTTP basic authentication:
+
+    >>> download = Download()
+    >>> user_url = server_url.replace('/localhost:', '/%s@localhost:') + 'private/'
+
+    >>> path, is_temp = download(user_url % 'foo:' + 'foo:')
+    >>> is_temp; remove(path)
+    True
+
+    >>> path, is_temp = download(user_url % 'foo:bar' + 'foo:bar')
+    >>> is_temp; remove(path)
+    True
+
+    >>> download(user_url % 'bar:' + 'foo:')
+    Traceback (most recent call last):
+    UserError: Error downloading ...: HTTP Error 403: Forbidden
+
+... with netrc:
+
+    >>> url = server_url + 'private/foo:bar'
+    >>> download(url)
+    Traceback (most recent call last):
+    UserError: Error downloading ...: HTTP Error 403: Forbidden
+
+    >>> import os, zc.buildout.download
+    >>> old_home = os.environ['HOME']
+    >>> home = os.environ['HOME'] = tmpdir('test-netrc')
+    >>> netrc = join(home, '.netrc')
+    >>> write(netrc, 'machine localhost login foo password bar')
+    >>> os.chmod(netrc, 0o600)
+    >>> zc.buildout.download.netrc.__init__()
+    >>> path, is_temp = download(url)
+    >>> is_temp; remove(path)
+    True
+
+    >>> os.environ['HOME'] = old_home
+
 Downloading using the download cache
 ------------------------------------
...
@@ -165,14 +222,6 @@ the file on the server to see this:
     >>> cat(path)
     This is a foo text.

-If we specify an MD5 checksum for a file that is already in the cache, the
-cached copy's checksum will be verified:
-
-    >>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
-    Traceback (most recent call last):
-    ChecksumError: MD5 checksum mismatch for cached download
-    from 'http://localhost/foo.txt' at '/download-cache/foo.txt'
-
 Trying to access another file at a different URL which has the same base name
 will result in the cached copy being used:
...
@@ -184,6 +233,14 @@ will result in the cached copy being used:
     >>> cat(path)
     This is a foo text.

+If we specify an MD5 checksum for a file that is already in the cache, the
+cached copy's checksum will be verified and the cache will be refreshed:
+
+    >>> path, is_temp = download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
+    >>> is_temp
+    True
+    >>> remove(path)
+
 Given a target path for the download, the utility will provide a copy of the
 file at that location both when first downloading the file and when using a
 cached copy:
...
@@ -259,7 +316,7 @@ If the file is completely missing it should notify the user of the error:
     >>> download(server_url+'bar.txt') # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
     Traceback (most recent call last):
     ...
-    UserError: Error downloading extends for URL http://localhost/bar.txt:
+    UserError: Error downloading http://localhost/bar.txt:
     ...404...

     >>> ls(cache)
...
@@ -442,18 +499,22 @@ However, when downloading the file normally with the cache being used in
 fall-back mode, the file will be downloaded from the net and the cached copy
 will be replaced with the new content:

-    >>> cat(download(server_url+'foo.txt')[0])
+    >>> path, is_temp = download(server_url+'foo.txt')
+    >>> cat(path)
     The wrong text.

     >>> cat(cache, 'foo.txt')
     The wrong text.

+    >>> is_temp
+    True
+    >>> remove(path)
+
-When trying to download a resource whose checksum does not match, the cached
-copy will neither be used nor overwritten:
+Fall-back mode is meaningless if md5sum is given. If the checksum of the
+cached copy matches, the resource is not downloaded:

     >>> write(server_data, 'foo.txt', 'This is a foo text.')
-    >>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
-    Traceback (most recent call last):
-    ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt'
+    >>> path, is_temp = download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
+    >>> print_(path)
+    /download-cache/foo.txt
     >>> cat(cache, 'foo.txt')
     The wrong text.
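The alternate-URL behaviour documented above (try the main `url`, fall back to `alternate_url` on error, but always key the cache by the main URL) can be sketched generically. Everything here is hypothetical scaffolding, not the `zc.buildout.download` API:

```python
def fetch_with_fallback(url, alternate_url, fetch, cache):
    # `fetch` is any callable raising an exception on HTTP errors.
    # The cache is always keyed by the main URL, never the alternate one,
    # which is the invariant the documentation above insists on.
    if url in cache:
        return cache[url]
    try:
        data = fetch(url)
    except Exception:
        if alternate_url is None:
            raise
        data = fetch(alternate_url)
    cache[url] = data
    return data


cache = {}

def fake_fetch(u):
    if u.endswith('not-there'):
        raise IOError('404')
    return 'content of ' + u


print(fetch_with_fallback('http://x/not-there', 'http://x/foo.txt', fake_fetch, cache))
print(sorted(cache))  # cached under the main URL only
```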
...
...
src/zc/buildout/tests/downloadcache.txt
View file @ 3b16f5af
...
@@ -33,11 +33,12 @@ download:
     >>> print_(get(link_server), end='')
     <html><body>
-    <a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
-    <a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
-    <a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
-    <a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
-    <a href="demo-0.4rc1-py2.4.egg">demo-0.4rc1-py2.4.egg</a><br>
+    <a href="bigdemo-0.1-pyN.N.egg">bigdemo-0.1-pyN.N.egg</a><br>
+    <a href="builddep-0.1.zip">builddep-0.1.zip</a><br>
+    <a href="demo-0.1-pyN.N.egg">demo-0.1-pyN.N.egg</a><br>
+    <a href="demo-0.2-pyN.N.egg">demo-0.2-pyN.N.egg</a><br>
+    <a href="demo-0.3-pyN.N.egg">demo-0.3-pyN.N.egg</a><br>
+    <a href="demo-0.4rc1-pyN.N.egg">demo-0.4rc1-pyN.N.egg</a><br>
     <a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
     <a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
     <a href="demoneeded-1.2rc1.zip">demoneeded-1.2rc1.zip</a><br>
...
@@ -45,7 +46,9 @@ download:
     <a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
     <a href="index/">index/</a><br>
     <a href="mixedcase-0.5.zip">mixedcase-0.5.zip</a><br>
-    <a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
+    <a href="other-1.0-pyN.N.egg">other-1.0-pyN.N.egg</a><br>
+    <a href="withbuildsystemrequires-0.1.zip">withbuildsystemrequires-0.1.zip</a><br>
+    <a href="withsetuprequires-0.1.zip">withsetuprequires-0.1.zip</a><br>
     </body></html>
...
...
...
src/zc/buildout/tests/easy_install.txt
View file @
3b16f5af
...
@@ -97,11 +97,12 @@ We have a link server that has a number of eggs:
 >>> print_(get(link_server), end='')
 <html><body>
-<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
-<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
-<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
-<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
-<a href="demo-0.4rc1-py2.4.egg">demo-0.4rc1-py2.4.egg</a><br>
+<a href="bigdemo-0.1-pyN.N.egg">bigdemo-0.1-pyN.N.egg</a><br>
+<a href="builddep-0.1.zip">builddep-0.1.zip</a><br>
+<a href="demo-0.1-pyN.N.egg">demo-0.1-pyN.N.egg</a><br>
+<a href="demo-0.2-pyN.N.egg">demo-0.2-pyN.N.egg</a><br>
+<a href="demo-0.3-pyN.N.egg">demo-0.3-pyN.N.egg</a><br>
+<a href="demo-0.4rc1-pyN.N.egg">demo-0.4rc1-pyN.N.egg</a><br>
 <a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
 <a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
 <a href="demoneeded-1.2rc1.zip">demoneeded-1.2rc1.zip</a><br>
...
@@ -109,7 +110,9 @@ We have a link server that has a number of eggs:
 <a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
 <a href="index/">index/</a><br>
 <a href="mixedcase-0.5.zip">mixedcase-0.5.zip</a><br>
-<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
+<a href="other-1.0-pyN.N.egg">other-1.0-pyN.N.egg</a><br>
+<a href="withbuildsystemrequires-0.1.zip">withbuildsystemrequires-0.1.zip</a><br>
+<a href="withsetuprequires-0.1.zip">withsetuprequires-0.1.zip</a><br>
 </body></html>
Let's make a directory and install the demo egg to it, using the demo:
...
@@ -765,9 +768,9 @@ An interpreter can also be generated without other eggs:
 <BLANKLINE>
 import sys
 <BLANKLINE>
-sys.path[0:0] = [
-  ]
+<BLANKLINE>
+<BLANKLINE>
 <BLANKLINE>
 _interactive = True
...
An additional argument can be passed to define which scripts to install
...
@@ -1233,11 +1236,12 @@ Let's update our link server with a new version of extdemo:
 >>> update_extdemo()
 >>> print_(get(link_server), end='')
 <html><body>
-<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
-<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
-<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
-<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
-<a href="demo-0.4rc1-py2.4.egg">demo-0.4rc1-py2.4.egg</a><br>
+<a href="bigdemo-0.1-pyN.N.egg">bigdemo-0.1-pyN.N.egg</a><br>
+<a href="builddep-0.1.zip">builddep-0.1.zip</a><br>
+<a href="demo-0.1-pyN.N.egg">demo-0.1-pyN.N.egg</a><br>
+<a href="demo-0.2-pyN.N.egg">demo-0.2-pyN.N.egg</a><br>
+<a href="demo-0.3-pyN.N.egg">demo-0.3-pyN.N.egg</a><br>
+<a href="demo-0.4rc1-pyN.N.egg">demo-0.4rc1-pyN.N.egg</a><br>
 <a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
 <a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
 <a href="demoneeded-1.2rc1.zip">demoneeded-1.2rc1.zip</a><br>
...
@@ -1246,7 +1250,9 @@ Let's update our link server with a new version of extdemo:
 <a href="extdemo-1.5.zip">extdemo-1.5.zip</a><br>
 <a href="index/">index/</a><br>
 <a href="mixedcase-0.5.zip">mixedcase-0.5.zip</a><br>
-<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
+<a href="other-1.0-pyN.N.egg">other-1.0-pyN.N.egg</a><br>
+<a href="withbuildsystemrequires-0.1.zip">withbuildsystemrequires-0.1.zip</a><br>
+<a href="withsetuprequires-0.1.zip">withsetuprequires-0.1.zip</a><br>
 </body></html>
The easy_install caches information about servers to reduce network
...
@@ -1445,9 +1451,8 @@ Now when we install the distributions:
 ... ['demo==0.2'], dest,
 ... links=[link_server], index=link_server+'index/')
 GET 200 /
-GET 404 /index/demo/
+GET 200 /index/
 GET 404 /index/demoneeded/
-GET 200 /index/
 >>> zc.buildout.easy_install.build(
 ... 'extdemo', dest,
...
@@ -1469,6 +1474,7 @@ from the link server:
 >>> ws = zc.buildout.easy_install.install(
 ... ['demo'], dest,
 ... links=[link_server], index=link_server+'index/')
+GET 404 /index/demo/
 GET 200 /demo-0.3-py2.4.egg
Normally, the download cache is the preferred source of downloads, but
...
src/zc/buildout/tests/extends-cache.txt → src/zc/buildout/tests/extends-cache.txt.disabled
...
@@ -492,9 +492,9 @@ a better solution would re-use the logging already done by the utility.)
 >>> import zc.buildout
 >>> old_download = zc.buildout.download.Download.download
->>> def wrapper_download(self, url, md5sum=None, path=None):
+>>> def wrapper_download(self, url, *args, **kw):
 ... print_("The URL %s was downloaded." % url)
-... return old_download(url, md5sum, path)
+... return old_download(url, *args, **kw)
 >>> zc.buildout.download.Download.download = wrapper_download
 >>> zc.buildout.buildout.main([])
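The hunk above replaces the wrapper's hard-coded copy of the `download` signature with `*args, **kw`, so the monkey-patch keeps working even if `Download.download` gains or reorders parameters. A minimal standalone sketch of the same pattern (the `Download` class here is a toy stand-in, not buildout's):

```python
# Sketch: a forwarding wrapper that survives signature changes in the
# wrapped method, mirroring the *args/**kw fix in the diff above.
class Download:
    def download(self, url, md5sum=None, path=None):
        return (url, md5sum, path)

old_download = Download.download

def wrapper_download(self, url, *args, **kw):
    # Log, then delegate with whatever arguments the caller used.
    print("The URL %s was downloaded." % url)
    return old_download(self, url, *args, **kw)

Download.download = wrapper_download

result = Download().download('http://example.com/x.tgz', path='/tmp/x')
```

Because the wrapper never names `md5sum` or `path`, it forwards keyword arguments it has never heard of, which is exactly what a long-lived monkey-patch wants.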
...
src/zc/buildout/tests/repeatable.txt
...
@@ -207,6 +207,7 @@ versions:
 Getting section foo.
 Initializing section foo.
 Installing recipe spam.
+Base installation request: 'spam'
 Getting distribution for 'spam'.
 Error: Picked: spam = 2
...
src/zc/buildout/tests/test_all.py
...
@@ -143,7 +143,8 @@ class TestEasyInstall(unittest.TestCase):
...
@@ -143,7 +143,8 @@ class TestEasyInstall(unittest.TestCase):
result
=
zc
.
buildout
.
easy_install
.
_move_to_eggs_dir_and_compile
(
result
=
zc
.
buildout
.
easy_install
.
_move_to_eggs_dir_and_compile
(
dist
,
dist
,
dest
dest
,
None
,
# ok because we don't fallback to pip
)
)
self
.
assertIsNotNone
(
result
)
self
.
assertIsNotNone
(
result
)
...
@@ -433,6 +434,9 @@ Now, let's create a buildout that requires y and z:
 Requirement of sampley: demoneeded==1.0
 While:
 Installing eggs.
+Base installation request: 'sampley', 'samplez'
+Requirement of samplez: demoneeded==1.1
+Requirement of sampley: demoneeded==1.0
 Error: There is a version conflict.
 We already have: demoneeded 1.1
 but sampley 1 requires 'demoneeded==1.0'.
...
@@ -483,6 +487,12 @@ If we use the verbose switch, we can see where requirements are coming from:
 Requirement of sampley: demoneeded==1.0
 While:
 Installing eggs.
+Base installation request: 'samplea', 'samplez'
+Requirement of samplez: demoneeded==1.1
+Requirement of samplea: sampleb
+Requirement of sampleb: samplea
+Requirement of sampleb: sampley
+Requirement of sampley: demoneeded==1.0
 Error: There is a version conflict.
 We already have: demoneeded 1.1
 but sampley 1 requires 'demoneeded==1.0'.
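The added "Base installation request" and "Requirement of X: Y" lines trace how each requirement entered the working set, so a version conflict can be explained back to the part that caused it. A hedged sketch of how such provenance might be recorded and replayed (a toy data structure, not buildout's actual resolver):

```python
# Toy provenance tracking: remember which distribution first requested
# each name, then walk the parents to explain why a name was picked.
def explain(requested_by, name):
    chain = []
    while name is not None:
        parent = requested_by.get(name)
        if parent is not None:
            chain.append("Requirement of %s: %s" % (parent, name))
        name = parent
    return list(reversed(chain))

# 'samplez' came from the base request and pulled in 'demoneeded'.
requested_by = {'demoneeded': 'samplez', 'samplez': None}
lines = explain(requested_by, 'demoneeded')
```

Printing such a chain alongside the conflict message is what turns "We already have: demoneeded 1.1" into an actionable diagnosis.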
...
@@ -551,6 +561,11 @@ that we can't find. when run in verbose mode
 ...
 While:
 Installing eggs.
+Base installation request: 'samplea'
+Requirement of samplea: sampleb
+Requirement of sampleb: samplea
+Requirement of sampleb: sampley
+Requirement of sampley: demoneeded
 Getting distribution for 'demoneeded'.
 Error: Couldn't find a distribution for 'demoneeded'.
 """
...
@@ -1181,6 +1196,8 @@ Uninstall recipes need to be called when a part is removed too:
 uninstalling
 Installing demo.
 installing
+Section `demo` contains unused option(s): 'x'.
+This may be an indication for either a typo in the option's name or a bug in the used recipe.
 >>> write('buildout.cfg', '''
...
@@ -1679,7 +1696,7 @@ some evil recipes that exit uncleanly:
 >>> mkdir('recipes')
 >>> write('recipes', 'recipes.py',
 ... '''
-... import os
+... import sys
 ...
 ... class Clean:
 ... def __init__(*_): pass
...
@@ -1687,10 +1704,10 @@ some evil recipes that exit uncleanly:
 ... def update(_): pass
 ...
 ... class EvilInstall(Clean):
-... def install(_): os._exit(1)
+... def install(_): sys.exit(1)
 ...
 ... class EvilUpdate(Clean):
-... def update(_): os._exit(1)
+... def update(_): sys.exit(1)
 ... ''')
 >>> write('recipes', 'setup.py',
...
@@ -1784,10 +1801,10 @@ Now let's look at 3 cases:
 >>> print_(system(buildout+' buildout:parts='), end='')
 Develop: '/sample-buildout/recipes'
-Uninstalling p2.
-Uninstalling p1.
 Uninstalling p4.
 Uninstalling p3.
+Uninstalling p2.
+Uninstalling p1.
 3. We exit while installing or updating after uninstalling:
...
@@ -2214,6 +2231,28 @@ def dealing_with_extremely_insane_dependencies():
 ...
 While:
 Installing pack1.
+Base installation request: 'pack0'
+Requirement of pack0: pack4
+Requirement of pack0: pack3
+Requirement of pack0: pack2
+Requirement of pack0: pack1
+Requirement of pack4: pack5
+Requirement of pack4: pack3
+Requirement of pack4: pack2
+Requirement of pack4: pack1
+Requirement of pack4: pack0
+Requirement of pack3: pack4
+Requirement of pack3: pack2
+Requirement of pack3: pack1
+Requirement of pack3: pack0
+Requirement of pack2: pack4
+Requirement of pack2: pack3
+Requirement of pack2: pack1
+Requirement of pack2: pack0
+Requirement of pack1: pack4
+Requirement of pack1: pack3
+Requirement of pack1: pack2
+Requirement of pack1: pack0
 Getting distribution for 'pack5'.
 Error: Couldn't find a distribution for 'pack5'.
...
@@ -2255,10 +2294,209 @@ def dealing_with_extremely_insane_dependencies():
 ...
 While:
 Installing pack1.
+Base installation request: 'pack0'
+Requirement of pack0: pack4
+Requirement of pack0: pack3
+Requirement of pack0: pack2
+Requirement of pack0: pack1
+Requirement of pack4: pack5
+Requirement of pack4: pack3
+Requirement of pack4: pack2
+Requirement of pack4: pack1
+Requirement of pack4: pack0
+Requirement of pack3: pack4
+Requirement of pack3: pack2
+Requirement of pack3: pack1
+Requirement of pack3: pack0
+Requirement of pack2: pack4
+Requirement of pack2: pack3
+Requirement of pack2: pack1
+Requirement of pack2: pack0
+Requirement of pack1: pack4
+Requirement of pack1: pack3
+Requirement of pack1: pack2
+Requirement of pack1: pack0
 Getting distribution for 'pack5'.
 Error: Couldn't find a distribution for 'pack5'.
 """
def test_part_pulled_by_recipe():
"""
>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'test.py',
... '''
... class Recipe:
...
... def __init__(self, buildout, name, options):
... self.x = buildout[options['x']][name]
...
... def install(self):
... print(self.x)
... return ()
...
... update = install
... ''')
>>> write(sample_buildout, 'recipes', 'setup.py',
... '''
... from setuptools import setup
... setup(
... name = "recipes",
... entry_points = {'zc.buildout': ['test = test:Recipe']},
... )
... ''')
>>> write(sample_buildout, 'buildout.cfg',
... '''
... [buildout]
... develop = recipes
... parts = a
... [a]
... recipe = recipes:test
... x = b
... [b]
... <= a
... a = A
... b = B
... c = ${c:x}
... [c]
... x = c
... ''')
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Installing b.
B
Section `b` contains unused option(s): 'c'.
This may be an indication for either a typo in the option's name or a bug in the used recipe.
Installing a.
A
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating b.
B
Updating a.
A
>>> cat('.installed.cfg') # doctest: +ELLIPSIS
[buildout]
...
[b]
__buildout_installed__ =
__buildout_signature__ = recipes-... c:...
...
[a]
__buildout_installed__ =
__buildout_signature__ = recipes-... b:...
...
"""
def test_recipe_options_are_escaped():
"""
>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'test.py',
... '''
... class Recipe:
...
... def __init__(self, buildout, name, options):
... options['option'] = '${buildout_syntax_should_be_escaped}'
... print ("Option value: %s" % options['option'])
...
... def install(self):
... return ()
...
... update = install
... ''')
>>> write(sample_buildout, 'recipes', 'setup.py',
... '''
... from setuptools import setup
... setup(
... name = "recipes",
... entry_points = {'zc.buildout': ['test = test:Recipe']},
... )
... ''')
>>> write(sample_buildout, 'buildout.cfg',
... '''
... [buildout]
... develop = recipes
... parts = a
... [a]
... recipe = recipes:test
... ''')
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Option value: ${buildout_syntax_should_be_escaped}
Installing a.
>>> cat('.installed.cfg') # doctest: +ELLIPSIS
[buildout]
...
[a]
__buildout_installed__ =
__buildout_signature__ = recipes-...
option = $${buildout_syntax_should_be_escaped}
recipe = recipes:test
"""
def test_recipe_invalid_options_are_rejected():
r"""
>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'test.py',
... '''
... class Recipe:
...
... def __init__(self, buildout, name, options):
... options['[section]\\noption'] = 'invalid'
...
... def install(self):
... return ()
...
... update = install
... ''')
>>> write(sample_buildout, 'recipes', 'setup.py',
... '''
... from setuptools import setup
... setup(
... name = "recipes",
... entry_points = {'zc.buildout': ['test = test:Recipe']},
... )
... ''')
>>> write(sample_buildout, 'buildout.cfg',
... '''
... [buildout]
... develop = recipes
... parts = a
... [a]
... recipe = recipes:test
... ''')
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
While:
Installing.
Getting section a.
Initializing section a.
Error: Invalid option name '[section]\noption'
"""
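The test above rejects option names like `'[section]\noption'` because such a name, written verbatim into `.installed.cfg`, would smuggle a whole extra section into the INI file. The hazard can be shown with the stdlib parser (this illustrates the injection, not buildout's own writer):

```python
import configparser

# An option name containing a newline and brackets, rendered naively
# into INI text, turns one section into two.
option_name = '[evil]\nkey'
rendered = "[a]\n%s = 1\n" % option_name

parser = configparser.ConfigParser()
parser.read_string(rendered)
sections = parser.sections()  # now contains an injected section
```

Validating option names at assignment time, as the test exercises, closes this hole before anything is serialized.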
def read_find_links_to_load_extensions():
r"""
We'll create a wacky buildout extension that just announces itself when used:
...
@@ -2572,7 +2810,7 @@ def wont_downgrade_due_to_prefer_final():
 If we install a non-final buildout version, we don't want to
 downgrade just because we prefer-final. If a buildout version
 isn't specified using a versions entry, then buildout's version
-requirement gets set to >=CURRENT_VERSION.
+requirement gets set to >=PUBLIC_PART_OF_CURRENT_VERSION.
 >>> write('buildout.cfg',
 ... '''
...
@@ -2585,7 +2823,7 @@ def wont_downgrade_due_to_prefer_final():
 ... if l.startswith('zc.buildout = >=')]
 >>> v == pkg_resources.working_set.find(
 ... pkg_resources.Requirement.parse('zc.buildout')
-... ).version
+... ).parsed_version.public
 True
 >>> write('buildout.cfg',
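Switching from `.version` to `.parsed_version.public` strips any PEP 440 local segment (such as this branch's `+slapos001`) before the `>=` pin is written, so the self-requirement does not embed a site-specific build tag. The effect, sketched with plain strings rather than `pkg_resources`:

```python
# The public part of a PEP 440 version is everything before the local
# '+' segment, e.g. '3.0.1+slapos001' -> '3.0.1'.
def public_part(version):
    return version.split('+', 1)[0]

pin = '>=' + public_part('3.0.1+slapos001')
```

A pin like `>=3.0.1` matches both the upstream release and any locally patched `3.0.1+...` build, which is the point of the change.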
...
@@ -2677,7 +2915,8 @@ honoring our version specification.
 ... eggs = foo
 ... ''' % ('\n'.join(
 ... '%s = %s' % (d.key, d.version)
-... for d in zc.buildout.easy_install.buildout_and_setuptools_dists)))
+... for d in pkg_resources.working_set.resolve(
+... pkg_resources.parse_requirements('zc.buildout')))))
 >>> print_(system(buildout), end='')
 Installing foo.
...
@@ -2923,6 +3162,73 @@ def increment_on_command_line():
 ...
 recipe='zc.buildout:debug'
 """
def bug_664539_simple_buildout():
r"""
>>> write('buildout.cfg', '''
... [buildout]
... parts = escape
...
... [escape]
... recipe = zc.buildout:debug
... foo = $${nonexistent:option}
... ''')
>>> print_(system(buildout), end='')
Installing escape.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
"""
def bug_664539_reference():
r"""
>>> write('buildout.cfg', '''
... [buildout]
... parts = escape
...
... [escape]
... recipe = zc.buildout:debug
... foo = ${:bar}
... bar = $${nonexistent:option}
... ''')
>>> print_(system(buildout), end='')
Installing escape.
bar='${nonexistent:option}'
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
"""
def bug_664539_complex_buildout():
r"""
>>> write('buildout.cfg', '''
... [buildout]
... parts = escape
...
... [escape]
... recipe = zc.buildout:debug
... foo = ${level1:foo}
...
... [level1]
... recipe = zc.buildout:debug
... foo = ${level2:foo}
...
... [level2]
... recipe = zc.buildout:debug
... foo = $${nonexistent:option}
... ''')
>>> print_(system(buildout), end='')
Installing level2.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
Installing level1.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
Installing escape.
foo='${nonexistent:option}'
recipe='zc.buildout:debug'
"""
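The three bug_664539 tests above exercise the `$${...}` escape: a literal `$${section:option}` must survive substitution as `${section:option}`, even when the value is reached through chained references. A minimal sketch of that escape convention (a toy substitution, not buildout's parser):

```python
import re

# Toy substitution honoring the $${...} escape: real references are
# looked up in `options`, while '$${' collapses to a literal '${'.
def substitute(value, options):
    def repl(match):
        if match.group(0).startswith('$$'):
            return match.group(0)[1:]      # '$${x}' -> literal '${x}'
        return options[match.group(1)]     # '${x}'  -> lookup of 'x'
    return re.sub(r'\$?\$\{([^}]*)\}', repl, value)

out = substitute('a=${x} b=$${nonexistent:option}', {'x': '1'})
```

Note the escaped form is never looked up at all, which is why `nonexistent:option` above does not raise.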
def test_constrained_requirement():
"""
zc.buildout.easy_install._constrained_requirement(constraint, requirement)
...
@@ -3054,6 +3360,7 @@ def want_new_zcrecipeegg():
 Getting section egg.
 Initializing section egg.
 Installing recipe zc.recipe.egg <2dev.
+Base installation request: 'zc.recipe.egg <2dev'
 Getting distribution for 'zc.recipe.egg<2dev,>=2.0.6'.
 Error: Couldn't find a distribution for 'zc.recipe.egg<2dev,>=2.0.6'.
 """
...
@@ -3262,6 +3569,209 @@ def test_buildout_doesnt_keep_adding_itself_to_versions():
 True
 """
def test_missing_setup_requires_fails():
r"""
When not allow_picked_versions, ensure setup_requires dependencies
are not installed implicitly without respecting pinned versions.
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['withsetuprequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withsetuprequires='0.1')) # doctest: +ELLIPSIS
Traceback (most recent call last):
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_available_setup_requires_succeeds():
r"""
When not allow_picked_versions, ensure setup_requires dependencies
can be installed first and passed explicitly.
>>> import subprocess
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['builddep'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(builddep='0.1'))
>>> import os
>>> builddep_egg = [
... f for f in os.listdir('dest')
... if f.endswith('.egg')
... and f.startswith('builddep')
... ][0]
>>> builddep_path = os.path.join(os.getcwd(), 'dest', builddep_egg)
>>> os.environ['PYTHONEXTRAPATH'] = builddep_path
>>> _ = zc.buildout.easy_install.install(
... ['withsetuprequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withsetuprequires='0.1'))
>>> del os.environ['PYTHONEXTRAPATH']
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_missing_build_system_requires_fails():
r"""
When not allow_picked_versions, ensure build-system.requires dependencies
are not installed implicitly without respecting pinned versions.
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['withbuildsystemrequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withbuildsystemrequires='0.1'))
... # doctest: +ELLIPSIS
Traceback (most recent call last):
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
def test_available_build_system_requires_succeeds():
r"""
When not allow_picked_versions, ensure build-system.requires
dependencies can be installed first and passed explicitly.
>>> import subprocess
>>> zc.buildout.easy_install.allow_picked_versions(False)
True
>>> mkdir('dest')
>>> ws = zc.buildout.easy_install.install(
... ['builddep'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(builddep='0.1'))
>>> import os
>>> builddep_egg = [
... f for f in os.listdir('dest')
... if f.endswith('.egg')
... and f.startswith('builddep')
... ][0]
>>> builddep_path = os.path.join(os.getcwd(), 'dest', builddep_egg)
>>> os.environ['PYTHONEXTRAPATH'] = builddep_path
>>> _ = zc.buildout.easy_install.install(
... ['withbuildsystemrequires'], 'dest',
... links=[link_server], index=link_server+'index/',
... versions = dict(withbuildsystemrequires='0.1'))
>>> del os.environ['PYTHONEXTRAPATH']
>>> zc.buildout.easy_install.allow_picked_versions(True)
False
"""
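The tests above pre-install the build dependency, then expose its egg to the child build process through the `PYTHONEXTRAPATH` environment variable seen in the doctests. A hedged sketch of how an interpreter-side hook might consume such a variable (the variable name comes from these tests; it is not a standard Python one, and this is an illustration, not buildout's implementation):

```python
import os

# Sketch: prepend entries from an extra-path environment variable onto
# a sys.path-like list, skipping empties and duplicates.
def extend_sys_path_from_env(environ, path_list):
    extra = environ.get('PYTHONEXTRAPATH')
    if extra:
        for entry in reversed(extra.split(os.pathsep)):
            if entry and entry not in path_list:
                path_list.insert(0, entry)
    return path_list

paths = extend_sys_path_from_env(
    {'PYTHONEXTRAPATH': '/opt/builddep-0.1.egg'}, ['/usr/lib'])
```

Putting the extra entries first is what lets a pinned, pre-built `builddep` egg satisfy `setup_requires` before pip would try to pick a version on its own.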
def test_pin_setup_requires_without_setup_eggs():
r"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withsetuprequires
... [withsetuprequires]
... recipe = zc.recipe.egg
... egg = withsetuprequires
... [versions]
... withsetuprequires = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout'))) # doctest: +ELLIPSIS
Installing withsetuprequires.
Getting distribution for 'withsetuprequires==0.1'.
error: subprocess-exited-with-error
<BLANKLINE>
× python setup.py egg_info did not run successfully.
│ exit code: 1
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
<BLANKLINE>
"""
def test_pin_setup_requires_with_setup_eggs():
"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withsetuprequires
... [withsetuprequires]
... recipe = zc.recipe.egg:custom
... egg = withsetuprequires
... setup-eggs = builddep
... [versions]
... withsetuprequires = 0.1
... builddep = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout')))
Installing withsetuprequires.
Getting distribution for 'builddep==0.1'.
Got builddep 0.1.
<BLANKLINE>
"""
def test_pin_build_system_requires_without_setup_eggs():
r"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withbuildsystemrequires
... [withbuildsystemrequires]
... recipe = zc.recipe.egg
... egg = withbuildsystemrequires
... [versions]
... withbuildsystemrequires = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout'))) # doctest: +ELLIPSIS
Installing withbuildsystemrequires.
Getting distribution for 'withbuildsystemrequires==0.1'.
error: subprocess-exited-with-error
<BLANKLINE>
× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
...
subprocess.CalledProcessError: ...pip...wheel... non-zero exit status 1.
<BLANKLINE>
"""
def test_pin_build_system_requires_with_setup_eggs():
"""
>>> write('buildout.cfg',
... '''
... [buildout]
... find-links = %(link_server)s
... index = %(link_server)s+'index/'
... allow-picked-versions = false
... parts = withbuildsystemrequires
... [withbuildsystemrequires]
... recipe = zc.recipe.egg:custom
... egg = withbuildsystemrequires
... setup-eggs = builddep
... [versions]
... withbuildsystemrequires = 0.1
... builddep = 0.1
... ''' % globals())
>>> print(system(join('bin', 'buildout')))
Installing withbuildsystemrequires.
Getting distribution for 'builddep==0.1'.
Got builddep 0.1.
<BLANKLINE>
"""
if sys.platform == 'win32':
    del buildout_honors_umask  # umask on dohs is academic
...
@@ -3291,23 +3801,25 @@ def buildout_txt_setup(test):
 os.path.join(eggs, 'zc.recipe.egg'),
 )
-egg_parse = re.compile(r'([0-9a-zA-Z_.]+)-([0-9a-zA-Z_.]+)-py(\d[.]\d+)$').match
+egg_parse = re.compile(r'([0-9a-zA-Z_.]+)-([0-9a-zA-Z_.+]+)-py(\d[.]\d+)$').match
 def makeNewRelease(project, ws, dest, version='99.99'):
     dist = ws.find(pkg_resources.Requirement.parse(project))
     eggname, oldver, pyver = egg_parse(dist.egg_name()).groups()
     dest = os.path.join(dest, "%s-%s-py%s.egg" % (eggname, version, pyver))
     if os.path.isfile(dist.location):
-        shutil.copy(dist.location, dest)
-        zip = zipfile.ZipFile(dest, 'a')
-        zip.writestr(
-            'EGG-INFO/PKG-INFO',
-            ((zip.read('EGG-INFO/PKG-INFO').decode('ISO-8859-1')
-              ).replace("Version: %s" % oldver,
-                        "Version: %s" % version)
-             ).encode('ISO-8859-1')
-            )
-        zip.close()
+        with zipfile.ZipFile(dist.location, 'r') as old_zip:
+            with zipfile.ZipFile(dest, 'w') as new_zip:
+                for item in old_zip.infolist():
+                    data = old_zip.read(item.filename)
+                    if item.filename == 'EGG-INFO/PKG-INFO':
+                        data = (data.decode('ISO-8859-1')
+                                .replace("Version: %s" % oldver,
+                                         "Version: %s" % version)
+                                .encode('ISO-8859-1'))
+                    new_zip.writestr(item, data)
     elif dist.location.endswith('site-packages'):
         os.mkdir(dest)
         shutil.copytree(
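The new `makeNewRelease` copies every member into a fresh archive, rewriting `PKG-INFO` on the way, instead of appending to the copied zip with mode `'a'` (appending a member that already exists leaves two entries with the same name, and which one a reader sees is implementation-dependent). A self-contained, in-memory sketch of that copy-and-patch pattern:

```python
import io
import zipfile

# Build a small egg-like zip in memory.
src = io.BytesIO()
with zipfile.ZipFile(src, 'w') as zf:
    zf.writestr('EGG-INFO/PKG-INFO', 'Version: 1.0\n')
    zf.writestr('code.py', 'x = 1\n')

# Copy-and-patch: write a fresh archive, rewriting one member and
# copying the rest verbatim, as the rewritten test helper does.
dst = io.BytesIO()
with zipfile.ZipFile(src, 'r') as old_zip:
    with zipfile.ZipFile(dst, 'w') as new_zip:
        for item in old_zip.infolist():
            data = old_zip.read(item.filename)
            if item.filename == 'EGG-INFO/PKG-INFO':
                data = data.replace(b'Version: 1.0', b'Version: 99.99')
            new_zip.writestr(item, data)

with zipfile.ZipFile(dst) as zf:
    names = zf.namelist()
    pkg_info = zf.read('EGG-INFO/PKG-INFO')
```

Passing the original `ZipInfo` to `writestr` preserves each member's metadata, so only the patched bytes differ.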
...
@@ -3603,7 +4115,7 @@ def test_suite():
     ),
     doctest.DocFileSuite(
-        'download.txt',
+        'extends-cache.txt',
+        'download.txt',
         setUp=easy_install_SetUp,
         tearDown=zc.buildout.testing.buildoutTearDown,
         optionflags=doctest.NORMALIZE_WHITESPACE | doctest.ELLIPSIS,
...
zc.recipe.egg_/setup.py
...
@@ -14,7 +14,7 @@
 """Setup for zc.recipe.egg package
 """
-version = '2.0.8.dev0'
+version = '2.0.8.dev0+slapos001'
 import os
 from setuptools import setup, find_packages
...
...
zc.recipe.egg_/src/zc/recipe/egg/README.rst
...
@@ -9,6 +9,19 @@ eggs
 requirement strings. Each string must be given on a separate
 line.
+patch-binary
+    The path to the patch executable.
+EGGNAME-patches
+    A new-line separated list of patches to apply when building.
+EGGNAME-patch-options
+    Options to give to the patch program when applying patches.
+EGGNAME-patch-revision
+    An integer to specify the revision (default is the number of
+    patches).
 find-links
 A list of URLs, files, or directories to search for distributions.
...
zc.recipe.egg_/src/zc/recipe/egg/api.rst
...
@@ -97,14 +97,14 @@ of extra requirements to be included in the working set.
 We can see that the options were augmented with additional data
 computed by the egg recipe by looking at .installed.cfg:
->>> cat(sample_buildout, '.installed.cfg')
+>>> cat(sample_buildout, '.installed.cfg') # doctest: +ELLIPSIS
 [buildout]
 installed_develop_eggs = /sample-buildout/develop-eggs/sample.egg-link
 parts = sample-part
 <BLANKLINE>
 [sample-part]
 __buildout_installed__ =
-__buildout_signature__ = ...
+__buildout_signature__ = sample-... setuptools-... zc.buildout-... zc.recipe.egg-...
 _b = /sample-buildout/bin
 _d = /sample-buildout/develop-eggs
 _e = /sample-buildout/eggs
...
zc.recipe.egg_/src/zc/recipe/egg/custom.py
...
@@ -15,6 +15,7 @@
 """
 import logging
 import os
+import re
 import sys
 import zc.buildout.easy_install
...
@@ -28,17 +29,19 @@ class Base:
         self.name, self.options = name, options
         options['_d'] = buildout['buildout']['develop-eggs-directory']
-        self.build_ext = build_ext(buildout, options)
+        options['_e'] = buildout['buildout']['eggs-directory']
+
+        environment_section = options.get('environment')
+        if environment_section:
+            self.environment = buildout[environment_section]
+        else:
+            self.environment = {}
+        environment_data = list(self.environment.items())
+        environment_data.sort()
+        options['_environment-data'] = repr(environment_data)
+
+        self.build_ext = build_ext(buildout, options)
 
     def update(self):
         return self.install()
 
 
 class Custom(Base):
 
     def __init__(self, buildout, name, options):
-        self.build_ext = build_ext(buildout, options)
         Base.__init__(self, buildout, name, options)
         links = options.get('find-links',
                             buildout['buildout'].get('find-links'))
...
@@ -54,45 +57,20 @@ class Custom(Base):
             options['index'] = index
         self.index = index

-        environment_section = options.get('environment')
-        if environment_section:
-            self.environment = buildout[environment_section]
-        else:
-            self.environment = {}
-        environment_data = list(self.environment.items())
-        environment_data.sort()
-        options['_environment-data'] = repr(environment_data)
-
-        options['_e'] = buildout['buildout']['eggs-directory']
-
-        if buildout['buildout'].get('offline') == 'true':
-            self.install = lambda: ()
-
         self.newest = buildout['buildout'].get('newest') == 'true'

     def install(self):
-        options = self.options
-        distribution = options.get('egg')
-        if distribution is None:
-            distribution = options.get('eggs')
-            if distribution is None:
-                distribution = self.name
-            else:
-                logger.warn("The eggs option is deprecated. Use egg instead")
-        distribution = options.get('egg', options.get('eggs', self.name)
-                                   ).strip()
         self._set_environment()
         try:
-            return zc.buildout.easy_install.build(
-                distribution, options['_d'], self.build_ext,
-                self.links, self.index, sys.executable,
-                [options['_e']], newest=self.newest,
-                )
+            self._install_setup_eggs()
+            return self._install()
         finally:
             self._restore_environment()

+    def update(self):
+        return self.install()
+
     def _set_environment(self):
         self._saved_environment = {}
...
@@ -114,6 +92,78 @@ class Custom(Base):
         except KeyError:
             pass

+    def _install_setup_eggs(self):
+        options = self.options
+        setup_eggs = [r.strip()
+                      for r in options.get('setup-eggs', '').split('\n')
+                      if r.strip()]
+        if setup_eggs:
+            ws = zc.buildout.easy_install.install(
+                setup_eggs, options['_e'],
+                links=self.links,
+                index=self.index,
+                executable=sys.executable,
+                path=[options['_d'], options['_e']],
+                newest=self.newest,
+                )
+            extra_path = os.pathsep.join(ws.entries)
+            self.environment['PYTHONEXTRAPATH'] = \
+                os.environ['PYTHONEXTRAPATH'] = extra_path
+
+    def _get_patch_dict(self, options, distribution):
+        patch_dict = {}
+        global_patch_binary = options.get('patch-binary', 'patch')
+        def get_option(egg, key, default):
+            return options.get('%s-%s' % (egg, key),
+                               options.get(key, default))
+        egg = re.sub('[<>=].*', '', distribution)
+        patches = filter(lambda x: x,
+                         map(lambda x: x.strip(),
+                             get_option(egg, 'patches', '').splitlines()))
+        patches = list(patches)
+        if not patches:
+            return patch_dict
+        patch_options = get_option(egg, 'patch-options', '-p0').split()
+        patch_binary = get_option(egg, 'patch-binary', global_patch_binary)
+        patch_revision = int(get_option(egg, 'patch-revision', len(patches)))
+        patch_dict[egg] = {
+            'patches': patches,
+            'patch_options': patch_options,
+            'patch_binary': patch_binary,
+            'patch_revision': patch_revision,
+            }
+        return patch_dict
+
+
+class Custom(Base):
+
+    def __init__(self, buildout, name, options):
+        Base.__init__(self, buildout, name, options)
+        if buildout['buildout'].get('offline') == 'true':
+            self._install = lambda: ()
+
+    def _install(self):
+        options = self.options
+        distribution = options.get('egg')
+        if distribution is None:
+            distribution = options.get('eggs')
+            if distribution is None:
+                distribution = self.name
+            else:
+                logger.warn("The eggs option is deprecated. Use egg instead")
+        distribution = options.get('egg', options.get('eggs', self.name)
+                                   ).strip()
+        patch_dict = self._get_patch_dict(options, distribution)
+        return zc.buildout.easy_install.build(
+            distribution, options['_d'], self.build_ext,
+            self.links, self.index, sys.executable,
+            [options['_e']], newest=self.newest,
+            patch_dict=patch_dict,
+            )
+
+
 class Develop(Base):
...
@@ -122,7 +172,7 @@ class Develop(Base):
         options['setup'] = os.path.join(buildout['buildout']['directory'],
                                         options['setup'])

-    def install(self):
+    def _install(self):
         options = self.options
         return zc.buildout.easy_install.develop(
             options['setup'], options['_d'], self.build_ext)
...
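The new `_install_setup_eggs` helper above exports the setup eggs to subprocesses by joining the working-set entry paths with `os.pathsep` into a single `PYTHONEXTRAPATH` value. A minimal sketch of that joining step (the entry paths below are made up for illustration):

```python
import os

def make_extra_path(entries):
    # Join distribution locations with the platform path separator
    # (':' on POSIX, ';' on Windows), as _install_setup_eggs does
    # before exporting PYTHONEXTRAPATH.
    return os.pathsep.join(entries)

# Hypothetical working-set entries:
entries = [
    "/sample-buildout/eggs/cython-0.29-py3.8.egg",
    "/sample-buildout/eggs/setuptools-44.1.1-py3.8.egg",
]
extra_path = make_extra_path(entries)
```

Splitting the resulting value on `os.pathsep` recovers the original entry list, which is what a child process consuming `PYTHONEXTRAPATH` would do.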
zc.recipe.egg_/src/zc/recipe/egg/custom.rst
...
@@ -20,6 +20,23 @@ rpath
    A new-line separated list of directories to search for dynamic libraries
    at run time.

+setup-eggs
+   A new-line separated list of eggs that need to be installed
+   beforehand. It is useful to meet the `setup_requires` requirement.
+
+patch-binary
+   The path to the patch executable.
+
+patches
+   A new-line separated list of patches to apply when building.
+
+patch-options
+   Options to give to the patch program when applying patches.
+
+patch-revision
+   An integer to specify the revision (default is the number of
+   patches).
+
 define
    A comma-separated list of names of C preprocessor variables to
    define.
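Taken together, a part combining the patch and setup-eggs options might look like the following. This fragment is illustrative only: the part name, egg, and patch file are invented, not taken from this repository.

```ini
[extdemo]
recipe = zc.recipe.egg:custom
egg = extdemo
patch-binary = patch
patch-options = -p1
patches =
    ${buildout:directory}/fix-build.patch
setup-eggs =
    cython
```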
...
@@ -434,8 +451,8 @@ Create a clean buildout.cfg w/o the checkenv recipe, and delete the recipe:
     ... """ % dict(server=link_server))

     >>> print_(system(buildout), end='') # doctest: +ELLIPSIS
     Develop: '/sample-buildout/recipes'
-    Uninstalling checkenv.
     Uninstalling extdemo.
+    Uninstalling checkenv.
     Installing extdemo...

     >>> rmdir(sample_buildout, 'recipes')
...
@@ -463,6 +480,10 @@ rpath
    A new-line separated list of directories to search for dynamic libraries
    at run time.

+setup-eggs
+   A new-line separated list of eggs that need to be installed
+   beforehand. It is useful to meet the `setup_requires` requirement.
+
 define
    A comma-separated list of names of C preprocessor variables to
    define.
...
@@ -499,6 +520,10 @@ swig-cpp
 swig-opts
    List of SWIG command line options

+environment
+   The name of a section with additional environment variables. The
+   environment variables are set before the egg is built.
+
 To illustrate this, we'll use a directory containing the extdemo
 example from the earlier section:
...
zc.recipe.egg_/src/zc/recipe/egg/egg.py
...
@@ -51,11 +51,44 @@ class Eggs(object):
                 if host.strip() != ''])
         self.allow_hosts = allow_hosts

+        self.buildout_dir = b_options['directory']
         options['eggs-directory'] = b_options['eggs-directory']
         options['_e'] = options['eggs-directory']  # backward compat.
         options['develop-eggs-directory'] = b_options['develop-eggs-directory']
         options['_d'] = options['develop-eggs-directory']  # backward compat.

+    def _get_patch_dict(self, options, egg=None):
+        patch_dict = {}
+        global_patch_binary = options.get('patch-binary', 'patch')
+        if egg:
+            egg = re.sub('[<>=].*', '', egg)
+            egg_list = [egg]
+        else:
+            egg_list = [x[:-8] for x in options.keys()
+                        if x.endswith('-patches')]
+        def get_option(egg, key, default):
+            if len(egg_list) == 1:
+                return options.get('%s-%s' % (egg, key),
+                                   options.get(key, default))
+            else:
+                return options.get('%s-%s' % (egg, key), default)
+        for egg in egg_list:
+            patches = filter(lambda x: x,
+                             map(lambda x: x.strip(),
+                                 get_option(egg, 'patches', '').splitlines()))
+            patches = list(patches)
+            if not patches:
+                continue
+            patch_options = get_option(egg, 'patch-options', '-p0').split()
+            patch_binary = get_option(egg, 'patch-binary', global_patch_binary)
+            patch_revision = int(
+                get_option(egg, 'patch-revision', len(patches)))
+            patch_dict[egg] = {
+                'patches': patches,
+                'patch_options': patch_options,
+                'patch_binary': patch_binary,
+                'patch_revision': patch_revision,
+                }
+        return patch_dict
+
     def working_set(self, extra=()):
         """Separate method to just get the working set
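The option lookup that `_get_patch_dict` builds can be sketched standalone: when a single egg is targeted, the bare key (e.g. `patch-options`) acts as a fallback for the prefixed key (e.g. `demoneeded-patch-options`); with several eggs, only prefixed keys apply. The option values below are illustrative:

```python
def get_patch_option(options, egg_list, egg, key, default):
    # Mirror of the get_option closure in _get_patch_dict (sketch only):
    # the unprefixed key is a fallback only when a single egg is patched,
    # so a generic option cannot leak onto every egg in a multi-egg part.
    if len(egg_list) == 1:
        return options.get('%s-%s' % (egg, key), options.get(key, default))
    return options.get('%s-%s' % (egg, key), default)

opts = {'patch-options': '-p1', 'demo-patch-options': '-p2'}
a = get_patch_option(opts, ['demo'], 'demo', 'patch-options', '-p0')
b = get_patch_option(opts, ['demoneeded'], 'demoneeded', 'patch-options', '-p0')
c = get_patch_option(opts, ['demo', 'other'], 'other', 'patch-options', '-p0')
```

Here `a` resolves from the prefixed key, `b` falls back to the generic key, and `c` gets the default because two eggs are in play.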
...
@@ -77,6 +110,7 @@ class Eggs(object):
             distributions=orig_distributions + list(extra),
             develop_eggs_dir=options['develop-eggs-directory'],
             eggs_dir=options['eggs-directory'],
+            buildout_dir=self.buildout_dir,
             offline=(buildout_section.get('offline') == 'true'),
             newest=(buildout_section.get('newest') == 'true'),
             links=self.links,
...
@@ -98,6 +132,7 @@ class Eggs(object):
             distributions,
             eggs_dir,
             develop_eggs_dir,
+            buildout_dir,
             offline=False,
             newest=True,
             links=(),
...
@@ -131,6 +166,7 @@ class Eggs(object):
                     [develop_eggs_dir, eggs_dir]
                 )
             )
         else:
+            patch_dict = self._get_patch_dict(self.options)
             ws = zc.buildout.easy_install.install(
                 distributions, eggs_dir,
                 links=links,
...
@@ -138,9 +174,10 @@ class Eggs(object):
                 path=[develop_eggs_dir],
                 newest=newest,
                 allow_hosts=allow_hosts,
-                allow_unknown_extras=allow_unknown_extras)
+                allow_unknown_extras=allow_unknown_extras,
+                patch_dict=patch_dict)
             ws = zc.buildout.easy_install.sort_working_set(
-                ws, eggs_dir, develop_eggs_dir
+                ws, buildout_dir, eggs_dir, develop_eggs_dir
             )
             cache_storage[cache_key] = ws
...
zc.recipe.egg_/src/zc/recipe/egg/patches.rst
0 → 100644
Patching eggs before installation
---------------------------------
The SlapOS extension of ``zc.recipe.egg`` supports applying patches to eggs before
installing them. The syntax is to pin a version containing the magic string
``SlapOSPatched`` followed by the number of patches to apply.
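The marker is carried in the PEP 440 local version segment, which is normalized to lowercase on installation (``1.1+SlapOSPatched001`` becomes ``1.1+slapospatched001``). A minimal sketch of extracting the patch count from such a pin (this is an illustration, not the actual implementation):

```python
import re

def patch_count(version):
    # Extract <NNN> from a '+SlapOSPatched<NNN>' local version marker.
    # Match case-insensitively, since PEP 440 lowercases local versions.
    m = re.search(r'\+slapospatched(\d+)', version, re.IGNORECASE)
    return int(m.group(1)) if m else 0
```

For example, ``patch_count('1.1+SlapOSPatched001')`` and ``patch_count('1.1+slapospatched001')`` both yield 1, while a plain ``1.1`` yields 0.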
Let's use a patch for the demoneeded egg:
>>> write(sample_buildout, 'demoneeded.patch',
... r"""diff -ru before/demoneeded-1.1/eggrecipedemoneeded.py after/demoneeded-1.1/eggrecipedemoneeded.py
... --- before/demoneeded-1.1/eggrecipedemoneeded.py 2020-09-08 09:27:36.000000000 +0200
... +++ after/demoneeded-1.1/eggrecipedemoneeded.py 2020-09-08 09:46:16.482243822 +0200
... @@ -1,3 +1,3 @@
... -y=1
... +y="patched demoneeded"
... def f():
... pass
... \ No newline at end of file
... """)
First, we install demoneeded directly:
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... parts = demoneeded
...
... [demoneeded]
... recipe = zc.recipe.egg:eggs
... eggs = demoneeded
... find-links = %(server)s
... index = %(server)s/index
... demoneeded-patches =
... ./demoneeded.patch#4b8ad56711dd0d898a2b7957e9604079
... demoneeded-patch-options = -p2
...
... [versions]
... demoneeded = 1.1+SlapOSPatched001
... """ % dict(server=link_server))
When running buildout, we get a warning that a different version is installed, but that is not fatal.
>>> print_(system(buildout), end='')
Installing demoneeded.
patching file eggrecipedemoneeded.py
Installing demoneeded 1.1
Caused installation of a distribution:
demoneeded 1.1+slapospatched001
with a different version.
The installed egg has the ``slapospatched001`` marker:
>>> ls(sample_buildout, 'eggs')
d demoneeded-1.1+slapospatched001-pyN.N.egg
- setuptools-0.7-py2.3.egg
d zc.buildout-1.0-py2.3.egg
The code of the egg has been patched:
>>> import glob
>>> import os.path
>>> cat(glob.glob(os.path.join(sample_buildout, 'eggs', 'demoneeded-1.1+slapospatched001*', 'eggrecipedemoneeded.py'))[0])
y="patched demoneeded"
def f():
pass
Reset the state and also remove the installed egg:
>>> remove('.installed.cfg')
>>> rmdir(glob.glob(os.path.join(sample_buildout, 'eggs', 'demoneeded-1.1+slapospatched001*'))[0])
In the previous example we applied patches to an egg installed directly, but
the same technique can be used to patch eggs installed as dependencies.
In this example we install demo and apply a patch to demoneeded, a dependency of demo.
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... parts = demo
...
... [demo]
... recipe = zc.recipe.egg
... eggs = demo
... find-links = %(server)s
... index = %(server)s/index
... demoneeded-patches =
... ./demoneeded.patch#4b8ad56711dd0d898a2b7957e9604079
... demoneeded-patch-options = -p2
...
... [versions]
... demoneeded = 1.1+SlapOSPatched001
... """ % dict(server=link_server))
When running buildout, we again get the warning that a different version is installed.
>>> print_(system(buildout), end='')
Installing demo.
Getting distribution for 'demo'.
Got demo 0.3.
patching file eggrecipedemoneeded.py
Installing demoneeded 1.1
Caused installation of a distribution:
demoneeded 1.1+slapospatched001
with a different version.
Generated script '/sample-buildout/bin/demo'.
The installed egg has the ``slapospatched001`` marker:
>>> ls(sample_buildout, 'eggs')
d demo-0.3-pyN.N.egg
d demoneeded-1.1+slapospatched001-pyN.N.egg
- setuptools-0.7-py2.3.egg
d zc.buildout-1.0-py2.3.egg
If we run the demo script we see the patch was applied:
>>> print_(system(join(sample_buildout, 'bin', 'demo')), end='')
3 patched demoneeded
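The patch URLs above carry an MD5 checksum as a URL fragment (``demoneeded.patch#4b8ad56711dd0d898a2b7957e9604079``), so buildout can verify the file before applying it. A sketch of that verification step (illustrative, not the download module's actual code):

```python
import hashlib

def md5_matches(path, expected):
    # Compare the file's MD5 hex digest against the digest given in the
    # '#<md5sum>' fragment of the patch URL.
    with open(path, 'rb') as f:
        return hashlib.md5(f.read()).hexdigest() == expected
```

A mismatch means the patch was corrupted or tampered with, and installation should abort rather than apply it.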
zc.recipe.egg_/src/zc/recipe/egg/tests.py
...
@@ -100,6 +100,26 @@ def test_suite():
                 zc.buildout.testing.not_found,
                 ])
             ),
+        doctest.DocFileSuite(
+            'patches.rst',
+            setUp=setUp, tearDown=zc.buildout.testing.buildoutTearDown,
+            optionflags=doctest.NORMALIZE_WHITESPACE | doctest.ELLIPSIS,
+            checker=renormalizing.RENormalizing([
+                zc.buildout.testing.normalize_path,
+                zc.buildout.testing.normalize_endings,
+                zc.buildout.testing.normalize_script,
+                zc.buildout.testing.normalize_egg_py,
+                zc.buildout.tests.normalize_bang,
+                zc.buildout.tests.normalize_S,
+                zc.buildout.testing.not_found,
+                zc.buildout.testing.easy_install_deprecated,
+                (re.compile(r'[d-] zc.buildout(-\S+)?[.]egg(-link)?'),
+                 'zc.buildout.egg'),
+                (re.compile(r'[d-] setuptools-[^-]+-'), 'setuptools-X-'),
+                (re.compile(r'eggs\\\\demo'), 'eggs/demo'),
+                (re.compile(r'[a-zA-Z]:\\\\foo\\\\bar'), '/foo/bar'),
+                ])
+            ),
         ]

     if not WINDOWS:
         suites.append(
...