Merge pull request #1519 from dstufft/remove-dependency-links
Remove Dependency Links - Needs Discussion
dstufft committed Feb 26, 2014
2 parents 2ad8888 + 95ac4c1 commit da02f07
Showing 18 changed files with 76 additions and 196 deletions.
7 changes: 7 additions & 0 deletions CHANGES.txt
@@ -2,6 +2,10 @@

* **BACKWARD INCOMPATIBLE** Dropped support for Python 3.1.

* Removed the deprecated support for dependency links and the
  ``--process-dependency-links`` flag that turned them on. For alternatives to
  dependency links, please see http://www.pip-installer.org/en/latest/installing.html


**1.5.4 (2014-02-21)**

@@ -80,6 +84,9 @@
* **BACKWARD INCOMPATIBLE** pip no longer respects dependency links by default.
Users may opt into respecting them again using ``--process-dependency-links``.

* **DEPRECATION** ``pip install --process-dependency-links`` and the ability to
  use dependency links at all have been deprecated and will be removed in 1.6.

* **DEPRECATION** ``pip install --no-install`` and ``pip install
--no-download`` are now formally deprecated. See Issue #906 for discussion on
possible alternatives, or lack thereof, in future releases.
64 changes: 64 additions & 0 deletions docs/dependency_links.rst
@@ -0,0 +1,64 @@
:orphan:

Dependency Links
================

In pip 1.5, processing of dependency links was deprecated, and it was removed
completely in pip 1.6. Dependency links supported a few different scenarios,
each of which has an alternative described below.
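
For context, a project opted into dependency links through the
``dependency_links`` keyword of its ``setup.py``. A minimal sketch of the
removed feature (the project name and URL here are hypothetical):

.. code:: python

    # setup.py -- illustrative only; this is the feature that was removed
    from setuptools import setup

    setup(
        name="myproject",
        version="1.0",
        install_requires=["foobar"],
        # pip 1.5 only consulted these URLs when run with
        # --process-dependency-links; pip 1.6 ignores them entirely
        dependency_links=[
            "https://github.com/example/foobar/tarball/master#egg=foobar-1.0",
        ],
    )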


Depending on a Fork of a Project
--------------------------------

If you need to depend on a forked version of a project and it is for your own
personal use, then you can simply use a requirements.txt file that points to
the fork.

.. code::

    # We need this fork instead of the foobar that exists on PyPI
    git+https://github.com/example/foobar.git#egg=foobar
    myproject==1.0  # myproject has a setup.py dependency on foobar

If you need to depend on a forked version of a project for something you want
to distribute to other people, then you should rename the fork and upload it
to PyPI under the new name. This way people can depend on it and install it
normally.
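
If you do rename, the fork's ``setup.py`` only needs a new ``name``, and your
project then depends on that name as usual. A sketch, assuming the
hypothetical rename of ``foobar`` to ``foobar-example``:

.. code:: python

    # setup.py of the renamed fork (names here are hypothetical)
    from setuptools import setup, find_packages

    setup(
        name="foobar-example",  # new name, so it cannot collide with foobar on PyPI
        version="1.0",
        packages=find_packages(),
    )

Your own project then lists ``foobar-example`` in its ``install_requires``,
and everything installs from PyPI normally.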

Deploying Directly from VCS
---------------------------

If you're using dependency_links to essentially deploy a tree of dependencies
directly from VCS, then you have two primary options. You can either set up
a requirements.txt file that lists all of the repositories, such as:

.. code::

    # These are the locations of the git repos
    git+https://github.com/example/foobar.git#egg=foobar
    git+https://github.com/example/super.git#egg=super
    git+https://github.com/example/duper.git#egg=duper

    # This is my main package
    myproject==1.0  # This depends on foobar, super, and duper from git repos

Or you can set up a private package index and point pip at it instead. This
can be as simple as a directory full of packages exposed using Apache2 or Nginx
with an auto index, or as complex as a full-blown index built with software
such as `devpi <http://devpi.net/>`_.
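
As a minimal sketch of the "directory full of packages" approach, suitable for
testing only (the directory and port are hypothetical; a production setup would
use Apache2, Nginx, or devpi as described above):

.. code:: console

    # Collect your sdists/wheels into one directory
    $ cp dist/* /srv/deploy/

    # Serve it with an auto index; Python's built-in server is enough for a test
    $ cd /srv/deploy
    $ python -m SimpleHTTPServer 8080    # ``python -m http.server 8080`` on Python 3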

If you're using a simple autoindex, then you can add it to pip using:

.. code:: console

    $ pip install --find-links https://example.com/deploy/ myproject

Or if you're using a full-blown index, it could be:

.. code:: console

    # Replace PyPI with the custom index
    $ pip install --index-url https://example.com/simple/ myproject

    # Add a custom index in addition to PyPI
    $ pip install --extra-index-url https://example.com/simple/ myproject
35 changes: 2 additions & 33 deletions pip/__init__.py
@@ -213,7 +213,7 @@ def __init__(self, name, req, editable, comments=()):
_date_re = re.compile(r'-(20\d\d\d\d\d\d)$')

@classmethod
def from_dist(cls, dist, dependency_links, find_tags=False):
def from_dist(cls, dist, find_tags=False):
location = os.path.normcase(os.path.abspath(dist.location))
comments = []
from pip.vcs import vcs, get_src_requirement
@@ -241,38 +241,7 @@ def from_dist(cls, dist, dependency_links, find_tags=False):
req = dist.as_requirement()
specs = req.specs
assert len(specs) == 1 and specs[0][0] == '=='
version = specs[0][1]
ver_match = cls._rev_re.search(version)
date_match = cls._date_re.search(version)
if ver_match or date_match:
svn_backend = vcs.get_backend('svn')
if svn_backend:
svn_location = svn_backend().get_location(
dist,
dependency_links,
)
if not svn_location:
logger.warn(
'Warning: cannot find svn location for %s' % req)
comments.append(
'## FIXME: could not find svn URL in dependency_links '
'for this package:'
)
else:
comments.append(
'# Installing as editable to satisfy requirement %s:' %
req
)
if ver_match:
rev = ver_match.group(1)
else:
rev = '{%s}' % date_match.group(1)
editable = True
req = '%s@%s#egg=%s' % (
svn_location,
rev,
cls.egg_name(dist)
)

return cls(dist.project_name, req, editable, comments)

@staticmethod
10 changes: 0 additions & 10 deletions pip/cmdoptions.py
@@ -252,15 +252,6 @@ def make(self):
help=SUPPRESS_HELP
)

# Remove after 1.5
process_dependency_links = OptionMaker(
"--process-dependency-links",
dest="process_dependency_links",
action="store_true",
default=False,
help="Enable the processing of dependency links.",
)

requirements = OptionMaker(
'-r', '--requirement',
dest='requirements',
@@ -374,6 +365,5 @@ def make(self):
no_allow_external,
allow_unsafe,
no_allow_unsafe,
process_dependency_links,
]
}
17 changes: 1 addition & 16 deletions pip/commands/freeze.py
@@ -6,7 +6,6 @@
from pip.log import logger
from pip.basecommand import Command
from pip.util import get_installed_distributions
from pip._vendor import pkg_resources


class FreezeCommand(Command):
@@ -60,27 +59,13 @@ def run(self, options, args):
if skip_regex:
skip_match = re.compile(skip_regex)

dependency_links = []

f = sys.stdout

for dist in pkg_resources.working_set:
if dist.has_metadata('dependency_links.txt'):
dependency_links.extend(
dist.get_metadata_lines('dependency_links.txt')
)
for link in find_links:
if '#egg=' in link:
dependency_links.append(link)
for link in find_links:
f.write('-f %s\n' % link)
installations = {}
for dist in get_installed_distributions(local_only=local_only):
req = pip.FrozenRequirement.from_dist(
dist,
dependency_links,
find_tags=find_tags,
)
req = pip.FrozenRequirement.from_dist(dist, find_tags=find_tags)
installations[req.name] = req
if requirement:
req_f = open(requirement)
1 change: 0 additions & 1 deletion pip/commands/install.py
@@ -198,7 +198,6 @@ def _build_package_finder(self, options, index_urls, session):
allow_unverified=options.allow_unverified,
allow_all_external=options.allow_all_external,
allow_all_prereleases=options.pre,
process_dependency_links=options.process_dependency_links,
session=session,
)

10 changes: 0 additions & 10 deletions pip/commands/list.py
@@ -70,7 +70,6 @@ def _build_package_finder(self, options, index_urls, session):
allow_unverified=options.allow_unverified,
allow_all_external=options.allow_all_external,
allow_all_prereleases=options.pre,
process_dependency_links=options.process_dependency_links,
session=session,
)

@@ -116,18 +115,9 @@ def find_packages_latests_versions(self, options):
)
index_urls += options.mirrors

dependency_links = []
for dist in get_installed_distributions(
local_only=options.local, skip=self.skip):
if dist.has_metadata('dependency_links.txt'):
dependency_links.extend(
dist.get_metadata_lines('dependency_links.txt'),
)

session = self._build_session(options)

finder = self._build_package_finder(options, index_urls, session)
finder.add_dependency_links(dependency_links)

installed_packages = get_installed_distributions(
local_only=options.local,
1 change: 0 additions & 1 deletion pip/commands/wheel.py
@@ -152,7 +152,6 @@ def run(self, options, args):
allow_unverified=options.allow_unverified,
allow_all_external=options.allow_all_external,
allow_all_prereleases=options.pre,
process_dependency_links=options.process_dependency_links,
session=session,
)

39 changes: 2 additions & 37 deletions pip/index.py
@@ -38,10 +38,9 @@ class PackageFinder(object):
def __init__(self, find_links, index_urls,
use_wheel=True, allow_external=[], allow_unverified=[],
allow_all_external=False, allow_all_prereleases=False,
process_dependency_links=False, session=None):
session=None):
self.find_links = find_links
self.index_urls = index_urls
self.dependency_links = []
self.cache = PageCache()
# These are boring links that have already been logged somehow:
self.logged_links = set()
@@ -73,28 +72,9 @@ def __init__(self, find_links, index_urls,
# Do we want to allow _all_ pre-releases?
self.allow_all_prereleases = allow_all_prereleases

# Do we process dependency links?
self.process_dependency_links = process_dependency_links
self._have_warned_dependency_links = False

# The Session we'll use to make requests
self.session = session or PipSession()

def add_dependency_links(self, links):
## FIXME: this shouldn't be global list this, it should only
## apply to requirements of the package that specifies the
## dependency_links value
## FIXME: also, we should track comes_from (i.e., use Link)
if self.process_dependency_links:
if not self._have_warned_dependency_links:
logger.deprecated(
"1.6",
"Dependency Links processing has been deprecated with an "
"accelerated time schedule and will be removed in pip 1.6",
)
self._have_warned_dependency_links = True
self.dependency_links.extend(links)

def _sort_locations(self, locations):
"""
Sort locations into "files" (archives) and "urls", and return
@@ -222,16 +202,11 @@ def mkurl_pypi_url(url):
posixpath.join(main_index_url.url, version)] + locations

file_locations, url_locations = self._sort_locations(locations)
_flocations, _ulocations = self._sort_locations(self.dependency_links)
file_locations.extend(_flocations)

# We trust every url that the user has given us whether it was given
# via --index-url or --find-links
locations = [Link(url, trusted=True) for url in url_locations]

# We explicitly do not trust links that came from dependency_links
locations.extend([Link(url) for url in _ulocations])

logger.debug('URLs to search for versions for %s:' % req)
for location in locations:
logger.debug('* %s' % location)
@@ -280,15 +255,6 @@ def mkurl_pypi_url(url):
)
finally:
logger.indent -= 2
dependency_versions = list(self._package_versions(
[Link(url) for url in self.dependency_links], req.name.lower()))
if dependency_versions:
logger.info(
'dependency_links found: %s' %
', '.join([
link.url for parsed, link, version in dependency_versions
])
)
file_versions = list(
self._package_versions(
[Link(url) for url in file_locations],
@@ -297,7 +263,6 @@ def mkurl_pypi_url(url):
)
if (not found_versions
and not page_versions
and not dependency_versions
and not file_versions):
logger.fatal(
'Could not find any downloads that satisfy the requirement'
@@ -334,7 +299,7 @@ def mkurl_pypi_url(url):
)
#this is an intentional priority ordering
all_versions = installed_version + file_versions + found_versions \
+ page_versions + dependency_versions
+ page_versions
applicable_versions = []
for (parsed_version, link, version) in all_versions:
if version not in req.req:
4 changes: 0 additions & 4 deletions pip/req/req_install.py
@@ -446,10 +446,6 @@ def pkg_info(self):
p.feed(data or '')
return p.close()

@property
def dependency_links(self):
return self.egg_info_lines('dependency_links.txt')

_requirements_section_re = re.compile(r'\[(.*?)\]')

def requirements(self, extras=()):
4 changes: 0 additions & 4 deletions pip/req/req_set.py
@@ -447,10 +447,6 @@ def prepare_files(self, finder, force_root_egg_info=False, bundle=False):

# sdists
elif not is_bundle:
## FIXME: shouldn't be globally added:
finder.add_dependency_links(
req_to_install.dependency_links
)
if (req_to_install.extras):
logger.notify(
"Installing extra requirements: %r" %
16 changes: 0 additions & 16 deletions pip/vcs/subversion.py
@@ -1,7 +1,6 @@
import os
import re
from pip.backwardcompat import urlparse
from pip.index import Link
from pip.util import rmtree, display_path, call_subprocess
from pip.log import logger
from pip.vcs import vcs, VersionControl
@@ -38,7 +37,6 @@ def get_info(self, location):
'Cannot determine URL of svn checkout %s' %
display_path(location)
)
logger.info('Output that cannot be parsed: \n%s' % output)
return None, None
url = match.group(1).strip()
match = _svn_revision_re.search(output)
@@ -101,20 +99,6 @@ def obtain(self, dest):
call_subprocess(
[self.cmd, 'checkout', '-q'] + rev_options + [url, dest])

def get_location(self, dist, dependency_links):
for url in dependency_links:
egg_fragment = Link(url).egg_fragment
if not egg_fragment:
continue
if '-' in egg_fragment:
## FIXME: will this work when a package has - in the name?
key = '-'.join(egg_fragment.split('-')[:-1]).lower()
else:
key = egg_fragment
if key == dist.key:
return url.split('#', 1)[0]
return None

def get_revision(self, location):
"""
Return the maximum revision for all files under a given location
1 change: 0 additions & 1 deletion tests/data/packages/LocalExtras/.gitignore

This file was deleted.

