metaprogramming and politics

Decentralize. Take the red pill.

PEP438 is live: speed up python package installs now!

with 17 comments

My “speed up pypi installs” PEP438 has been accepted and transition phase 1 is live: as a package maintainer you can speed up the installation of your packages for all your users now, with the click of a button: log in to PyPI, go to the urls page for each of your packages, and specify that all release files are hosted on PyPI. Or add explicit download urls with an MD5 checksum. Tools such as pip or easy_install will thus avoid any slow crawling of third party sites.
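
The “explicit download urls with an MD5” work because installers can verify a `#md5=...` fragment on the link against the downloaded file without consulting any other page. A minimal sketch of that check, assuming nothing about PyPI’s internals (the function name, URL, and file contents below are made up for illustration):

```python
import hashlib
from urllib.parse import urlparse

def verify_md5_fragment(url, file_bytes):
    """Check a download URL's '#md5=...' fragment against the file contents,
    roughly the way installers validate an explicitly listed release file."""
    fragment = urlparse(url).fragment
    if not fragment.startswith("md5="):
        return None  # no checksum present, nothing to verify
    expected = fragment[len("md5="):]
    return hashlib.md5(file_bytes).hexdigest() == expected

data = b"example sdist contents"
url = "https://example.org/mypkg-1.0.tar.gz#md5=" + hashlib.md5(data).hexdigest()
print(verify_md5_fragment(url, data))  # True
```

A tampered or corrupted download would hash to a different digest and the check would return `False`, so the installer can reject it without any extra network round trip.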

Many thanks to Carl Meyer, who helped me write the PEP, to Donald Stufft for implementing most of it, and to Richard Jones, who accepted it today! Thanks also to the distutils-sig discussion participants, in particular Phillip Eby and Marc-Andre Lemburg.
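
The “slow crawling” mentioned above can be sketched as follows: on a package’s index page, links that look like release archives are usable directly, while links marked rel=”homepage” or rel=”download” historically forced installers to fetch and scan those external pages for yet more links. This is a rough illustration only; the markup snippet and the extension list are assumptions, not PyPI’s exact format:

```python
from html.parser import HTMLParser

# Illustrative set of archive suffixes an installer might accept.
ARCHIVE_EXTS = (".tar.gz", ".tar.bz2", ".zip", ".egg", ".whl")

class LinkClassifier(HTMLParser):
    """Split links on a package index page into direct archive links
    and rel="homepage"/"download" links that trigger extra crawling."""
    def __init__(self):
        super().__init__()
        self.archives = []
        self.crawl_targets = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        if href.split("#")[0].endswith(ARCHIVE_EXTS):
            self.archives.append(href)
        elif attrs.get("rel") in ("homepage", "download"):
            self.crawl_targets.append(href)

page = """
<a href="mypkg-1.0.tar.gz#md5=abc">mypkg-1.0.tar.gz</a>
<a rel="homepage" href="http://example.org/mypkg">home page</a>
<a href="http://example.org/docs">docs</a>
"""
c = LinkClassifier()
c.feed(page)
print(c.archives)       # ['mypkg-1.0.tar.gz#md5=abc']
print(c.crawl_targets)  # ['http://example.org/mypkg']
```

Note that the plain “docs” link lands in neither bucket: ordinary links are ignored by installers, which is why only the rel-tagged links (or their removal) affect install speed.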


Written by holger krekel

May 19, 2013 at 7:49 am

Posted in metaprogramming


17 Responses


  1. Good to know!

    Can you clarify a bit what option to use in PyPI? I looked through my packages, but couldn’t find which option. (I typically release packages using `./ sdist register upload`)

    Diederik van der Boor

    May 19, 2013 at 9:08 am

  2. You can just go to the “urls” tab and change to “do not extract urls”. That’s all; there is no need to deal with the “file urls” at all to achieve the speed up for installers. See for an example of what is needed.

    holger krekel

    May 19, 2013 at 9:14 am

  3. Here’s a guide on how to update your packages.

  4. Just heard about this, that’s a really good initiative!
    Up till now, providing a comprehensive doc with links in the long description used to punish our users…
    I did not find a “remove all” button, though, for the bunch of irrelevant links I have on my packages.

    Georges Racinet

    May 19, 2013 at 4:23 pm

    • Usually those URLs do no harm, though. pip/easy_install don’t consider them unless they look like a package archive. The links that make installs slow are/were those with “rel=’homepage'” or “rel=’download'”, which cause pip/easy_install to crawl the linked page and look for more links there.

      holger krekel

      May 19, 2013 at 4:38 pm

      • Thanks for that clarification


        May 20, 2013 at 2:31 pm

  5. IMHO, crawling web pages or HTML index pages to find download links is a bad design. I can’t understand why all this can’t rely on pure REST or XMLRPC based protocols.

    • That and the freaking case independency / mispelle correction relying on pip reloading the whole ‘simple/’ page (actually the main pypi has a 301 redirect, but mirrors may not).

      My guess is that it was a necessity for wide adoption of the PyPI a while ago. Not really sustainable in the long run, though.


      May 20, 2013 at 2:31 pm

      • Misspelled ‘misspell’ !


        May 20, 2013 at 2:33 pm

      • Sure, if you read PEP438 you’ll find the historic reasons behind this. The PEP tries to address them and move away from them in a backward-compatible way.

        holger krekel

        May 20, 2013 at 7:47 pm

    • Er, HTML + HTTP is a REST API.

      • HTML has poor support for custom data semantics, and mixing HTML with business-oriented microformats bloats the payload and makes it difficult and slow to build and parse.
        I was talking about *real* REST leveraging the GET/POST/PUT/DELETE HTTP verbs and dedicated JSON or XML payloads.

        Gilles Lenfant

        September 21, 2013 at 8:15 am

  6. Holger, thank you for writing PEP438! You just pushed one of my main objections against Python package installations (third party distribution sites) towards deprecation 🙂


    May 20, 2013 at 6:44 pm

  7. Great, updated the packages I have access to right away!

    Thanks Holger!


