File osc-plugin-collab-0.104+30.obscpio of Package osc-plugin-collab

07070100000000000081A40000000000000000000000016548EB8C00000010000000000000000000000000000000000000002600000000osc-plugin-collab-0.104+30/.gitignore*.pyc
*.sw[nop]
07070100000001000081A40000000000000000000000016548EB8C0000058B000000000000000000000000000000000000002000000000osc-plugin-collab-0.104+30/NEWSVersion 0.90
============

This is the first version of the osc collab plugin. It has been renamed from
osc gnome. Here is a list of changes since the last release of osc gnome.

+ Features:
  - Rename to osc collab and do not refer to anything GNOME specific anywhere
  - Support delta in non-link packages
  - Remove potential trailing slash from packages passed as args for
    convenience when used with autocompletion
  - Make the config options work per apiurl
  - Make it possible to use more than one repo at the same time
  - Display against which repo the build is done
  - Make setup/update branch from the devel project
  - Take into account the version in devel project for todo/update
  - Use openSUSE:Factory by default instead of GNOME:Factory
  - Autodetect default repository for builds
  - Add --nodevelproject option for the relevant commands
  - Add --version command

+ Fixes:
  - Improve upstream tarball basename detection when the basename of the
    upstream tarball is not in the URL but in the query fields
  - Fix warning about tag in Source always appearing
  - Do not crash with osc from trunk
  - Better handling of update when package is already updated
  - Fix listreserved to not list reservations from all projects
  - Substitute macros in %define lines too
  - Remove old cache files
  - Fix parsing of empty list options in ~/.oscrc
  - Improve help message
  - Code cleanups
07070100000002000081A40000000000000000000000016548EB8C0000009D000000000000000000000000000000000000002200000000osc-plugin-collab-0.104+30/READMEThe osc collab plugin aims to make it easier to collaborate within the
Build Service.

See https://en.opensuse.org/openSUSE:Osc_Collab for more information.
07070100000003000081A40000000000000000000000016548EB8C000004BE000000000000000000000000000000000000002000000000osc-plugin-collab-0.104+30/TODO+ Make --xs the default for 'osc gnome t'
  For t: maybe check the version in the home project?
+ add 'osc gnome validate/check': make sure that committing is fine
+ osc collab todo:
  - list build failures (with the status api)
+ 'osc gnome update':
  - print information about rpmlint of this package
  - prefill the .changes with "Updated translations"?
  - check integrity with md5sum/sha1sum
  - integrate with spec-cleaner
+ 'osc gnome todoadmin':
  - use get_obs_build_results
  - handle the case where there's already a submission to oS:F but the
    submission is not from the latest revision of the package in G:F
+ 'osc gnome forward':
  - cancel old requests (automatically? ask?)
  - use 'osc meta prj GNOME:Factory' to see how to check for permissions.
+ 'osc gnome build/buildsubmit':
  - if the package has no explicit enable for an architecture, we enable the
    build in the package without looking at the project metadata. We should
    look at the project metadata instead.
+ 'osc gnome buildsubmit':
  - do nothing with --forward option when the user has not enough privilege to
    forward
+ Kill usage of http_GET, http_PUT (?)
+ Output an error if the project passed with --project does not exist (?)
07070100000004000081ED0000000000000000000000016548EB8C000011BA000000000000000000000000000000000000003100000000osc-plugin-collab-0.104+30/build-compare-analyze#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import optparse
import re

def compare_build_output(filepath):
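    """Scan a build log for the build-compare report.

    Return a (version_different, different, output) tuple: whether the old and
    new packages differ in version or release, whether build-compare reported
    any other difference, and the raw report text.
    """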
    compare_re = re.compile(r'^compare /\.build\.oldpackages/\S+-([^-]+)-(\d+)\.\d+\.\S+\.rpm /usr/src/packages/S?RPMS/\S+-([^-]+)-(\d+)\.\d+\.\S+\.rpm$')

    file = open(filepath)

    # read everything until we see the build-compare report header
    while True:
        line = file.readline()
        if line == '':
            break

        # this is not the beginning of the header
        if line[:-1] != '... comparing built packages with the former built':
            continue

        # we've found the beginning of the header, so let's read the whole
        # header
        pos = file.tell()
        line = file.readline()
        if line[:-1] != '/usr/lib/build/rpm-check.sh':
            # oops, this is not what we expected, so go back
            # (relative seeks are not supported on text files in Python 3)
            file.seek(pos)

        break

    different = False
    version_different = False
    output = ''

    # now let's analyze the real important lines
    while True:
        line = file.readline()
        if line == '':
            break

        # this is the end of build-compare
        if line[:-1] in ['... build is finished', 'compare validated built as indentical !']:
            break

        output = output + line

        match = compare_re.match(line[:-1])
        if match:
            oldver = match.group(1)
            oldrel = match.group(2)
            newver = match.group(3)
            newrel = match.group(4)
            if (oldver != newver) or (oldrel != newrel):
                version_different = True
        else:
            # this means we have output showing the difference
            different = True

    file.close()

    return (version_different, different, output)

def main(args):
    parser = optparse.OptionParser()

    parser.add_option("-f", "--file", dest="file",
                      help="build log file to read")
    parser.add_option("-o", "--output", dest="output",
                      default=False, help="output file to create if build-compare detected a non-version difference")
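    # Typical invocation (paths are hypothetical):
    #   build-compare-analyze -f /path/to/build.log -o /tmp/build-compare.report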

    (options, args) = parser.parse_args()

    if not options.file:
        print('No build log file.', file=sys.stderr)
        sys.exit(1)

    if not os.path.exists(options.file):
        print('Build log file "%s" does not exist.' % options.file, file=sys.stderr)
        sys.exit(1)

    if options.output and os.path.exists(options.output):
        os.unlink(options.output)

    (version_different, different, output) = compare_build_output(options.file)

    if not version_different and different:
        if options.output:
            # use a context manager so the file is flushed and closed properly
            with open(options.output, 'w') as out:
                out.write(output[:-1])
        else:
            print(output[:-1])

if __name__ == '__main__':
    try:
        main(sys.argv)
    except KeyboardInterrupt:
        pass
07070100000005000081A40000000000000000000000016548EB8C000244F4000000000000000000000000000000000000002900000000osc-plugin-collab-0.104+30/osc-collab.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

from __future__ import print_function

import difflib
import locale
import re
import select
import shutil
import subprocess
import tarfile
import tempfile
import time
import urllib
from osc import core
from osc import conf
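
# Note: names such as os, sys, ET, makeurl, http_GET, http_POST, Package and
# change_request_state are not imported here: osc loads plugins by executing
# them inside its own commandline namespace, which is expected to provide
# these names.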

try:
    import configparser
    from http.client import BadStatusLine
    from urllib.error import HTTPError
    from urllib.parse import quote, urlencode, urlparse
    from urllib.request import urlopen
except ImportError:
    # python 2.x
    import ConfigParser
    from httplib import BadStatusLine
    from urllib import quote, urlencode
    from urllib2 import HTTPError, urlopen
    from urlparse import urlparse

try:
    import rpm
    have_rpm = True
except ImportError:
    have_rpm = False

from osc import cmdln


OSC_COLLAB_VERSION = '0.104'

# This is a hack to have osc ignore the files we create in a package directory.
_osc_collab_helper_prefixes = [ 'osc-collab.', 'osc-gnome.' ]
_osc_collab_helpers = []
for suffix in [ 'NEWS', 'ChangeLog', 'configure', 'meson', 'meson_options' ]:
    for prefix in _osc_collab_helper_prefixes:
        _osc_collab_helpers.append(prefix + suffix)
for helper in _osc_collab_helpers:
    conf.DEFAULTS['exclude_glob'] += ' %s' % helper
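# Extending exclude_glob makes osc's working-copy commands (e.g. status, add,
# addremove) skip these helper files instead of reporting them as new.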

_osc_collab_alias = 'collab'
_osc_collab_config_parser = None
_osc_collab_osc_conffile = None

def filedir_to_pac(f, progress_obj=None):
    """Takes a working copy path, or a path to a file inside a working copy,
    and returns a Package object instance
    If the argument was a filename, add it onto the "todo" list of the Package """
    if os.path.isdir(f):
        wd = f
        p = Package(wd, progress_obj=progress_obj)
    else:
        wd = os.path.dirname(f) or os.curdir
        p = Package(wd, progress_obj=progress_obj)
        p.todo = [ os.path.basename(f) ]
    return p

class OscCollabError(Exception):
    def __init__(self, value):
        self.msg = value

    def __str__(self):
        return repr(self.msg)


class OscCollabWebError(OscCollabError):
    pass

class OscCollabDownloadError(OscCollabError):
    pass

class OscCollabDiffError(OscCollabError):
    pass

class OscCollabCompressError(OscCollabError):
    pass


def _collab_exception_print(e, message = ''):
    if message == None:
        message = ''

    if hasattr(e, 'msg'):
        print(message + e.msg, file=sys.stderr)
    elif str(e) != '':
        print(message + str(e), file=sys.stderr)
    else:
        print(message + e.__class__.__name__, file=sys.stderr)


#######################################################################


class OscCollabReservation:

    project = None
    package = None
    user = None

    def __init__(self, project = None, package = None, user = None, node = None):
        if node is None:
            self.project = project
            self.package = package
            self.user = user
        else:
            self.project = node.get('project')
            self.package = node.get('package')
            self.user = node.get('user')


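    # Emulate a 3-item sequence (project, package, user) so a reservation can
    # be indexed and measured by the generic table helpers further below.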
    def __len__(self):
        return 3


    def __getitem__(self, key):
        if not type(key) == int:
            raise TypeError

        if key == 0:
            return self.project
        elif key == 1:
            return self.package
        elif key == 2:
            return self.user
        else:
            raise IndexError


    def is_relevant(self, projects, package):
        if self.project not in projects:
            return False
        if self.package != package:
            return False
        return True


#######################################################################


class OscCollabComment:

    project = None
    package = None
    date = None
    user = None
    comment = None
    firstline = None

    def __init__(self, project = None, package = None, date = None, user = None, comment = None, node = None):
        if node is None:
            self.project = project
            self.package = package
            self.date = date
            self.user = user
            self.comment = comment
        else:
            self.project = node.get('project')
            self.package = node.get('package')
            self.date = node.get('date')
            self.user = node.get('user')
            self.comment = node.text
        if self.comment is None:
            self.firstline = None
        else:
            lines = self.comment.split('\n')
            self.firstline = lines[0]
            if len(lines) > 1:
                self.firstline += ' [...]'


    def __len__(self):
        return 4


    def __getitem__(self, key):
        if not type(key) == int:
            raise TypeError

        if key == 0:
            return self.project
        elif key == 1:
            return self.package
        elif key == 2:
            return self.user
        elif key == 3:
            return self.firstline
        else:
            raise IndexError


    def is_relevant(self, projects, package):
        if self.project not in projects:
            return False
        if self.package != package:
            return False
        return True


    def indent(self, spaces = '  '):
        lines = self.comment.split('\n')
        lines = [ spaces + line for line in lines ]
        return '\n'.join(lines)


#######################################################################


class OscCollabRequest():

    req_id = -1
    type = None
    source_project = None
    source_package = None
    source_rev = None
    target_project = None
    target_package = None
    state = None
    by = None
    at = None
    description = None

    def __init__(self, node):
        self.req_id = int(node.get('id'))

        # we only care about the first action here
        action = node.find('action')
        if action is None:
            action = node.find('submit') # for old style requests

        self.type = action.get('type', 'submit')

        subnode = action.find('source')
        if subnode is not None:
            self.source_project = subnode.get('project')
            self.source_package = subnode.get('package')
            self.source_rev = subnode.get('rev')

        subnode = action.find('target')
        if subnode is not None:
            self.target_project = subnode.get('project')
            self.target_package = subnode.get('package')

        subnode = node.find('state')
        if subnode is not None:
            self.state = subnode.get('name')
            self.by = subnode.get('who')
            self.at = subnode.get('when')

        subnode = node.find('description')
        if subnode is not None:
            self.description = subnode.text


#######################################################################


class OscCollabProject(dict):

    def __init__(self, node):
        self.name = node.get('name')
        self.parent = node.get('parent')
        self.ignore_upstream = node.get('ignore_upstream') == 'true'
        self.missing_packages = []


    def strip_internal_links(self):
        to_rm = []
        for package in self.values():
            if package.parent_project == self.name:
                to_rm.append(package.name)
        for name in to_rm:
            del self[name]


    def is_toplevel(self):
        return self.parent in [ None, '' ]


    def __eq__(self, other):
        return self.name == other.name


    def __ne__(self, other):
        return not self.__eq__(other)


    def __lt__(self, other):
        return self.name < other.name


    def __le__(self, other):
        return self.__eq__(other) or self.__lt__(other)


    def __gt__(self, other):
        return other.__lt__(self)


    def __ge__(self, other):
        return other.__eq__(self) or other.__lt__(self)


#######################################################################


class OscCollabPackage:

    def __init__(self, node, project):
        self.name = None
        self.version = None
        self.parent_project = None
        self.parent_package = None
        self.parent_version = None
        self.devel_project = None
        self.devel_package = None
        self.devel_version = None
        self.upstream_version = None
        self.upstream_url = None
        self.is_link = False
        self.has_delta = False
        self.error = None
        self.error_details = None

        self.project = project

        if node is not None:
            self.name = node.get('name')

            parent = node.find('parent')
            if parent is not None:
                self.parent_project = parent.get('project')
                self.parent_package = parent.get('package')

            devel = node.find('devel')
            if devel is not None:
                self.devel_project = devel.get('project')
                self.devel_package = devel.get('package')

            version = node.find('version')
            if version is not None:
                self.version = version.get('current')
                if not project or not project.ignore_upstream:
                    self.upstream_version = version.get('upstream')
                self.parent_version = version.get('parent')
                self.devel_version = version.get('devel')

            if not project or not project.ignore_upstream:
                upstream = node.find('upstream')
                if upstream is not None:
                    url = upstream.find('url')
                    if url is not None:
                        self.upstream_url = url.text

            link = node.find('link')
            if link is not None:
                self.is_link = True
                if link.get('delta') == 'true':
                    self.has_delta = True

            delta = node.find('delta')
            if delta is not None:
                self.has_delta = True

            error = node.find('error')
            if error is not None:
                self.error = error.get('type')
                self.error_details = error.text

        # Reconstruct some data that we can deduce from the XML
        if project is not None and self.is_link and not self.parent_project:
            self.parent_project = project.parent
        if self.parent_project and not self.parent_package:
            self.parent_package = self.name
        if self.devel_project and not self.devel_package:
            self.devel_package = self.name


    def _compare_versions_a_gt_b(self, a, b):
        if have_rpm:
            # We're not really interested in the epoch or release parts of the
            # complete version because they're not relevant when comparing to
            # upstream version
            return rpm.labelCompare((None, a, '1'), (None, b, '1')) > 0

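        # Fallback without the rpm module: compare dot-separated version
        # components pairwise, numerically when possible (so '2.30.1' sorts
        # above '2.9.0' as well as above '2.30.0').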
        split_a = a.split('.')
        split_b = b.split('.')

        # the two versions don't have the same format; we don't know how to
        # handle this
        if len(split_a) != len(split_b):
            return a > b

        for i in range(len(split_a)):
            try:
                int_a = int(split_a[i])
                int_b = int(split_b[i])
                if int_a > int_b:
                    return True
                if int_b > int_a:
                    return False
            except ValueError:
                if split_a[i] > split_b[i]:
                    return True
                if split_b[i] > split_a[i]:
                    return False

        return False


    def parent_more_recent(self):
        if not self.parent_version:
            return False
        return self._compare_versions_a_gt_b(self.parent_version, self.version)


    def needs_update(self):
        # empty upstream version, or upstream version meaning openSUSE is
        # upstream
        if self.upstream_version in [ None, '', '--' ]:
            return False

        if self.parent_version in [ None, '', '--' ]:
            return True

        if self.version in [ None, '', '--' ]:
            return True

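        # An update is needed only if upstream is strictly newer than both the
        # parent version and the current version.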
        return self._compare_versions_a_gt_b(self.upstream_version, self.parent_version) and self._compare_versions_a_gt_b(self.upstream_version, self.version)


    def devel_needs_update(self):
        # if there's no devel project, behave as if the package needed an update
        if not self.devel_project:
            return True

        # empty upstream version, or upstream version meaning openSUSE is
        # upstream
        if self.upstream_version in [ None, '', '--' ]:
            return False

        return self._compare_versions_a_gt_b(self.upstream_version, self.devel_version)


    def is_broken_link(self):
        return self.error in [ 'not-in-parent', 'need-merge-with-parent' ]


    def __eq__(self, other):
        return self.name == other.name and self.project and other.project and self.project.name == other.project.name


    def __ne__(self, other):
        return not self.__eq__(other)


    def __lt__(self, other):
        if not self.project or not self.project.name:
            if other.project and other.project.name:
                return True
            else:
                return self.name < other.name

        if self.project.name == other.project.name:
            return self.name < other.name

        return self.project.name < other.project.name


    def __le__(self, other):
        return self.__eq__(other) or self.__lt__(other)


    def __gt__(self, other):
        return other.__lt__(self)


    def __ge__(self, other):
        return other.__eq__(self) or other.__lt__(self)


#######################################################################


class OscCollabObs:

    apiurl = None


    @classmethod
    def init(cls, apiurl):
        cls.apiurl = apiurl


    @classmethod
    def get_meta(cls, project):
        what = 'metadata of packages in %s' % project

        # download the data (cache for 2 days)
        url = makeurl(cls.apiurl, ['search', 'package'], ['match=%s' % quote('@project=\'%s\'' % project)])
        filename = '%s-meta.obs' % project
        max_age_seconds = 3600 * 24 * 2

        return OscCollabCache.get_from_obs(url, filename, max_age_seconds, what)


    @classmethod
    def get_build_results(cls, project):
        what = 'build results of packages in %s' % project

        # download the data (cache for 2 hours)
        url = makeurl(cls.apiurl, ['build', project, '_result'])
        filename = '%s-build-results.obs' % project
        max_age_seconds = 3600 * 2

        return OscCollabCache.get_from_obs(url, filename, max_age_seconds, what)


    @classmethod
    def _get_request_list_url(cls, project, package, type, what):
        match = '(state/@name=\'new\'%20or%20state/@name=\'review\')'
        match += '%20and%20'
        match += 'action/%s/@project=\'%s\'' % (type, quote(project))
        if package:
            match += '%20and%20'
            match += 'action/%s/@package=\'%s\'' % (type, quote(package))

        return makeurl(cls.apiurl, ['search', 'request'], ['match=%s' % match])


    @classmethod
    def _parse_request_list_internal(cls, f, what):
        requests = []

        try:
            collection = ET.parse(f).getroot()
        except SyntaxError as e:
            print('Cannot parse %s: %s' % (what, e.msg), file=sys.stderr)
            return requests

        for node in collection.findall('request'):
            requests.append(OscCollabRequest(node))

        return requests


    @classmethod
    def _get_request_list_no_cache(cls, project, package, type, what):
        url = cls._get_request_list_url(project, package, type, what)

        try:
            fin = http_GET(url)
        except HTTPError as e:
            print('Cannot get %s: %s' % (what, e.msg), file=sys.stderr)
            return []

        requests = cls._parse_request_list_internal(fin, what)

        fin.close()

        return requests


    @classmethod
    def _get_request_list_with_cache(cls, project, package, type, what):
        url = cls._get_request_list_url(project, package, type, what)
        if url is None:
            return []

        # download the data (cache for 10 minutes)
        if package:
            filename = '%s-%s-requests-%s.obs' % (project, package, type)
        else:
            filename = '%s-requests-%s.obs' % (project, type)
        max_age_seconds = 60 * 10

        file = OscCollabCache.get_from_obs(url, filename, max_age_seconds, what)

        if not file or not os.path.exists(file):
            return []

        return cls._parse_request_list_internal(file, what)


    @classmethod
    def _get_request_list(cls, project, package, type, use_cache):
        if package:
            what_helper = '%s/%s' % (project, package)
        else:
            what_helper = project
        if type == 'source':
            what = 'list of requests from %s' % what_helper
        elif type == 'target':
            what = 'list of requests to %s' % what_helper
        else:
            print('Internal error when getting request list: unknown type \"%s\".' % type, file=sys.stderr)
            return None

        if use_cache:
            return cls._get_request_list_with_cache(project, package, type, what)
        else:
            return cls._get_request_list_no_cache(project, package, type, what)


    @classmethod
    def get_request_list_from(cls, project, package=None, use_cache=True):
        return cls._get_request_list(project, package, 'source', use_cache)


    @classmethod
    def get_request_list_to(cls, project, package=None, use_cache=True):
        return cls._get_request_list(project, package, 'target', use_cache)


    @classmethod
    def get_request(cls, id):
        url = makeurl(cls.apiurl, ['request', id])

        try:
            fin = http_GET(url)
        except HTTPError as e:
            print('Cannot get request %s: %s' % (id, e.msg), file=sys.stderr)
            return None

        try:
            node = ET.parse(fin).getroot()
        except SyntaxError as e:
            fin.close()
            print('Cannot parse request %s: %s' % (id, e.msg), file=sys.stderr)
            return None

        fin.close()

        return OscCollabRequest(node)


    @classmethod
    def change_request_state(cls, id, new_state, message, superseded_by=None):
        if new_state != 'superseded':
            result = change_request_state(cls.apiurl, id, new_state, message)
        else:
            result = change_request_state(cls.apiurl, id, new_state, message, supersed=superseded_by)

        return result == 'ok'


    @classmethod
    def supersede_old_requests(cls, user, project, package, new_request_id):
        requests_to = cls.get_request_list_to(project, package, use_cache=False)
        old_ids = [ request.req_id for request in requests_to if request.by == user and str(request.req_id) != new_request_id ]
        for old_id in old_ids:
            cls.change_request_state(str(old_id), 'superseded', 'superseded by %s' % new_request_id, superseded_by=new_request_id)
        return old_ids


    @classmethod
    def branch_package(cls, project, package, no_devel_project = False):
        query = { 'cmd': 'branch' }
        if no_devel_project:
            query['ignoredevel'] = '1'

        url = makeurl(cls.apiurl, ['source', project, package], query = query)

        try:
            fin = http_POST(url)
        except HTTPError as e:
            print('Cannot branch package %s: %s' % (package, e.msg), file=sys.stderr)
            return (None, None)

        try:
            node = ET.parse(fin).getroot()
        except SyntaxError as e:
            fin.close()
            print('Cannot branch package %s: %s' % (package, e.msg), file=sys.stderr)
            return (None, None)

        fin.close()

        branch_project = None
        branch_package = None

        for data in node.findall('data'):
            name = data.get('name')
            if not name:
                continue
            if name == 'targetproject' and data.text:
                branch_project = data.text
            elif name == 'targetpackage' and data.text:
                branch_package = data.text

        return (branch_project, branch_package)


#######################################################################


class OscCollabApi:

    _api_url = 'https://osc-collab.opensuse.org/api'
    _supported_api = '0.2'
    _supported_api_major = '0'

    @classmethod
    def init(cls, collapiurl = None):
        if collapiurl:
            cls._api_url = collapiurl

    @classmethod
    def _append_data_to_url(cls, url, data):
        if url.find('?') != -1:
            return '%s&%s' % (url, data)
        else:
            return '%s?%s' % (url, data)


    @classmethod
    def _get_api_url_for(cls, api, project = None, projects = None, package = None, need_package_for_multiple_projects = True):
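        # Illustrative example: _get_api_url_for('info', project='GNOME:Factory')
        # yields https://osc-collab.opensuse.org/api/info/GNOME:Factory?version=0.2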
        if not project and len(projects) == 1:
            project = projects[0]

        items = [ cls._api_url, api ]
        if project:
            items.append(project)
        if package:
            items.append(package)
        url = '/'.join(items)

        if not project and (not need_package_for_multiple_projects or package) and projects:
            data = urlencode({'version': cls._supported_api, 'project': projects}, True)
            url = cls._append_data_to_url(url, data)
        else:
            data = urlencode({'version': cls._supported_api})
            url = cls._append_data_to_url(url, data)

        return url


    @classmethod
    def _get_info_url(cls, project = None, projects = None, package = None):
        return cls._get_api_url_for('info', project, projects, package, True)


    @classmethod
    def _get_reserve_url(cls, project = None, projects = None, package = None):
        return cls._get_api_url_for('reserve', project, projects, package, False)


    @classmethod
    def _get_comment_url(cls, project = None, projects = None, package = None):
        return cls._get_api_url_for('comment', project, projects, package, False)


    @classmethod
    def _get_root_for_url(cls, url, error_prefix, post_data = None, cache_file = None, cache_age = 10):
        if post_data and type(post_data) != dict:
            raise OscCollabWebError('%s: Internal error when posting data' % error_prefix)

        try:
            if cache_file and not post_data:
                fd = OscCollabCache.get_url_fd_with_cache(url, cache_file, cache_age)
            else:
                if post_data:
                    data = urlencode(post_data).encode('utf-8')
                else:
                    data = None
                fd = urlopen(url, data)
        except HTTPError as e:
            raise OscCollabWebError('%s: %s' % (error_prefix, e.msg))

        try:
            root = ET.parse(fd).getroot()
        except SyntaxError:
            raise OscCollabWebError('%s: malformed reply from server.' % error_prefix)

        if root.tag != 'api' or not root.get('version'):
            raise OscCollabWebError('%s: invalid reply from server.' % error_prefix)

        version = root.get('version')
        version_items = version.split('.')
        for item in version_items:
            try:
                int(item)
            except ValueError:
                raise OscCollabWebError('%s: unknown protocol used by server.' % error_prefix)
        protocol = int(version_items[0])
        if protocol != int(cls._supported_api_major):
            raise OscCollabWebError('%s: unknown protocol used by server.' % error_prefix)

        result = root.find('result')
        if result is None or not result.get('ok'):
            raise OscCollabWebError('%s: reply from server with no result summary.' % error_prefix)

        if result.get('ok') != 'true':
            if result.text:
                raise OscCollabWebError('%s: %s' % (error_prefix, result.text))
            else:
                raise OscCollabWebError('%s: unknown error in the request.' % error_prefix)

        return root


    @classmethod
    def _meta_append_no_devel_project(cls, url, no_devel_project):
        if not no_devel_project:
            return url

        data = urlencode({'ignoredevel': 'true'})
        return cls._append_data_to_url(url, data)


    @classmethod
    def _parse_reservation_node(cls, node):
        reservation = OscCollabReservation(node = node)
        if not reservation.project or not reservation.package:
            return None

        return reservation


    @classmethod
    def get_reserved_packages(cls, projects):
        url = cls._get_reserve_url(projects = projects)
        root = cls._get_root_for_url(url, 'Cannot get list of reserved packages')

        reserved_packages = []
        for reservation in root.findall('reservation'):
            item = cls._parse_reservation_node(reservation)
            if item is None or not item.user:
                continue
            reserved_packages.append(item)

        return reserved_packages


    @classmethod
    def is_package_reserved(cls, projects, package, no_devel_project = False):
        '''
            Only returns something if the package is really reserved.
        '''
        url = cls._get_reserve_url(projects = projects, package = package)
        url = cls._meta_append_no_devel_project(url, no_devel_project)
        root = cls._get_root_for_url(url, 'Cannot look if package %s is reserved' % package)

        for reservation in root.findall('reservation'):
            item = cls._parse_reservation_node(reservation)
            if item is None or (no_devel_project and not item.is_relevant(projects, package)):
                continue
            if not item.user:
                # We continue to make sure there are no other relevant entries
                continue
            return item

        return None


    @classmethod
    def reserve_package(cls, projects, package, username, no_devel_project = False):
        url = cls._get_reserve_url(projects = projects, package = package)
        data = urlencode({'cmd': 'set', 'user': username})
        url = cls._append_data_to_url(url, data)
        url = cls._meta_append_no_devel_project(url, no_devel_project)
        root = cls._get_root_for_url(url, 'Cannot reserve package %s' % package)

        for reservation in root.findall('reservation'):
            item = cls._parse_reservation_node(reservation)
            if not item or (no_devel_project and not item.is_relevant(projects, package)):
                continue
            if not item.user:
                raise OscCollabWebError('Cannot reserve package %s: unknown error' % package)
            if item.user != username:
                raise OscCollabWebError('Cannot reserve package %s: already reserved by %s' % (package, item.user))
            return item


    @classmethod
    def unreserve_package(cls, projects, package, username, no_devel_project = False):
        url = cls._get_reserve_url(projects = projects, package = package)
        data = urlencode({'cmd': 'unset', 'user': username})
        url = cls._append_data_to_url(url, data)
        url = cls._meta_append_no_devel_project(url, no_devel_project)
        root = cls._get_root_for_url(url, 'Cannot unreserve package %s' % package)

        for reservation in root.findall('reservation'):
            item = cls._parse_reservation_node(reservation)
            if not item or (no_devel_project and not item.is_relevant(projects, package)):
                continue
            if item.user:
                raise OscCollabWebError('Cannot unreserve package %s: reserved by %s' % (package, item.user))
            return item


    @classmethod
    def _parse_comment_node(cls, node):
        comment = OscCollabComment(node = node)
        if not comment.project or not comment.package:
            return None

        return comment


    @classmethod
    def get_commented_packages(cls, projects):
        url = cls._get_comment_url(projects = projects)
        root = cls._get_root_for_url(url, 'Cannot get list of commented packages')

        commented_packages = []
        for comment in root.findall('comment'):
            item = cls._parse_comment_node(comment)
            if item is None or not item.user or not item.comment:
                continue
            commented_packages.append(item)

        return commented_packages


    @classmethod
    def get_package_comment(cls, projects, package, no_devel_project = False):
        '''
            Only returns something if the package is really commented.
        '''
        url = cls._get_comment_url(projects = projects, package = package)
        url = cls._meta_append_no_devel_project(url, no_devel_project)
        root = cls._get_root_for_url(url, 'Cannot look if package %s is commented' % package)

        for comment in root.findall('comment'):
            item = cls._parse_comment_node(comment)
            if item is None or (no_devel_project and not item.is_relevant(projects, package)):
                continue
            if not item.user or not item.comment:
                # We continue to make sure there are no other relevant entries
                continue
            return item

        return None


    @classmethod
    def set_package_comment(cls, projects, package, username, comment, no_devel_project = False):
        if not comment:
            raise OscCollabWebError('Cannot set comment on package %s: empty comment' % package)

        url = cls._get_comment_url(projects = projects, package = package)
        data = urlencode({'cmd': 'set', 'user': username})
        url = cls._append_data_to_url(url, data)
        url = cls._meta_append_no_devel_project(url, no_devel_project)
        root = cls._get_root_for_url(url, 'Cannot set comment on package %s' % package, post_data = {'comment': comment})

        for comment in root.findall('comment'):
            item = cls._parse_comment_node(comment)
            if not item or (no_devel_project and not item.is_relevant(projects, package)):
                continue
            if not item.user:
                raise OscCollabWebError('Cannot set comment on package %s: unknown error' % package)
            if item.user != username:
                raise OscCollabWebError('Cannot set comment on package %s: already commented by %s' % (package, item.user))
            return item


    @classmethod
    def unset_package_comment(cls, projects, package, username, no_devel_project = False):
        url = cls._get_comment_url(projects = projects, package = package)
        data = urlencode({'cmd': 'unset', 'user': username})
        url = cls._append_data_to_url(url, data)
        url = cls._meta_append_no_devel_project(url, no_devel_project)
        root = cls._get_root_for_url(url, 'Cannot unset comment on package %s' % package)

        for comment in root.findall('comment'):
            item = cls._parse_comment_node(comment)
            if not item or (no_devel_project and not item.is_relevant(projects, package)):
                continue
            if item.user:
                raise OscCollabWebError('Cannot unset comment on package %s: commented by %s' % (package, item.user))
            return item


    @classmethod
    def _parse_package_node(cls, node, project):
        package = OscCollabPackage(node, project)
        if not package.name:
            return None

        if project is not None:
            project[package.name] = package

        return package


    @classmethod
    def _parse_missing_package_node(cls, node, project):
        name = node.get('name')
        parent_project = node.get('parent_project')
        parent_package = node.get('parent_package') or name

        if not name or not parent_project:
            return

        project.missing_packages.append((name, parent_project, parent_package))


    @classmethod
    def _parse_project_node(cls, node):
        project = OscCollabProject(node)
        if not project.name:
            return None

        for package in node.findall('package'):
            cls._parse_package_node(package, project)

        missing = node.find('missing')
        if missing is not None:
            for package in missing.findall('package'):
                cls._parse_missing_package_node(package, project)

        return project


    @classmethod
    def get_project_details(cls, project):
        url = cls._get_info_url(project = project)
        root = cls._get_root_for_url(url, 'Cannot get information of project %s' % project, cache_file = project + '.xml')

        for node in root.findall('project'):
            item = cls._parse_project_node(node)
            if item is None or item.name != project:
                continue
            return item

        return None


    @classmethod
    def get_package_details(cls, projects, package):
        url = cls._get_info_url(projects = projects, package = package)
        root = cls._get_root_for_url(url, 'Cannot get information of package %s' % package)

        for node in root.findall('project'):
            item = cls._parse_project_node(node)
            if item is None or item.name not in projects:
                continue

            pkgitem = item[package]
            if pkgitem:
                return pkgitem

        return None


#######################################################################


class OscCollabCache:

    _cache_dir = None
    _ignore_cache = False
    _printed = False

    @classmethod
    def init(cls, ignore_cache):
        cls._ignore_cache = ignore_cache
        cls._cleanup_old_cache()


    @classmethod
    def _print_message(cls):
        if not cls._printed:
            cls._printed = True
            print('Downloading data into a cache. This might take a few seconds...')


    @classmethod
    def _get_xdg_cache_home(cls):
        dir = None
        if 'XDG_CACHE_HOME' in os.environ:
            dir = os.environ['XDG_CACHE_HOME']
            if dir == '':
                dir = None

        if not dir:
            dir = '~/.cache'

        return os.path.expanduser(dir)


    @classmethod
    def _get_xdg_cache_dir(cls):
        if not cls._cache_dir:
            cls._cache_dir = os.path.join(cls._get_xdg_cache_home(), 'osc', 'collab')

        return cls._cache_dir


    @classmethod
    def _cleanup_old_cache(cls):
        '''
            Remove cache files that are too old to be useful; they would be
            refreshed anyway.
        '''
        gnome_cache_dir = os.path.join(cls._get_xdg_cache_home(), 'osc', 'gnome')
        if os.path.exists(gnome_cache_dir):
            shutil.rmtree(gnome_cache_dir)

        cache_dir = cls._get_xdg_cache_dir()
        if not os.path.exists(cache_dir):
            return

        for file in os.listdir(cache_dir):
            # remove if it's more than 5 days old
            if cls._need_update(file, 60 * 60 * 24 * 5):
                cache = os.path.join(cache_dir, file)
                os.unlink(cache)


    @classmethod
    def _need_update(cls, filename, maxage):
        if cls._ignore_cache:
            return True

        cache = os.path.join(cls._get_xdg_cache_dir(), filename)

        if not os.path.exists(cache):
            return True

        if not os.path.isfile(cache):
            return True

        stats = os.stat(cache)

        now = time.time()
        if now - stats.st_mtime > maxage:
            return True
        # Back to the future?
        elif now < stats.st_mtime:
            return True

        return False


    @classmethod
    def get_url_fd_with_cache(cls, url, filename, max_age_minutes):
        if cls._need_update(filename, max_age_minutes * 60):
            # no cache available
            cls._print_message()
            fd = urlopen(url)
            cls._write(filename, fin = fd)

        return open(os.path.join(cls._get_xdg_cache_dir(), filename))


    @classmethod
    def get_from_obs(cls, url, filename, max_age_seconds, what):
        cache = os.path.join(cls._get_xdg_cache_dir(), filename)

        if not cls._need_update(filename, max_age_seconds):
            return cache

        # no cache available
        cls._print_message()

        try:
            fin = http_GET(url)
        except HTTPError as e:
            print('Cannot get %s: %s' % (what, e.msg), file=sys.stderr)
            return None

        fout = open(cache, 'w')

        while True:
            try:
                data = fin.read(500 * 1024)
                if len(data) == 0:
                    break
                fout.write(data.decode('utf-8'))
            except HTTPError as e:
                fin.close()
                fout.close()
                os.unlink(cache)
                print('Error while downloading %s: %s' % (what, e.msg), file=sys.stderr)
                return None

        fin.close()
        fout.close()

        return cache


    @classmethod
    def _write(cls, filename, fin = None):
        if not fin:
            print('Internal error when saving a cache: no data.', file=sys.stderr)
            return False

        cachedir = cls._get_xdg_cache_dir()
        if not os.path.exists(cachedir):
            os.makedirs(cachedir)

        if not os.path.isdir(cachedir):
            print('Cache directory %s is not a directory.' % cachedir, file=sys.stderr)
            return False

        cache = os.path.join(cachedir, filename)
        if os.path.exists(cache):
            os.unlink(cache)
        fout = open(cache, 'w')

        if fin:
            while True:
                try:
                    data = fin.read(500 * 1024)
                    if len(data) == 0:
                        break
                    fout.write(data.decode('utf-8'))
                except HTTPError as e:
                    fout.close()
                    os.unlink(cache)
                    raise e
            fout.close()
            return True


#######################################################################


def _collab_is_program_in_path(program):
    if 'PATH' not in os.environ:
        return False

    for path in os.environ['PATH'].split(':'):
        if os.path.exists(os.path.join(path, program)):
            return True

    return False


#######################################################################


def _collab_find_request_to(package, requests):
    for request in requests:
        if request.target_package == package:
            return request
    return None


def _collab_has_request_from(package, requests):
    for request in requests:
        if request.source_package == package:
            return True
    return False


#######################################################################


def _collab_table_get_maxs(init, items):
    if len(items) == 0:
        return ()

    nb_maxs = len(init)
    maxs = []
    for i in range(nb_maxs):
        maxs.append(len(init[i]))

    for item in items:
        for i in range(nb_maxs):
            maxs[i] = max(maxs[i], len(item[i]))

    return tuple(maxs)


def _collab_table_get_template(*args):
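    # Build a printf-style row template from column widths, e.g. widths
    # (10, 5) give the template '%-10.10s | %-5.5s'.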
    if len(args) == 0:
        return ''

    template = '%%-%d.%ds' % (args[0], args[0])
    index = 1

    while index < len(args):
        template = template + (' | %%-%d.%ds' % (args[index], args[index]))
        index = index + 1

    return template


def _collab_table_print_header(template, title):
    if len(title) == 0:
        return

    dash_template = template.replace(' | ', '-+-')

    very_long_dash = ('--------------------------------------------------------------------------------',)
    dashes = ()
    for i in range(len(title)):
        dashes = dashes + very_long_dash

    print(template % title)
    print(dash_template % dashes)


#######################################################################


def _collab_todo_internal(apiurl, all_reserved, all_commented, project, show_details, exclude_commented, exclude_reserved, exclude_submitted, exclude_devel):
    # get all versions of packages
    try:
        prj = OscCollabApi.get_project_details(project)
        prj.strip_internal_links()
    except OscCollabWebError as e:
        print(e.msg, file=sys.stderr)
        return (None, None)

    # get the list of reserved packages for this project
    reserved_packages = [ reservation.package for reservation in all_reserved if reservation.project == project ]

    # get the list of commented packages for this project
    firstline_comments = {}
    for comment in all_commented:
        if comment.project == project:
            firstline_comments[comment.package] = comment.firstline
    commented_packages = firstline_comments.keys()

    # get the packages submitted
    requests_to = OscCollabObs.get_request_list_to(project)

    parent_project = None
    packages = []

    for package in prj.values():
        if not package.needs_update() and package.name not in commented_packages:
            continue

        broken_link = package.is_broken_link()

        if package.parent_version or package.is_link:
            package.parent_version_print = package.parent_version or ''
        elif broken_link:
            # this can happen if the link is to a project that doesn't exist
            # anymore
            package.parent_version_print = '??'
        else:
            package.parent_version_print = '--'

        if package.version:
            package.version_print = package.version
        elif broken_link:
            package.version_print = '(broken)'
        else:
            package.version_print = '??'

        package.upstream_version_print = package.upstream_version or ''
        package.comment = ''

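        # Annotate versions with status markers: (c) commented, (d) devel
        # project already up to date, (s) submission pending, (r) reserved.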
        if package.name in commented_packages:
            if exclude_commented:
                continue
            if not show_details:
                package.version_print += ' (c)'
                package.upstream_version_print += ' (c)'
            package.comment = firstline_comments[package.name]

        if not package.devel_needs_update():
            if exclude_devel:
                continue
            package.version_print += ' (d)'
            package.upstream_version_print += ' (d)'

        if _collab_find_request_to(package.name, requests_to) != None:
            if exclude_submitted:
                continue
            package.version_print += ' (s)'
            package.upstream_version_print += ' (s)'

        if package.name in reserved_packages:
            if exclude_reserved:
                continue
            package.upstream_version_print += ' (r)'

        package.upstream_version_print = package.upstream_version_print.strip()

        if package.parent_project:
            if parent_project == None:
                parent_project = package.parent_project
            elif parent_project != package.parent_project:
                parent_project = 'Parent Project'

        packages.append(package)


    return (parent_project, packages)


#######################################################################


def _collab_todo(apiurl, projects, show_details, ignore_comments, exclude_commented, exclude_reserved, exclude_submitted, exclude_devel):
    packages = []
    parent_project = None

    # get the list of reserved packages
    try:
        reserved = OscCollabApi.get_reserved_packages(projects)
    except OscCollabWebError as e:
        reserved = []
        print(e.msg, file=sys.stderr)

    # get the list of commented packages
    commented = []
    if not ignore_comments:
        try:
            commented = OscCollabApi.get_commented_packages(projects)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)

    for project in projects:
        (new_parent_project, project_packages) = _collab_todo_internal(apiurl, reserved, commented, project, show_details, exclude_commented, exclude_reserved, exclude_submitted, exclude_devel)
        if not project_packages:
            continue
        packages.extend(project_packages)

        if parent_project == None:
            parent_project = new_parent_project
        elif parent_project != new_parent_project:
            parent_project = 'Parent Project'

    if len(packages) == 0:
        print('Nothing to do.')
        return

    show_comments = not (ignore_comments or exclude_commented) and show_details

    if show_comments:
        lines = [ (package.name, package.parent_version_print, package.version_print, package.upstream_version_print, package.comment) for package in packages ]
    else:
        lines = [ (package.name, package.parent_version_print, package.version_print, package.upstream_version_print) for package in packages ]

    # the first element in each tuple is the package name, so a plain sort
    # orders the lines by package name, which is what we want
    lines.sort()

    if len(projects) == 1:
        project_header = projects[0]
    else:
        project_header = "Devel Project"

    # print headers
    if show_comments:
        if parent_project:
            title = ('Package', parent_project, project_header, 'Upstream', 'Comment')
            (max_package, max_parent, max_devel, max_upstream, max_comment) = _collab_table_get_maxs(title, lines)
        else:
            title = ('Package', project_header, 'Upstream', 'Comment')
            (max_package, max_devel, max_upstream, max_comment) = _collab_table_get_maxs(title, lines)
            max_parent = 0
    else:
        if parent_project:
            title = ('Package', parent_project, project_header, 'Upstream')
            (max_package, max_parent, max_devel, max_upstream) = _collab_table_get_maxs(title, lines)
        else:
            title = ('Package', project_header, 'Upstream')
            (max_package, max_devel, max_upstream) = _collab_table_get_maxs(title, lines)
            max_parent = 0
        max_comment = 0

    # trim to a reasonable max
    max_package = min(max_package, 48)
    max_version = min(max(max(max_parent, max_devel), max_upstream), 20)
    max_comment = min(max_comment, 48)

    if show_comments:
        if parent_project:
            print_line = _collab_table_get_template(max_package, max_version, max_version, max_version, max_comment)
        else:
            print_line = _collab_table_get_template(max_package, max_version, max_version, max_comment)
    else:
        if parent_project:
            print_line = _collab_table_get_template(max_package, max_version, max_version, max_version)
        else:
            print_line = _collab_table_get_template(max_package, max_version, max_version)

    _collab_table_print_header(print_line, title)

    for line in lines:
        if not parent_project:
            if show_comments:
                (package, parent_version, devel_version, upstream_version, comment) = line
                line = (package, devel_version, upstream_version, comment)
            else:
                (package, parent_version, devel_version, upstream_version) = line
                line = (package, devel_version, upstream_version)
        print(print_line % line)


#######################################################################


def _collab_todoadmin_internal(apiurl, project, include_upstream):
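    '''Compute the todoadmin lines for one project.

       Returns a sorted list of (project, package, message) tuples describing
       actions that look needed: requests to review, links to fix or to
       submit, broken links, missing devel packages, etc.'''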

    try:
        prj = OscCollabApi.get_project_details(project)
        prj.strip_internal_links()
    except OscCollabWebError as e:
        print(e.msg, file=sys.stderr)
        return []

    # get the packages submitted to/from
    requests_to = OscCollabObs.get_request_list_to(project)
    requests_from = OscCollabObs.get_request_list_from(project)

    lines = []

    for package in prj.values():
        message = None

        # We look for all possible messages. A later message overwrites an
        # earlier one, so we start with the least important ones.

        if include_upstream:
            if not package.upstream_version:
                message = 'No upstream data available'
            elif not package.upstream_url:
                message = 'No URL for upstream tarball available'

        if package.has_delta:
            # FIXME: we should check the request is to the parent project
            if not _collab_has_request_from(package.name, requests_from):
                if not package.is_link:
                    message = 'Is not a link to %s and has a delta (synchronize the packages)' % package.project.parent
                elif not package.project.is_toplevel():
                    message = 'Needs to be submitted to %s' % package.parent_project
                else:
                    # packages in a toplevel project don't necessarily have to
                    # be submitted
                    message = 'Is a link with delta (maybe submit changes to %s)' % package.parent_project

        request = _collab_find_request_to(package.name, requests_to)
        if request is not None:
            message = 'Needs to be reviewed (request id: %s)' % request.req_id

        if package.error:
            if package.error == 'not-link':
                # if package has a delta, then we already set a message above
                if not package.has_delta:
                    message = 'Is not a link to %s (make link)' % package.project.parent
            elif package.error == 'not-link-not-in-parent':
                message = 'Is not a link, and is not in %s (maybe submit it)' % package.project.parent
            elif package.error == 'not-in-parent':
                message = 'Broken link: does not exist in %s' % package.parent_project
            elif package.error == 'need-merge-with-parent':
                message = 'Broken link: requires a manual merge with %s' % package.parent_project
            elif package.error == 'not-real-devel':
                message = 'Should not exist: %s' % package.error_details
            elif package.error == 'parent-without-devel':
                message = 'No devel project set for parent (%s/%s)' % (package.parent_project, package.parent_package)
            else:
                if package.error_details:
                    message = 'Unknown error (%s): %s' % (package.error, package.error_details)
                else:
                    message = 'Unknown error (%s)' % package.error

        if message:
            lines.append((project, package.name, message))


    for (package, parent_project, parent_package) in prj.missing_packages:
        message = 'Does not exist, but is devel package for %s/%s' % (parent_project, parent_package)
        lines.append((project, package, message))


    lines.sort()

    return lines


#######################################################################


def _collab_todoadmin(apiurl, projects, include_upstream):
    lines = []

    for project in projects:
        project_lines = _collab_todoadmin_internal(apiurl, project, include_upstream)
        lines.extend(project_lines)

    if len(lines) == 0:
        print('Nothing to do.')
        return

    # the first element in the tuples is the project name (followed by the
    # package name), so sorting the tuples orders the lines the way we want
    lines.sort()

    # print headers
    title = ('Project', 'Package', 'Details')
    (max_project, max_package, max_details) = _collab_table_get_maxs(title, lines)
    # trim to a reasonable max
    max_project = min(max_project, 28)
    max_package = min(max_package, 48)
    max_details = min(max_details, 65)

    print_line = _collab_table_get_template(max_project, max_package, max_details)
    _collab_table_print_header(print_line, title)
    for line in lines:
        print(print_line % line)


#######################################################################


def _collab_listreserved(projects):
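    '''Print a table of all reserved packages in the given projects.'''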
    try:
        reserved_packages = OscCollabApi.get_reserved_packages(projects)
    except OscCollabWebError as e:
        print(e.msg, file=sys.stderr)
        return

    if len(reserved_packages) == 0:
        print('No package reserved.')
        return

    # print headers
    # if changing the order here, then we need to change __getitem__ of
    # Reservation in the same way
    title = ('Project', 'Package', 'Reserved by')
    (max_project, max_package, max_username) = _collab_table_get_maxs(title, reserved_packages)
    # trim to a reasonable max
    max_project = min(max_project, 28)
    max_package = min(max_package, 48)
    max_username = min(max_username, 28)

    print_line = _collab_table_get_template(max_project, max_package, max_username)
    _collab_table_print_header(print_line, title)

    for reservation in reserved_packages:
        if reservation.user:
            print(print_line % (reservation.project, reservation.package, reservation.user))


#######################################################################


def _collab_isreserved(projects, packages, no_devel_project = False):
    for package in packages:
        try:
            reservation = OscCollabApi.is_package_reserved(projects, package, no_devel_project = no_devel_project)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            continue

        if not reservation:
            print('Package %s is not reserved.' % package)
        else:
            if reservation.project not in projects or reservation.package != package:
                print('Package %s in %s (devel package for %s) is reserved by %s.' % (reservation.package, reservation.project, package, reservation.user))
            else:
                print('Package %s in %s is reserved by %s.' % (package, reservation.project, reservation.user))


#######################################################################


def _collab_reserve(projects, packages, username, no_devel_project = False):
    for package in packages:
        try:
            reservation = OscCollabApi.reserve_package(projects, package, username, no_devel_project = no_devel_project)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            continue

        if reservation.project not in projects or reservation.package != package:
            print('Package %s in %s (devel package for %s) reserved for 36 hours.' % (reservation.package, reservation.project, package))
        else:
            print('Package %s reserved for 36 hours.' % package)
        print('Do not forget to unreserve the package when done with it:')
        print('    osc %s unreserve %s' % (_osc_collab_alias, package))


#######################################################################


def _collab_unreserve(projects, packages, username, no_devel_project = False):
    for package in packages:
        try:
            OscCollabApi.unreserve_package(projects, package, username, no_devel_project = no_devel_project)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            continue

        print('Package %s unreserved.' % package)


#######################################################################


def _collab_listcommented(projects):
    try:
        commented_packages = OscCollabApi.get_commented_packages(projects)
    except OscCollabWebError as e:
        print(e.msg, file=sys.stderr)
        return

    if len(commented_packages) == 0:
        print('No package commented.')
        return

    # print headers
    # if changing the order here, then we need to change __getitem__ of
    # Comment in the same way
    title = ('Project', 'Package', 'Commented by', 'Comment')
    (max_project, max_package, max_username, max_comment) = _collab_table_get_maxs(title, commented_packages)
    # trim to a reasonable max
    max_project = min(max_project, 28)
    max_package = min(max_package, 48)
    max_username = min(max_username, 28)
    max_comment = min(max_comment, 65)

    print_line = _collab_table_get_template(max_project, max_package, max_username, max_comment)
    _collab_table_print_header(print_line, title)

    for comment in commented_packages:
        if comment.user:
            print(print_line % (comment.project, comment.package, comment.user, comment.firstline))


#######################################################################


def _collab_comment(projects, packages, no_devel_project = False):
    for package in packages:
        try:
            comment = OscCollabApi.get_package_comment(projects, package, no_devel_project = no_devel_project)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            continue

        if not comment:
            print('Package %s is not commented.' % package)
        else:
            if comment.date:
                date_str = ' on %s' % comment.date
            else:
                date_str = ''

            if comment.project not in projects or comment.package != package:
                print('Package %s in %s (devel package for %s) is commented by %s%s:' % (comment.package, comment.project, package, comment.user, date_str))
                print(comment.indent())
            else:
                print('Package %s in %s is commented by %s%s:' % (package, comment.project, comment.user, date_str))
                print(comment.indent())


#######################################################################


def _collab_commentset(projects, package, username, comment, no_devel_project = False):
    try:
        comment = OscCollabApi.set_package_comment(projects, package, username, comment, no_devel_project = no_devel_project)
    except OscCollabWebError as e:
        print(e.msg, file=sys.stderr)
        return

    if comment.project not in projects or comment.package != package:
        print('Comment on package %s in %s (devel package for %s) set.' % (comment.package, comment.project, package))
    else:
        print('Comment on package %s set.' % package)
    print('Do not forget to unset comment on the package when done with it:')
    print('    osc %s commentunset %s' % (_osc_collab_alias, package))


#######################################################################


def _collab_commentunset(projects, packages, username, no_devel_project = False):
    for package in packages:
        try:
            OscCollabApi.unset_package_comment(projects, package, username, no_devel_project = no_devel_project)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            continue

        print('Comment on package %s unset.' % package)


#######################################################################


def _collab_setup_internal(apiurl, username, pkg, ignore_reserved = False, no_reserve = False, no_devel_project = False, no_branch = False):
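    '''Prepare a package so the user can work on it.

       This resolves the development package (unless no_devel_project is
       set), checks and takes the reservation (unless no_reserve is set),
       branches the package (unless no_branch is set), checks the branch out
       or updates an existing checkout, and removes old helper files.

       Returns a (success, branch_project, branch_package) tuple.'''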
    if not no_devel_project:
        initial_pkg = pkg
        while pkg.devel_project:
            previous_pkg = pkg
            try:
                pkg = OscCollabApi.get_package_details(pkg.devel_project, pkg.devel_package or pkg.name)
            except OscCollabWebError as e:
                pkg = None

            if not pkg:
                print('Cannot find information on %s/%s (development package for %s/%s). You can use --nodevelproject to ignore the development package.' % (previous_pkg.project.name, previous_pkg.name, initial_pkg.project.name, initial_pkg.name), file=sys.stderr)
                break

        if not pkg:
            return (False, None, None)

        if initial_pkg != pkg:
            print('Using development package %s/%s for %s/%s.' % (pkg.project.name, pkg.name, initial_pkg.project.name, initial_pkg.name))

    project = pkg.project.name
    package = pkg.name

    checkout_dir = package

    # Is it reserved? Note that we have already looked for the devel project,
    # so we force the project/package here.
    try:
        reservation = OscCollabApi.is_package_reserved((project,), package, no_devel_project = True)
        if reservation:
            reserved_by = reservation.user
        else:
            reserved_by = None
    except OscCollabWebError as e:
        print(e.msg, file=sys.stderr)
        return (False, None, None)

    # package already reserved, but not by us
    if reserved_by and reserved_by != username:
        if not ignore_reserved:
            print('Package %s is already reserved by %s.' % (package, reserved_by))
            return (False, None, None)
        else:
            print('WARNING: package %s is already reserved by %s.' % (package, reserved_by))
    # package not reserved
    elif not reserved_by and not no_reserve:
        try:
            # Note that we have already looked for the devel project, so we
            # force the project/package here.
            OscCollabApi.reserve_package((project,), package, username, no_devel_project = True)
            print('Package %s has been reserved for 36 hours.' % package)
            print('Do not forget to unreserve the package when done with it:')
            print('    osc %s unreserve %s' % (_osc_collab_alias, package))
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            if not ignore_reserved:
                return (False, None, None)

    if not no_branch:
        # look if we already have a branch, and if not branch the package
        try:
            expected_branch_project = 'home:%s:branches:%s' % (username, project)
            show_package_meta(apiurl, expected_branch_project, package)
            branch_project = expected_branch_project
            branch_package = package
            # it worked, we already have the branch
        except HTTPError as e:
            if e.code != 404:
                print('Error while checking if package %s was already branched: %s' % (package, e.msg), file=sys.stderr)
                return (False, None, None)

            # We had a 404: it means the branched package doesn't exist yet
            (branch_project, branch_package) = OscCollabObs.branch_package(project, package, no_devel_project)
            if not branch_project or not branch_package:
                print('Error while branching package %s: incomplete reply from build service' % (package,), file=sys.stderr)
                return (False, None, None)

            if package != branch_package:
                print('Package %s has been branched in %s/%s.' % (package, branch_project, branch_package))
            else:
                print('Package %s has been branched in project %s.' % (branch_package, branch_project))
    else:
        branch_project = project
        branch_package = package

    checkout_dir = branch_package

    # check out the branched package
    if os.path.exists(checkout_dir):
        # maybe we already checked it out before?
        if not os.path.isdir(checkout_dir):
            print('File %s already exists but is not a directory.' % checkout_dir, file=sys.stderr)
            return (False, None, None)
        elif not is_package_dir(checkout_dir):
            print('Directory %s already exists but is not a checkout of a Build Service package.' % checkout_dir, file=sys.stderr)
            return (False, None, None)

        obs_package = filedir_to_pac(checkout_dir)
        if obs_package.name != branch_package or obs_package.prjname != branch_project:
            print('Directory %s already exists but is a checkout of package %s from project %s.' % (checkout_dir, obs_package.name, obs_package.prjname), file=sys.stderr)
            return (False, None, None)

        if _collab_osc_package_pending_commit(obs_package):
            print('Directory %s contains some uncommitted changes.' % (checkout_dir,), file=sys.stderr)
            return (False, None, None)

        # update the package
        try:
            # we specify the revision so that it gets expanded
            # the logic comes from do_update in commandline.py
            rev = None
            if obs_package.islink() and not obs_package.isexpanded():
                rev = obs_package.linkinfo.xsrcmd5
            elif obs_package.islink() and obs_package.isexpanded():
                rev = show_upstream_xsrcmd5(apiurl, branch_project, branch_package)

            obs_package.update(rev)
            print('Package %s has been updated.' % branch_package)
        except Exception as e:
            message = 'Error while updating package %s: ' % branch_package
            _collab_exception_print(e, message)
            return (False, None, None)

    else:
        # check out the branched package
        try:
            # disable package tracking: the current directory might not be a
            # project directory
            # Rationale: for new contributors, checking out in the current
            # directory is easier as it hides some complexity. However, this
            # results in possibly mixing packages from different projects,
            # which makes package tracking not work at all.
            old_tracking = conf.config['do_package_tracking']
            conf.config['do_package_tracking'] = _collab_get_config_bool(apiurl, 'collab_do_package_tracking', default = False)
            checkout_package(apiurl, branch_project, branch_package, expand_link=True)
            conf.config['do_package_tracking'] = old_tracking
            print('Package %s has been checked out.' % branch_package)
        except Exception as e:
            message = 'Error while checking out package %s: ' % branch_package
            _collab_exception_print(e, message)
            return (False, None, None)

    # remove old helper files
    for file in os.listdir(checkout_dir):
        for helper in _osc_collab_helpers:
            if file == helper:
                path = os.path.join(checkout_dir, file)
                os.unlink(path)
                break

    return (True, branch_project, branch_package)


#######################################################################


def _collab_get_package_with_valid_project(projects, package):
    try:
        pkg = OscCollabApi.get_package_details(projects, package)
    except OscCollabWebError as e:
        pkg = None

    if pkg is None or pkg.project is None or not pkg.project.name:
        print('Cannot find an appropriate project containing %s. You can use --project to override your project settings.' % package, file=sys.stderr)
        return None

    return pkg


#######################################################################


def _print_comment_after_setup(pkg, no_devel_project):
    comment = OscCollabApi.get_package_comment(pkg.project.name, pkg.name, no_devel_project = no_devel_project)
    if comment:
        if comment.date:
            date_str = ' on %s' % comment.date
        else:
            date_str = ''

        print('Note the comment from %s%s on this package:' % (comment.user, date_str))
        print(comment.indent())


#######################################################################


def _collab_setup(apiurl, username, projects, package, ignore_reserved = False, ignore_comment = False, no_reserve = False, no_devel_project = False, no_branch = False):
    pkg = _collab_get_package_with_valid_project(projects, package)
    if not pkg:
        return
    project = pkg.project.name

    (setup, branch_project, branch_package) = _collab_setup_internal(apiurl, username, pkg, ignore_reserved, no_reserve, no_devel_project, no_branch)
    if not setup:
        return
    print('Package %s has been prepared for work.' % branch_package)

    if not ignore_comment:
        _print_comment_after_setup(pkg, no_devel_project)


#######################################################################


def _collab_download_internal(url, dest_dir):
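    '''Download url into dest_dir and return the path of the downloaded
       file. Raises OscCollabDownloadError if no basename can be deduced
       from the URL or if the download fails.'''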
    if not os.path.exists(dest_dir):
        os.makedirs(dest_dir)

    parsed_url = urlparse(url)
    basename = os.path.basename(parsed_url.path)
    if not basename:
        # FIXME: workaround until we get an upstream_basename property for
        # each package (it would be needed for upstream hosted on sf, anyway).
        # Currently needed for mkvtoolnix.
        for field in parsed_url.query.split('&'):
            try:
                (key, value) = field.split('=', 1)
            except ValueError:
                value = field
            if value.endswith('.gz') or value.endswith('.tgz') or value.endswith('.bz2'):
                basename = os.path.basename(value)

    if not basename:
        raise OscCollabDownloadError('Cannot download %s: no basename in URL.' % url)

    dest_file = os.path.join(dest_dir, basename)
    # we download the file again if it already exists, since the upstream
    # tarball might have changed in the meantime. We could add an option to
    # avoid this, but it likely won't happen often anyway.
    if os.path.exists(dest_file):
        os.unlink(dest_file)

    try:
        fin = urlopen(url)
    except HTTPError as e:
        raise OscCollabDownloadError('Cannot download %s: %s' % (url, e.msg))

    fout = open(dest_file, 'wb')

    while True:
        try:
            data = fin.read(500 * 1024)
            if len(data) == 0:
                break
            fout.write(data)
        except HTTPError as e:
            fin.close()
            fout.close()
            os.unlink(dest_file)
            raise OscCollabDownloadError('Error while downloading %s: %s' % (url, e.msg))

    fin.close()
    fout.close()

    return dest_file


#######################################################################


def _collab_extract_diff_internal(directory, old_tarball, new_tarball):
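    '''Extract NEWS, ChangeLog, configure.{ac,in}, meson.build and
       meson_options.txt from the old and new tarballs, and create for each
       of them a helper file in directory containing either a useful diff or
       a full copy.

       Returns a (path, created, is_diff) triplet for each of the five
       files, flattened into a single tuple.'''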
    def _cleanup(old, new, tmpdir):
        if old:
            old.close()
        if new:
            new.close()
        shutil.rmtree(tmpdir)

    def _lzma_hack(filename, tmpdir):
        if not filename.endswith('.xz'):
            return filename

        dest = os.path.join(tmpdir, os.path.basename(filename))
        shutil.copyfile(filename, dest)
        subprocess.call(['xz', '-d', dest])

        return dest[:-3]

    # we need to make sure it's safe to extract the file
    # see the warning in http://www.python.org/doc/lib/tarfile-objects.html
    def _can_extract_with_trust(name):
        if not name:
            return False
        if name[0] == '/':
            return False
        if name[0] == '.':
            # only accept ./ if the first character is a dot
            if len(name) == 1 or name[1] != '/':
                return False

        return True

    def _extract_files(tar, path, whitelist):
        if not tar or not path or not whitelist:
            return

        for tarinfo in tar:
            if not _can_extract_with_trust(tarinfo.name):
                continue
            # we won't accept symlinks or hard links: they would be
            # suspicious for the files we're interested in.
            if not tarinfo.isfile():
                continue
            basename = os.path.basename(tarinfo.name)
            if not basename in whitelist:
                continue
            tar.extract(tarinfo, path)

    def _diff_files(old, new, dest):
        if not new:
            return (False, False)
        if not old:
            shutil.copyfile(new, dest)
            return (True, False)

        old_f = open(old)
        old_lines = old_f.readlines()
        old_f.close()
        new_f = open(new)
        new_lines = new_f.readlines()
        new_f.close()

        diff = difflib.unified_diff(old_lines, new_lines)
        # diff is a generator, so we can't know if it's empty or not until we
        # iterate over it. So if it's empty, we'll create an empty file, and
        # remove it afterwards.

        dest_f = open(dest, 'w')

        # We first write what we consider useful and then write the complete
        # diff for reference.
        # This works because a well-formed NEWS/ChangeLog will only have new
        # items added at the top, and therefore the useful diff is the addition
        # at the top. This doesn't work for configure.{ac,in}, but that's fine.
        # We need to cache the first lines, though, since diff is a generator
        # and we don't have direct access to lines.

        i = 0
        pass_one_done = False
        cached = []

        for line in diff:
            # we skip the first three lines of the diff
            if not pass_one_done and i == 0 and line[:3] == '---':
                cached.append(line)
                i = 1
            elif not pass_one_done and i == 1 and line[:3] == '+++':
                cached.append(line)
                i = 2
            elif not pass_one_done and i == 2 and line[:2] == '@@':
                cached.append(line)
                i = 3
            elif not pass_one_done and i == 3 and line[0] == '+':
                cached.append(line)
                dest_f.write(line[1:])
            elif not pass_one_done:
                # end of pass one: we write a note to help the user, and then
                # write the cache
                pass_one_done = True

                note = '# Note by osc %s: here is the complete diff for reference.' % _osc_collab_alias
                header = '#' * len(note)

                dest_f.write('\n')
                dest_f.write('%s\n' % header)
                dest_f.write('%s\n' % note)
                dest_f.write('%s\n' % header)
                dest_f.write('\n')
                for cached_line in cached:
                    dest_f.write(cached_line)
                dest_f.write(line)
            else:
                dest_f.write(line)

        dest_f.close()

        if not cached:
            os.unlink(dest)
            return (False, False)

        return (True, True)


    # FIXME: only needed until we switch to python >= 3.3
    lzma_hack = not hasattr(tarfile.TarFile, 'xzopen')

    tmpdir = tempfile.mkdtemp(prefix = 'osc-collab-')

    old = None
    new = None

    if old_tarball and os.path.exists(old_tarball):
        try:
            if lzma_hack:
                old_tarball = _lzma_hack(old_tarball, tmpdir)
            old = tarfile.open(old_tarball)
        except tarfile.TarError:
            pass
    else:
        # this is not fatal: we can provide the
        # NEWS/ChangeLog/configure.{ac,in} from the new tarball without a diff
        pass

    if new_tarball and os.path.exists(new_tarball):
        new_tarball_basename = os.path.basename(new_tarball)
        try:
            if lzma_hack:
                new_tarball = _lzma_hack(new_tarball, tmpdir)
            new = tarfile.open(new_tarball)
        except tarfile.TarError as e:
            _cleanup(old, new, tmpdir)
            raise OscCollabDiffError('Error when opening %s: %s' % (new_tarball_basename, e))
    else:
        _cleanup(old, new, tmpdir)
        raise OscCollabDiffError('Cannot extract useful diff between tarballs: no new tarball.')

    # make sure we have at least a subdirectory in tmpdir, since we'll extract
    # files from two tarballs that might conflict
    old_dir = os.path.join(tmpdir, 'old')
    new_dir = os.path.join(tmpdir, 'new')

    try:
        if old:
            err_tarball = os.path.basename(old_tarball)
            _extract_files(old, old_dir, ['NEWS', 'ChangeLog', 'configure.ac', 'configure.in', 'meson.build', 'meson_options.txt'])

        err_tarball = new_tarball_basename
        _extract_files(new, new_dir, ['NEWS', 'ChangeLog', 'configure.ac', 'configure.in', 'meson.build', 'meson_options.txt'])
    except (tarfile.ReadError, EOFError):
        _cleanup(old, new, tmpdir)
        raise OscCollabDiffError('Cannot extract useful diff between tarballs: %s is not a valid tarball.' % err_tarball)

    if old:
        old.close()
        old = None
    if new:
        new.close()
        new = None

    # find toplevel NEWS/ChangeLog/configure.{ac,in} in the new tarball
    if not os.path.exists(new_dir):
        _cleanup(old, new, tmpdir)
        raise OscCollabDiffError('Cannot extract useful diff between tarballs: no relevant files found in %s.' % new_tarball_basename)

    new_dir_files = os.listdir(new_dir)
    if len(new_dir_files) != 1:
        _cleanup(old, new, tmpdir)
        raise OscCollabDiffError('Cannot extract useful diff between tarballs: unexpected file hierarchy in %s.' % new_tarball_basename)

    new_subdir = os.path.join(new_dir, new_dir_files[0])
    if not os.path.isdir(new_subdir):
        _cleanup(old, new, tmpdir)
        raise OscCollabDiffError('Cannot extract useful diff between tarballs: unexpected file hierarchy in %s.' % new_tarball_basename)

    new_news = os.path.join(new_subdir, 'NEWS')
    if not os.path.exists(new_news) or not os.path.isfile(new_news):
        new_news = None
    new_changelog = os.path.join(new_subdir, 'ChangeLog')
    if not os.path.exists(new_changelog) or not os.path.isfile(new_changelog):
        new_changelog = None
    new_configure = os.path.join(new_subdir, 'configure.ac')
    if not os.path.exists(new_configure) or not os.path.isfile(new_configure):
        new_configure = os.path.join(new_subdir, 'configure.in')
        if not os.path.exists(new_configure) or not os.path.isfile(new_configure):
            new_configure = None
    new_meson = os.path.join(new_subdir, 'meson.build')
    if not os.path.exists(new_meson) or not os.path.isfile(new_meson):
        new_meson = None
    new_mesonopt = os.path.join(new_subdir, 'meson_options.txt')
    if not os.path.exists(new_mesonopt) or not os.path.isfile(new_mesonopt):
        new_mesonopt = None

    if not new_news and not new_changelog and not new_configure and not new_meson and not new_mesonopt:
        _cleanup(old, new, tmpdir)
        raise OscCollabDiffError('Cannot extract useful diff between tarballs: no relevant files found in %s.' % new_tarball_basename)

    # find toplevel NEWS/ChangeLog/configure.{ac,in} in the old tarball
    # not fatal
    old_news = None
    old_changelog = None
    old_configure = None
    old_meson = None
    old_mesonopt = None

    if os.path.exists(old_dir):
        old_dir_files = os.listdir(old_dir)
    else:
        old_dir_files = []

    if len(old_dir_files) == 1:
        old_subdir = os.path.join(old_dir, old_dir_files[0])
        if os.path.isdir(old_subdir):
            old_news = os.path.join(old_subdir, 'NEWS')
            if not os.path.exists(old_news) or not os.path.isfile(old_news):
                old_news = None
            old_changelog = os.path.join(old_subdir, 'ChangeLog')
            if not os.path.exists(old_changelog) or not os.path.isfile(old_changelog):
                old_changelog = None
            old_configure = os.path.join(old_subdir, 'configure.ac')
            if not os.path.exists(old_configure) or not os.path.isfile(old_configure):
                old_configure = os.path.join(old_subdir, 'configure.in')
                if not os.path.exists(old_configure) or not os.path.isfile(old_configure):
                    old_configure = None
            old_meson = os.path.join(old_subdir, 'meson.build')
            if not os.path.exists(old_meson) or not os.path.isfile(old_meson):
                old_meson = None
            old_mesonopt = os.path.join(old_subdir, 'meson_options.txt')
            if not os.path.exists(old_mesonopt) or not os.path.isfile(old_mesonopt):
                old_mesonopt = None

    # Choose the most appropriate prefix for helper files, based on the alias
    # that was used by the user
    helper_prefix = _osc_collab_helper_prefixes[0]
    for prefix in _osc_collab_helper_prefixes:
        if _osc_collab_alias in prefix:
            helper_prefix = prefix
            break

    # do the diff
    news = os.path.join(directory, helper_prefix + 'NEWS')
    (news_created, news_is_diff) = _diff_files(old_news, new_news, news)
    changelog = os.path.join(directory, helper_prefix + 'ChangeLog')
    (changelog_created, changelog_is_diff) = _diff_files(old_changelog, new_changelog, changelog)
    configure = os.path.join(directory, helper_prefix + 'configure')
    (configure_created, configure_is_diff) = _diff_files(old_configure, new_configure, configure)
    meson = os.path.join(directory, helper_prefix + 'meson')
    (meson_created, meson_is_diff) = _diff_files(old_meson, new_meson, meson)
    mesonopt = os.path.join(directory, helper_prefix + 'meson_options')
    (mesonopt_created, mesonopt_is_diff) = _diff_files(old_mesonopt, new_mesonopt, mesonopt)

    # Note: we make osc ignore the helper files we created by modifying
    # the exclude list of osc.core. See the top of this file.

    _cleanup(old, new, tmpdir)

    return (news, news_created, news_is_diff, changelog, changelog_created, changelog_is_diff, configure, configure_created, configure_is_diff, meson, meson_created, meson_is_diff, mesonopt, mesonopt_created, mesonopt_is_diff)


#######################################################################


def _collab_subst_defines(s, defines):
    '''Replace macros like %{version} and %{name} in strings. Useful
       for Source and Patch tags.'''
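    # For example, with defines = {'name': 'foo', 'version': '1.0'},
    # '%{name}-%{version}.tar.gz' becomes 'foo-1.0.tar.gz'.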
    for key in defines.keys():
        if s.find(key) != -1:
            value = defines[key]
            s = s.replace('%%{%s}' % key, value)
            s = s.replace('%%%s' % key, value)
    return s


def _collab_update_spec(spec_file, upstream_url, upstream_version):
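    '''Update the spec file for the new upstream version: bump Version,
       reset Release to 0, and make Source point to the new upstream URL
       when its basename looks like the real name of the file.

       Returns a (updated, old_source, old_version, define_in_source)
       tuple.'''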
    if not os.path.exists(spec_file):
        print('Cannot update %s: no such file.' % os.path.basename(spec_file), file=sys.stderr)
        return (False, None, None, False)
    elif not os.path.isfile(spec_file):
        print('Cannot update %s: not a regular file.' % os.path.basename(spec_file), file=sys.stderr)
        return (False, None, None, False)

    re_spec_header_with_version = re.compile(r'^(# spec file for package \S*) \(Version \S*\)(.*)', re.IGNORECASE)
    re_spec_define = re.compile(r'^%define\s+(\S*)\s+(\S*)', re.IGNORECASE)
    re_spec_name = re.compile(r'^Name:\s*(\S*)', re.IGNORECASE)
    re_spec_version = re.compile(r'^(Version:\s*)(\S*)', re.IGNORECASE)
    re_spec_release = re.compile(r'^(Release:\s*)\S*', re.IGNORECASE)
    re_spec_source = re.compile(r'^(Source0?:\s*)(\S*)', re.IGNORECASE)
    re_spec_prep = re.compile(r'^%prep', re.IGNORECASE)

    defines = {}
    old_source = None
    old_version = None
    define_in_source = False

    fin = open(spec_file, 'r')
    (fdout, tmp) = tempfile.mkstemp(dir = os.path.dirname(spec_file))

    # replace version and reset release
    while True:
        line = fin.readline()
        if len(line) == 0:
            break

        match = re_spec_prep.match(line)
        if match:
            os.write(fdout, line.encode('utf-8'))
            break

        match = re_spec_header_with_version.match(line)
        if match:
            # We drop the "(Version XYZ)" part of the header
            write_line = '%s%s\n' % (match.group(1), match.group(2))
            os.write(fdout, write_line.encode('utf-8'))
            continue

        match = re_spec_define.match(line)
        if match:
            defines[match.group(1)] = _collab_subst_defines(match.group(2), defines)
            os.write(fdout, line.encode('utf-8'))
            continue

        match = re_spec_name.match(line)
        if match:
            defines['name'] = match.group(1)
            os.write(fdout, line.encode('utf-8'))
            continue

        match = re_spec_version.match(line)
        if match:
            defines['version'] = match.group(2)
            old_version = _collab_subst_defines(match.group(2), defines)
            write_line = '%s%s\n' % (match.group(1), upstream_version)
            os.write(fdout, write_line.encode('utf-8'))
            continue

        match = re_spec_release.match(line)
        if match:
            write_line = '%s0\n' % match.group(1)
            os.write(fdout, write_line.encode('utf-8'))
            continue

        match = re_spec_source.match(line)
        if match:
            old_source = os.path.basename(match.group(2))

            if upstream_url:
                non_basename = os.path.dirname(upstream_url)
                new_source = os.path.basename(upstream_url)
                # Use _name in favor of name, as if _name exists, it's for a
                # good reason
                for key in [ '_name', 'name', 'version' ]:
                    if key in defines:
                        if key == 'version':
                            new_source = new_source.replace(upstream_version, '%%{%s}' % key)
                        else:
                            new_source = new_source.replace(defines[key], '%%{%s}' % key)

                # Only use the URL as source if the basename looks like the
                # real name of the file.
                # If '?' is in the basename, then we likely have some dynamic
                # page to download the file, which means the wrong basename.
                # For instance:
                # download.php?package=01&release=61&file=01&dummy=gwenhywfar-4.1.0.tar.gz
                if '?' not in new_source:
                    write_line = '%s%s/%s\n' % (match.group(1), non_basename, new_source)
                    os.write(fdout, write_line.encode('utf-8'))
                    continue

            os.write(fdout, line.encode('utf-8'))
            continue

        os.write(fdout, line.encode('utf-8'))

    # copy the rest of the file in chunks to finish quickly
    while True:
        data = fin.read(10 * 1024)
        if len(data) == 0:
            break
        os.write(fdout, data.encode('utf-8'))

    fin.close()
    os.close(fdout)

    os.rename(tmp, spec_file)

    if old_source and old_source.find('%') != -1:
        for key in defines.keys():
            if old_source.find(key) != -1:
                old_source = old_source.replace('%%{%s}' % key, defines[key])
                old_source = old_source.replace('%%%s' % key, defines[key])
                if key not in [ 'name', '_name', 'version' ]:
                    define_in_source = True

    return (True, old_source, old_version, define_in_source)


#######################################################################


def _collab_update_changes(changes_file, upstream_version, realname, email):
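    '''Prepend a pre-filled entry for upstream_version to the .changes file,
       using a timestamp in the C locale and the UTC timezone.

       Returns True on success.'''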
    if not os.path.exists(changes_file):
        print('Cannot update %s: no such file.' % os.path.basename(changes_file), file=sys.stderr)
        return False
    elif not os.path.isfile(changes_file):
        print('Cannot update %s: not a regular file.' % os.path.basename(changes_file), file=sys.stderr)
        return False

    (fdout, tmp) = tempfile.mkstemp(dir = os.path.dirname(changes_file))

    old_lc_time = locale.setlocale(locale.LC_TIME)
    old_tz = os.getenv('TZ')
    locale.setlocale(locale.LC_TIME, 'C')
    os.putenv('TZ', 'UTC')
    time.tzset()

    os.write(fdout, b'-------------------------------------------------------------------\n')
    write_line = '%s - %s <%s>\n' % (time.strftime("%a %b %e %H:%M:%S %Z %Y"), realname, email)
    os.write(fdout, write_line.encode('utf-8'))
    os.write(fdout, b'\n')
    write_line = '- Update to version %s:\n' % upstream_version
    os.write(fdout, write_line.encode('utf-8'))
    os.write(fdout, b'  + \n')
    os.write(fdout, b'\n')

    locale.setlocale(locale.LC_TIME, old_lc_time)
    if old_tz:
        os.putenv('TZ', old_tz)
    else:
        os.unsetenv('TZ')
    time.tzset()

    fin = open(changes_file, 'r')
    while True:
        data = fin.read(10 * 1024)
        if len(data) == 0:
            break
        os.write(fdout, data.encode('utf-8'))
    fin.close()
    os.close(fdout)

    os.rename(tmp, changes_file)

    return True


#######################################################################


def _collab_quilt_package(spec_file):
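    '''Check that the patches in the spec file still apply, by running
       'quilt setup' and then 'quilt push -a' in a temporary directory.

       Returns True if all patches apply.'''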
    def _cleanup(null, tmpdir):
        null.close()
        shutil.rmtree(tmpdir)

    null = open('/dev/null', 'w')
    tmpdir = tempfile.mkdtemp(prefix = 'osc-collab-')


    # setup with quilt
    sourcedir = os.path.dirname(os.path.realpath(spec_file))
    popen = subprocess.Popen(['quilt', 'setup', '-d', tmpdir, '--sourcedir', sourcedir, spec_file], stdout = null, stderr = null)
    retval = popen.wait()

    if retval != 0:
        _cleanup(null, tmpdir)
        print('Cannot apply patches: \'quilt setup\' failed.', file=sys.stderr)
        return False


    # apply patches for all subdirectories
    for directory in os.listdir(tmpdir):
        subdir = os.path.join(tmpdir, directory)

        if not os.path.isdir(subdir):
            continue

        # there's no patch, so just continue
        if not os.path.exists(os.path.join(subdir, 'patches')):
            continue

        popen = subprocess.Popen(['quilt', 'push', '-a', '-q'], cwd = subdir, stdout = null, stderr = null)
        retval = popen.wait()

        if retval != 0:
            _cleanup(null, tmpdir)
            print('Cannot apply patches: \'quilt push -a\' failed.', file=sys.stderr)
            return False


    _cleanup(null, tmpdir)

    return True


#######################################################################


def _collab_update(apiurl, username, realname, email, projects, package, ignore_reserved = False, ignore_comment = False, no_reserve = False, no_devel_project = False, no_branch = False):
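    '''Update a package to the latest known upstream version.

       This sets up a branch checkout of the package, updates the .spec and
       .changes files, downloads the new upstream tarball, extracts a useful
       diff between the old and new tarballs, checks that patches still
       apply, and updates the package file list.'''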
    if len(projects) == 1:
        project = projects[0]

        try:
            pkg = OscCollabApi.get_package_details(project, package)
        except OscCollabWebError as e:
            print(e.msg, file=sys.stderr)
            return
    else:
        pkg = _collab_get_package_with_valid_project(projects, package)
        if not pkg:
            return
        project = pkg.project.name

    # check that the project is up-to-date wrt parent project
    if pkg.parent_more_recent():
        print('Package %s is more recent in %s (%s) than in %s (%s). Please synchronize %s first.' % (package, pkg.parent_project, pkg.parent_version, project, pkg.version, project))
        return

    # check that an update is really needed
    if not pkg.upstream_version:
        print('No information about upstream version of package %s is available. Assuming it is not up-to-date.' % package)
    elif pkg.upstream_version == '--':
        print('Package %s has no upstream.' % package)
        return
    elif pkg.devel_project and pkg.needs_update() and not no_devel_project and not pkg.devel_needs_update():
        if not pkg.devel_package or pkg.devel_package == package:
            print('Package %s is already up-to-date in its development project (%s).' % (package, pkg.devel_project))
        else:
            print('Package %s is already up-to-date in its development project (%s/%s).' % (package, pkg.devel_project, pkg.devel_package))
        return
    elif not pkg.needs_update():
        print('Package %s is already up-to-date.' % package)
        return

    (setup, branch_project, branch_package) = _collab_setup_internal(apiurl, username, pkg, ignore_reserved, no_reserve, no_devel_project, no_branch)
    if not setup:
        return

    package_dir = branch_package

    # edit the version tag in the .spec files
    # not fatal if fails
    spec_file = os.path.join(package_dir, package + '.spec')
    if not os.path.exists(spec_file) and package != branch_package:
        spec_file = os.path.join(package_dir, branch_package + '.spec')
    (updated, old_tarball, old_version, define_in_source) = _collab_update_spec(spec_file, pkg.upstream_url, pkg.upstream_version)
    if old_tarball:
        old_tarball_with_dir = os.path.join(package_dir, old_tarball)
    else:
        old_tarball_with_dir = None

    if old_version and old_version == pkg.upstream_version:
        if no_branch:
            print('Package %s is already up-to-date (the database might not be up-to-date).' % branch_package)
        else:
            print('Package %s is already up-to-date (in your branch only, or the database is not up-to-date).' % branch_package)
        return

    if define_in_source:
        print('WARNING: the Source tag in %s is using some define that might not be valid anymore.' % spec_file)
    if updated:
        print('%s has been prepared.' % os.path.basename(spec_file))

    # warn if there are other spec files which might need an update
    for file in os.listdir(package_dir):
        if file.endswith('.spec') and file != os.path.basename(spec_file):
            print('WARNING: %s might need a manual update.' % file)


    # start adding an entry to .changes
    # not fatal if fails
    changes_file = os.path.join(package_dir, package + '.changes')
    if not os.path.exists(changes_file) and package != branch_package:
        changes_file = os.path.join(package_dir, branch_package + '.changes')
    if _collab_update_changes(changes_file, pkg.upstream_version, realname, email):
        print('%s has been prepared.' % os.path.basename(changes_file))

    # warn if there are other .changes files which might need an update
    for file in os.listdir(package_dir):
        if file.endswith('.changes') and file != os.path.basename(changes_file):
            print('WARNING: %s might need a manual update.' % file)


    # download the upstream tarball
    # fatal if fails
    if not pkg.upstream_url:
        print('Cannot download latest upstream tarball for %s: no URL defined.' % package, file=sys.stderr)
        return

    print('Looking for the upstream tarball...')
    try:
        upstream_tarball = _collab_download_internal(pkg.upstream_url, package_dir)
    except OscCollabDownloadError as e:
        print(e.msg, file=sys.stderr)
        return

    if not upstream_tarball:
        print('No upstream tarball downloaded for %s.' % package, file=sys.stderr)
        return
    else:
        upstream_tarball_basename = os.path.basename(upstream_tarball)
        # same file as the old one: oops, we don't want to do anything weird
        # there
        if upstream_tarball_basename == old_tarball:
            old_tarball = None
            old_tarball_with_dir = None
        print('%s has been downloaded.' % upstream_tarball_basename)


    # check integrity of the downloaded file
    # fatal if fails (only if md5 exists)
    # TODO


    # extract NEWS, ChangeLog, configure.{ac,in} and meson files from the
    # old + new tarballs, and do a diff
    # not fatal if fails
    print('Extracting useful diff between tarballs (NEWS, ChangeLog, configure.{ac,in}, meson)...')
    try:
        (news, news_created, news_is_diff, changelog, changelog_created, changelog_is_diff, configure, configure_created, configure_is_diff, meson, meson_created, meson_is_diff, mesonopt, mesonopt_created, mesonopt_is_diff) = _collab_extract_diff_internal(package_dir, old_tarball_with_dir, upstream_tarball)
    except OscCollabDiffError as e:
        print(e.msg, file=sys.stderr)
    else:
        if news_created:
            news_basename = os.path.basename(news)
            if news_is_diff:
                print('NEWS between %s and %s is available in %s' % (old_tarball, upstream_tarball_basename, news_basename))
            else:
                print('Complete NEWS of %s is available in %s' % (upstream_tarball_basename, news_basename))
        else:
            print('No NEWS information found.')

        if changelog_created:
            changelog_basename = os.path.basename(changelog)
            if changelog_is_diff:
                print('ChangeLog between %s and %s is available in %s' % (old_tarball, upstream_tarball_basename, changelog_basename))
            else:
                print('Complete ChangeLog of %s is available in %s' % (upstream_tarball_basename, changelog_basename))
        else:
            print('No ChangeLog information found.')

        if configure_created:
            configure_basename = os.path.basename(configure)
            if configure_is_diff:
                print('Diff in configure.{ac,in} between %s and %s is available in %s' % (old_tarball, upstream_tarball_basename, configure_basename))
            else:
                print('Complete configure.{ac,in} of %s is available in %s' % (upstream_tarball_basename, configure_basename))
        else:
            print('No configure.{ac,in} information found (tarball is probably not using autotools).')

        if meson_created:
            meson_basename = os.path.basename(meson)
            if meson_is_diff:
                print('Diff in meson.build between %s and %s is available in %s' % (old_tarball, upstream_tarball_basename, meson_basename))
            else:
                print('Complete meson.build of %s is available in %s' % (upstream_tarball_basename, meson_basename))
        else:
            print('No meson.build information found (tarball is probably not using meson).')

        if mesonopt_created:
            mesonopt_basename = os.path.basename(mesonopt)
            if mesonopt_is_diff:
                print('Diff in meson_options.txt between %s and %s is available in %s' % (old_tarball, upstream_tarball_basename, mesonopt_basename))
            else:
                print('Complete meson_options.txt of %s is available in %s' % (upstream_tarball_basename, mesonopt_basename))
        else:
            print('No meson_options.txt information found (tarball is probably not using meson or the build has no options).')


    # try applying the patches with quilt
    # not fatal if fails
    if _collab_is_program_in_path('quilt'):
        print('Running quilt...')
        if _collab_quilt_package(spec_file):
            print('Patches still apply.')
        else:
            print('WARNING: make sure that all patches apply before submitting.')
    else:
        print('quilt is not available.')
        print('WARNING: make sure that all patches apply before submitting.')


    # 'osc add newfile.tar.bz2' and 'osc del oldfile.tar.bz2'
    # not fatal if fails
    osc_package = filedir_to_pac(package_dir)

    if old_tarball_with_dir and os.path.exists(old_tarball_with_dir):
        osc_package.delete_file(old_tarball, force=True)
        print('%s has been removed from the package.' % old_tarball)
    else:
        print('WARNING: the previous tarball could not be found. Please manually remove it.')

    osc_package.addfile(upstream_tarball_basename)
    print('%s has been added to the package.' % upstream_tarball_basename)


    print('Package %s has been prepared for the update.' % branch_package)
    print('After having updated %s, you can use \'osc build\' to start a local build or \'osc %s build\' to start a build on the build service.' % (os.path.basename(changes_file), _osc_collab_alias))

    if not ignore_comment:
        _print_comment_after_setup(pkg, no_devel_project)

    # TODO add a note about checking if patches are still needed, buildrequires
    # & requires


#######################################################################


def _collab_forward(apiurl, user, projects, request_id, no_supersede = False):
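    '''Accept a submission request and forward it to the parent project,
       after checking that the request targets the development project of
       the parent package. Unless no_supersede is set, older requests to the
       parent project are superseded.'''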
    try:
        # we only use the conversion to validate the request id
        int(request_id)
    except ValueError:
        print('%s is not a valid request id.' % (request_id), file=sys.stderr)
        return

    request = OscCollabObs.get_request(request_id)
    if request is None:
        return

    dest_package = request.target_package
    dest_project = request.target_project

    if dest_project not in projects:
        if len(projects) == 1:
            print('Submission request %s is for %s and not %s. You can use --project to override your project settings.' % (request_id, dest_project, projects[0]), file=sys.stderr)
        else:
            print('Submission request %s is for %s. You can use --project to override your project settings.' % (request_id, dest_project), file=sys.stderr)
        return

    if request.state != 'new':
        print('Submission request %s is not new.' % request_id, file=sys.stderr)
        return

    try:
        pkg = OscCollabApi.get_package_details((dest_project,), dest_package)
        if not pkg or not pkg.parent_project:
            print('No parent project for %s/%s.' % (dest_project, dest_package), file=sys.stderr)
            return
    except OscCollabWebError as e:
        print('Cannot get parent project of %s/%s.' % (dest_project, dest_package), file=sys.stderr)
        return

    try:
        devel_project = show_develproject(apiurl, pkg.parent_project, pkg.parent_package)
    except HTTPError as e:
        print('Cannot get development project for %s/%s: %s' % (pkg.parent_project, pkg.parent_package, e.msg), file=sys.stderr)
        return

    if devel_project != dest_project:
        print('Development project for %s/%s is %s, but package has been submitted to %s.' % (pkg.parent_project, pkg.parent_package, devel_project, dest_project), file=sys.stderr)
        return

    if not OscCollabObs.change_request_state(request_id, 'accepted', 'Forwarding to %s' % pkg.parent_project):
        return

    # TODO: cancel old requests from request.dst_project to parent project

    result = create_submit_request(apiurl,
                                   dest_project, dest_package,
                                   pkg.parent_project, pkg.parent_package,
                                   request.description + ' (forwarded request %s from %s)' % (request_id, request.by))

    print('Submission request %s has been forwarded to %s (request id: %s).' % (request_id, pkg.parent_project, result))

    if not no_supersede:
        for old_id in OscCollabObs.supersede_old_requests(user, pkg.parent_project, pkg.parent_package, result):
            print('Previous submission request %s has been superseded.' % old_id)


#######################################################################


def _collab_osc_package_pending_commit(osc_package):
    # ideally, we could use osc_package.todo, but it's not set by default.
    # So we just look at all files.
    for filename in osc_package.filenamelist + osc_package.filenamelist_unvers:
        status = osc_package.status(filename)
        if status in ['A', 'M', 'D']:
            return True

    return False


def _collab_osc_package_commit(osc_package, msg):
    osc_package.commit(msg)
    # See bug #436932: Package.commit() leads to outdated internal data.
    osc_package.update_datastructs()


#######################################################################


def _collab_package_set_meta(apiurl, project, package, meta, error_msg_prefix = ''):
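    '''Upload new metadata for the package; returns True on success.'''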
    if error_msg_prefix:
        error_str = error_msg_prefix + ': %s'
    else:
        error_str = 'Cannot set metadata for %s in %s: %%s' % (package, project)

    (fdout, tmp) = tempfile.mkstemp()
    os.write(fdout, meta)
    os.close(fdout)

    meta_url = make_meta_url('pkg', (quote_plus(project), quote_plus(package)), apiurl)

    failed = False
    try:
        http_PUT(meta_url, file=tmp)
    except HTTPError as e:
        print(error_str % e.msg, file=sys.stderr)
        failed = True

    os.unlink(tmp)
    return not failed


def _collab_enable_build(apiurl, project, package, meta, repos, archs):
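    '''Edit the package metadata to make sure the build is enabled for all
       the given repos and archs: matching <disable/> nodes are removed, and
       <enable/> nodes are added where needed.

       Returns a (success, changed) tuple.'''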
    if len(archs) == 0:
        return (True, False)

    package_node = ET.XML(meta)

    build_node = package_node.find('build')
    # careful: an Element with no children is falsy, so compare to None
    if build_node is None:
        build_node = ET.Element('build')
        package_node.append(build_node)

    enable_found = {}
    for repo in repos:
        enable_found[repo] = {}
        for arch in archs:
            enable_found[repo][arch] = False

    # remove disable before adding enable
    for node in build_node.findall('disable'):
        repo = node.get('repository')
        arch = node.get('arch')

        if repo and repo not in repos:
            continue
        if arch and arch not in archs:
            continue

        build_node.remove(node)

    for node in build_node.findall('enable'):
        repo = node.get('repository')
        arch = node.get('arch')

        if repo and repo not in repos:
            continue
        if arch and arch not in archs:
            continue

        if repo and arch:
            enable_found[repo][arch] = True
        elif repo:
            for arch in enable_found[repo].keys():
                enable_found[repo][arch] = True
        elif arch:
            for repo in enable_found.keys():
                enable_found[repo][arch] = True
        else:
            for repo in enable_found.keys():
                for arch in enable_found[repo].keys():
                    enable_found[repo][arch] = True

    for repo in repos:
        for arch in archs:
            if not enable_found[repo][arch]:
                node = ET.Element('enable', { 'repository': repo, 'arch': arch})
                build_node.append(node)

    all_true = True
    for repo in enable_found.keys():
        for value in enable_found[repo].values():
            if not value:
                all_true = False
                break

    if all_true:
        return (True, False)

    meta = ET.tostring(package_node)

    if _collab_package_set_meta(apiurl, project, package, meta, 'Error while enabling build of package on the build service'):
        return (True, True)
    else:
        return (False, False)


def _collab_get_latest_package_rev_built(apiurl, project, repo, arch, package, verbose_error = True):
    url = makeurl(apiurl, ['build', project, repo, arch, package, '_history'])

    try:
        history = http_GET(url)
    except HTTPError as e:
        if verbose_error:
            print('Cannot get build history: %s' % e.msg, file=sys.stderr)
        return (False, None, None)

    try:
        root = ET.parse(history).getroot()
    except SyntaxError:
        history.close()
        return (False, None, None)

    max_time = 0
    rev = None
    srcmd5 = None

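    # each <entry> in the build history carries 'time', 'srcmd5' and 'rev'
    # attributes; we keep the data from the most recent entry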
    for node in root.findall('entry'):
        t = int(node.get('time'))
        if t <= max_time:
            continue

        max_time = t
        srcmd5 = node.get('srcmd5')
        rev = node.get('rev')

    history.close()

    return (True, srcmd5, rev)


def _collab_print_build_status(build_state, header, error_line, hint = False):
    def get_str_repo_arch(repo, arch, show_repos):
        if show_repos:
            return '%s/%s' % (repo, arch)
        else:
            return arch

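    # Example output (repository names are only shown when building on more
    # than one repository):
    #   Status:
    #     i586:      succeeded
    #     x86_64:    failed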
    print('%s:' % header)

    repos = list(build_state.keys())
    if not repos:
        print('  %s' % error_line)
        return

    repos_archs = []

    for repo in repos:
        for arch in build_state[repo].keys():
            repos_archs.append((repo, arch))

    if len(repos_archs) == 0:
        print('  %s' % error_line)
        return

    show_hint = False
    show_repos = len(repos) > 1
    repos_archs.sort()

    max_len = 0
    for (repo, arch) in repos_archs:
        l = len(get_str_repo_arch(repo, arch, show_repos))
        if l > max_len:
            max_len = l

    # 4: because we also have a few other characters (see left variable)
    format = '%-' + str(max_len + 4) + 's%s'
    for (repo, arch) in repos_archs:
        if not build_state[repo][arch]['scheduler'] and build_state[repo][arch]['result'] in ['failed']:
            show_hint = True

        left = '  %s: ' % get_str_repo_arch(repo, arch, show_repos)
        if build_state[repo][arch]['result'] in ['unresolved', 'broken', 'blocked', 'finished', 'signing'] and build_state[repo][arch]['details']:
            status = '%s (%s)' % (build_state[repo][arch]['result'], build_state[repo][arch]['details'])
        else:
            status = build_state[repo][arch]['result']

        if build_state[repo][arch]['scheduler']:
            status = '%s (was: %s)' % (build_state[repo][arch]['scheduler'], status)

        print(format % (left, status))

    if show_hint and hint:
        for (repo, arch) in repos_archs:
            if build_state[repo][arch]['result'] == 'failed':
                print('You can see the log of the failed build with: osc buildlog %s %s' % (repo, arch))


def _collab_build_get_results(apiurl, project, repos, package, archs, srcmd5, rev, state, error_counter, verbose_error):
    try:
        results = show_results_meta(apiurl, project, package=package)
        if len(results) == 0:
            if verbose_error:
                print('Error while getting build results of package on the build service: empty results', file=sys.stderr)
            error_counter += 1
            return (True, False, error_counter, state)

        # reset the error counter
        error_counter = 0
    except (HTTPError, BadStatusLine) as e:
        if verbose_error:
            print('Error while getting build results of package on the build service: %s' % e, file=sys.stderr)
        error_counter += 1
        return (True, False, error_counter, state)

    res_root = ET.XML(b''.join(results))
    detailed_results = {}
    repos_archs = []

    for node in res_root.findall('result'):

        repo = node.get('repository')
        # ignore the repo if it's not one we explicitly use
        if repo not in repos:
            continue

        arch = node.get('arch')
        # ignore the archs we didn't explicitly enable: this ensures we care
        # only about what is really important to us
        if arch not in archs:
            continue

        scheduler_state = node.get('state')
        scheduler_dirty = node.get('dirty') == 'true'

        status_node = node.find('status')
        try:
            status = status_node.get('code')
        except AttributeError:
            # the status node can be missing when the package is too new
            status = 'unknown'

        try:
            details = status_node.find('details').text
        except AttributeError:
            details = None

        if repo not in detailed_results:
            detailed_results[repo] = {}
        detailed_results[repo][arch] = {}
        detailed_results[repo][arch]['status'] = status
        detailed_results[repo][arch]['details'] = details
        detailed_results[repo][arch]['scheduler_state'] = scheduler_state
        detailed_results[repo][arch]['scheduler_dirty'] = scheduler_dirty
        repos_archs.append((repo, arch))

    # evaluate the status: do we need to give more time to the build service?
    # Was the build successful?
    bs_not_ready = False
    do_not_wait_for_bs = False
    build_successful = True

    # A bit paranoid, but it seems it happened to me once...
    if len(repos_archs) == 0:
        bs_not_ready = True
        build_successful = False
        if verbose_error:
            print('Build service did not return any information.', file=sys.stderr)
        error_counter += 1

    for (repo, arch) in repos_archs:
        # first look at the state of the scheduler
        scheduler_state = detailed_results[repo][arch]['scheduler_state']
        scheduler_dirty = detailed_results[repo][arch]['scheduler_dirty']
        if scheduler_dirty:
            scheduler_active = 'waiting for scheduler'
        elif scheduler_state in ['unknown', 'scheduling']:
            scheduler_active = 'waiting for scheduler'
        elif scheduler_state in ['blocked']:
            scheduler_active = 'blocked by scheduler'
        else:
            # we don't care about the scheduler state
            scheduler_active = ''

        need_rebuild = False
        value = detailed_results[repo][arch]['status']

        # the scheduler is working, or the result has changed since last time,
        # so we won't trigger a rebuild
        if scheduler_active or state[repo][arch]['result'] != value:
            state[repo][arch]['rebuild'] = -1

        # build is done, but not successful
        if scheduler_active or value not in ['succeeded', 'excluded']:
            build_successful = False

        # we just ignore the status if the scheduler is active, since we know
        # we need to wait for the build service
        if scheduler_active:
            bs_not_ready = True

        # build is happening or will happen soon
        elif value in ['scheduled', 'building', 'dispatching', 'finished', 'signing']:
            bs_not_ready = True

        # sometimes, the scheduler forgets about a package in 'blocked' state,
        # so we have to force a rebuild
        elif value in ['blocked']:
            bs_not_ready = True
            need_rebuild = True

        # build has failed for an architecture: no need to wait for other
        # architectures to know that there's a problem
        elif value in ['failed', 'unresolved', 'broken']:
            do_not_wait_for_bs = True

        # 'disabled' => the build service didn't take into account
        # the change we did to the meta yet (eg).
        elif value in ['unknown', 'disabled']:
            bs_not_ready = True
            need_rebuild = True

        # build is done, but is it for the latest version?
        elif value in ['succeeded']:
            # check that the build is for the version we have
            (success, built_srcmd5, built_rev) = _collab_get_latest_package_rev_built(apiurl, project, repo, arch, package, verbose_error)

            if not success:
                detailed_results[repo][arch]['status'] = 'succeeded, but maybe not up-to-date'
                error_counter += 1
                # we don't know what's going on, so we'll contact the build
                # service again
                bs_not_ready = True
            else:
                # reset the error counter
                error_counter = 0

                #FIXME: "revision" seems to not have the same meaning for the
                # build history and for the local package. See bug #436781
                # (bnc). So, we just ignore the revision for now.
                #if (built_srcmd5, built_rev) != (srcmd5, rev):
                if built_srcmd5 != srcmd5:
                    need_rebuild = True
                    detailed_results[repo][arch]['status'] = 'rebuild needed'

        if not scheduler_active and need_rebuild and state[repo][arch]['rebuild'] == 0:
            bs_not_ready = True

            print('Triggering rebuild for %s as of %s' % (arch, time.strftime('%X (%x)', time.localtime())))

            try:
                rebuild(apiurl, project, package, repo, arch)
                # reset the error counter
                error_counter = 0
            except (HTTPError, BadStatusLine) as e:
                if verbose_error:
                    print('Cannot trigger rebuild for %s: %s' % (arch, e), file=sys.stderr)
                error_counter += 1

        state[repo][arch]['scheduler'] = scheduler_active
        state[repo][arch]['result'] = detailed_results[repo][arch]['status']
        state[repo][arch]['details'] = detailed_results[repo][arch]['details']

        # Update the timeout data
        if scheduler_active:
            # the scheduler is busy with this repo/arch: leave the rebuild
            # timeouts alone
            pass
        elif state[repo][arch]['result'] in ['blocked']:
            # if we're blocked, maybe the scheduler forgot about us, so
            # schedule a rebuild every 60 minutes. The main use case is when
            # you leave the plugin running for a whole night.
            if state[repo][arch]['rebuild'] <= 0:
                state[repo][arch]['rebuild-timeout'] = 60
                state[repo][arch]['rebuild'] = state[repo][arch]['rebuild-timeout']

            # note: it's correct to decrement even if we start with a new value
            # of timeout, since if we don't, it adds 1 minute (ie, 5 minutes
            # instead of 4, eg)
            state[repo][arch]['rebuild'] = state[repo][arch]['rebuild'] - 1
        elif state[repo][arch]['result'] in ['unknown', 'disabled', 'rebuild needed']:
            # if we're in this unexpected state, force the scheduler to do
            # something
            if state[repo][arch]['rebuild'] <= 0:
                # we do some exponential timeout until 60 minutes. We skip
                # timeouts of 1 and 2 minutes since they're quite short.
                if state[repo][arch]['rebuild-timeout'] > 0:
                    state[repo][arch]['rebuild-timeout'] = min(60, state[repo][arch]['rebuild-timeout'] * 2)
                else:
                    state[repo][arch]['rebuild-timeout'] = 4
                state[repo][arch]['rebuild'] = state[repo][arch]['rebuild-timeout']

            # note: it's correct to decrement even if we start with a new value
            # of timeout, since if we don't, it adds 1 minute (ie, 5 minutes
            # instead of 4, eg)
            state[repo][arch]['rebuild'] = state[repo][arch]['rebuild'] - 1
        else:
            # else, we make sure we won't manually trigger a rebuild
            state[repo][arch]['rebuild'] = -1
            state[repo][arch]['rebuild-timeout'] = -1

    if do_not_wait_for_bs:
        bs_not_ready = False

    return (bs_not_ready, build_successful, error_counter, state)


def _collab_build_wait_loop(apiurl, project, repos, package, archs, srcmd5, rev):
    # seconds we wait before looking at the results on the build service
    check_frequency = 60
    max_errors = 10

    build_successful = False
    print_status = False
    error_counter = 0
    last_check = 0

    state = {}
    # When we want to trigger a rebuild for this repo/arch.
    # The initial value is -1 since we don't want to trigger a rebuild the
    # first time when the state is 'disabled': the state might have changed
    # very recently (if we updated the metadata ourselves), and the build
    # service might have an old build that it can re-use instead of building
    # again.
    for repo in repos:
        state[repo] = {}
        for arch in archs:
            state[repo][arch] = {}
            state[repo][arch]['rebuild'] = -1
            state[repo][arch]['rebuild-timeout'] = -1
            state[repo][arch]['scheduler'] = ''
            state[repo][arch]['result'] = 'unknown'
            state[repo][arch]['details'] = ''

    print("Building on %s..." % ', '.join(repos))
    print("You can press enter to get the current status of the build.")

    # It's important to start the loop by downloading results since we might
    # already have successful builds, and we don't want to wait to know that.

    try:

        while True:
            # get build status if we don't have a recent status
            now = time.time()
            if now - last_check >= 58:
                # 58s since sleep() is not 100% precise and we don't want to miss
                # one turn
                last_check = now

                (need_to_continue, build_successful, error_counter, state) = _collab_build_get_results(apiurl, project, repos, package, archs, srcmd5, rev, state, error_counter, print_status)

                # just stop if there are too many errors
                if error_counter > max_errors:
                    print('Giving up: too many consecutive errors when contacting the build service.', file=sys.stderr)
                    break

            else:
                # we didn't download the results, so we want to continue anyway
                need_to_continue = True

            if print_status:
                header = 'Status as of %s [checking the status every %d seconds]' % (time.strftime('%X (%x)', time.localtime(last_check)), check_frequency)
                _collab_print_build_status(state, header, 'no results returned by the build service')

            if not need_to_continue:
                break


            # and now wait for input/timeout
            print_status = False

            # wait check_frequency seconds or for user input
            now = time.time()
            if now - last_check < check_frequency:
                wait = check_frequency - (now - last_check)
            else:
                wait = check_frequency

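            # select() on sys.stdin works on POSIX and lets us wait for user
            # input without busy-polling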
            res = select.select([sys.stdin], [], [], wait)

            # we have user input
            if len(res[0]) > 0:
                print_status = True
                # empty sys.stdin
                sys.stdin.readline()


    # we catch this exception here since we might need to revert some metadata
    except KeyboardInterrupt:
        print('')
        print('Interrupted: not waiting for the build to finish. Cleaning up...')

    return (build_successful, state)


#######################################################################


def _collab_autodetect_repo(apiurl, project):
    try:
        meta_lines = show_project_meta(apiurl, project)
        meta = b''.join(meta_lines)
    except HTTPError:
        return None

    try:
        root = ET.XML(meta)
    except SyntaxError:
        return None

    repos = []
    for node in root.findall('repository'):
        name = node.get('name')
        if name:
            repos.append(name)

    if not repos:
        return None

    # This is the list of repositories we prefer, the first one being the
    # preferred one.
    #  + snapshot/standard is what openSUSE:Factory uses, and some other
    #    projects might use this too (snapshot is like standard, except that
    #    packages won't block before getting built).
    #  + openSUSE_Factory is the usual repository for devel projects.
    #  + openSUSE-Factory is a variant of the one above (typo in some project
    #    config?)
    for repo in [ 'snapshot', 'standard', 'openSUSE_Factory', 'openSUSE-Factory' ]:
        if repo in repos:
            return repo

    # No known repository? We try to use the last one named openSUSE* (last
    # one because we want the most recent version of openSUSE).
    opensuse_repos = [ repo for repo in repos if repo.startswith('openSUSE') ]
    if len(opensuse_repos) > 0:
        opensuse_repos.sort(reverse = True)
        return opensuse_repos[0]

    # Still no solution? Let's just take the first one...
    return repos[0]


#######################################################################


def _collab_build_internal(apiurl, osc_package, repos, archs):
    project = osc_package.prjname
    package = osc_package.name

    if '!autodetect!' in repos:
        print('Autodetecting the most appropriate repository for the build...')
        repos.remove('!autodetect!')
        repo = _collab_autodetect_repo(apiurl, project)
        if repo:
            repos.append(repo)

    if len(repos) == 0:
        print('Error while setting up the build: no usable repository.', file=sys.stderr)
        return False

    repos.sort()
    archs.sort()

    # check that build is enabled for this package in this project, and if this
    # is not the case, enable it
    try:
        meta_lines = show_package_meta(apiurl, project, package)
    except HTTPError as e:
        print('Error while checking if package is set to build: %s' % e.msg, file=sys.stderr)
        return False

    meta = b''.join(meta_lines)
    (success, changed_meta) = _collab_enable_build(apiurl, project, package, meta, repos, archs)
    if not success:
        return False

    # loop to periodically check the status of the build (and eventually
    # trigger rebuilds if necessary)
    (build_success, build_state) = _collab_build_wait_loop(apiurl, project, repos, package, archs, osc_package.srcmd5, osc_package.rev)

    if not build_success:
        _collab_print_build_status(build_state, 'Status', 'no status known: osc got interrupted?', hint=True)

    # disable build for package in this project if we manually enabled it
    # (we just reset to the old settings)
    if changed_meta:
        _collab_package_set_meta(apiurl, project, package, meta, 'Error while resetting build settings of package on the build service')

    return build_success


#######################################################################


def _collab_build(apiurl, user, projects, msg, repos, archs):
    try:
        osc_package = filedir_to_pac('.')
    except oscerr.NoWorkingCopy as e:
        print(e, file=sys.stderr)
        return

    project = osc_package.prjname
    package = osc_package.name

    committed = False

    # commit if there are local changes
    if _collab_osc_package_pending_commit(osc_package):
        if not msg:
            msg = edit_message()
        _collab_osc_package_commit(osc_package, msg)
        committed = True

    build_success = _collab_build_internal(apiurl, osc_package, repos, archs)

    if build_success:
        print('Package successfully built on the build service.')


#######################################################################


def _collab_build_submit(apiurl, user, projects, msg, repos, archs, forward = False, no_unreserve = False, no_supersede = False):
    try:
        osc_package = filedir_to_pac('.')
    except oscerr.NoWorkingCopy as e:
        print(e, file=sys.stderr)
        return

    project = osc_package.prjname
    package = osc_package.name

    # do some preliminary checks on the package/project: it has to be
    # a branch of a development project
    if not osc_package.islink():
        print('Package is not a link.', file=sys.stderr)
        return

    parent_project = osc_package.linkinfo.project
    if not parent_project in projects:
        if len(projects) == 1:
            print('Package links to project %s and not %s. You can use --project to override your project settings.' % (parent_project, projects[0]), file=sys.stderr)
        else:
            print('Package links to project %s. You can use --project to override your project settings.' % parent_project, file=sys.stderr)
        return

    if not project.startswith('home:%s:branches' % user):
        print('Package belongs to project %s which does not look like a branch project.' % project, file=sys.stderr)
        return

    if project != 'home:%s:branches:%s' % (user, parent_project):
        print('Package belongs to project %s which does not look like a branch project for %s.' % (project, parent_project), file=sys.stderr)
        return


    # get the message that will be used for commit & request
    if not msg:
        msg = edit_message(footer='This message will be used for the commit (if necessary) and the request.\n')

    committed = False

    # commit if there are local changes
    if _collab_osc_package_pending_commit(osc_package):
        _collab_osc_package_commit(osc_package, msg)
        committed = True

    build_success = _collab_build_internal(apiurl, osc_package, repos, archs)

    # if build successful, submit
    if build_success:
        result = create_submit_request(apiurl,
                                       project, package,
                                       parent_project, package,
                                       msg)

        print('Package submitted to %s (request id: %s).' % (parent_project, result))

        if not no_supersede:
            for old_id in OscCollabObs.supersede_old_requests(user, parent_project, package, result):
                print('Previous submission request %s has been superseded.' % old_id)

        if forward:
            # we voluntarily restrict the project list to parent_project for
            # self-consistency and more safety
            _collab_forward(apiurl, user, [ parent_project ], result, no_supersede = no_supersede)

        if not no_unreserve:
            try:
                reservation = OscCollabApi.is_package_reserved((parent_project,), package, no_devel_project = True)
                if reservation and reservation.user == user:
                    _collab_unreserve((parent_project,), (package,), user, no_devel_project = True)
            except OscCollabWebError as e:
                print(e.msg, file=sys.stderr)
    else:
        print('Package was not submitted to %s' % parent_project)


#######################################################################


# TODO
# Add a commit method.
# This will make some additional checks:
#   + if we used update, we can initialize a list of patches/sources
#     before any change. This way, on the commit, we can look if the
#     remaining files are still referenced in the .spec, and if not
#     complain if the file hasn't been removed from the directory.
#     We can also complain if a file hasn't been added with osc add,
#     while it's referenced.
#   + complain if the lines in .changes are too long


#######################################################################


# Unfortunately, configparser does not know how to preserve a config file
# when rewriting it: comments are removed and sections are reordered.
# This is a dumb function to append a value to a section in a config file.
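# For instance, a hypothetical call like
#   _collab_add_config_option('https://api.opensuse.org', 'collab_email', '[email protected]')
# would append 'collab_email = [email protected]' to the matching section,
# creating the section at the end of the file if it does not exist yet.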
def _collab_add_config_option(section, key, value):
    global _osc_collab_osc_conffile

    conffile = _osc_collab_osc_conffile

    if not os.path.exists(conffile):
        lines = [ ]
    else:
        fin = open(conffile, 'r')
        lines = fin.readlines()
        fin.close()

    (fdout, tmp) = tempfile.mkstemp(prefix = os.path.basename(conffile), dir = os.path.dirname(conffile))

    in_section = False
    added = False
    empty_line = False

    valid_sections = [ '[' + section + ']' ]
    if section.startswith('http'):
        if section.endswith('/'):
            valid_sections.append('[' + section[:-1] + ']')
        else:
            valid_sections.append('[' + section + '/]')
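    # eg, for section 'https://api.opensuse.org', both
    # '[https://api.opensuse.org]' and '[https://api.opensuse.org/]' are accepted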

    for line in lines:
        if line.rstrip() in valid_sections:
            in_section = True
        # key was not in the section: let's add it
        elif line[0] == '[' and in_section and not added:
            if not empty_line:
                os.write(fdout, b'\n')
            write_line = '%s = %s\n\n' % (key, value)
            os.write(fdout, write_line.encode('utf-8'))
            added = True
            in_section = False
        elif line[0] == '[' and in_section:
            in_section = False
        # the section/key already exists: we replace
        # 'not added': in case there are multiple sections with the same name
        elif in_section and not added and line.startswith(key):
            index = line.find('=')
            if line[:index].rstrip() == key:
                line = '%s= %s\n' % (line[:index], value)
                added = True

        os.write(fdout, line.encode('utf-8'))

        empty_line = line.strip() == ''

    if not added:
        if not empty_line:
            os.write(fdout, b'\n')
        if not in_section:
            write_line = '[%s]\n' % (section,)
            os.write(fdout, write_line.encode('utf-8'))
        write_line = '%s = %s\n' % (key, value)
        os.write(fdout, write_line.encode('utf-8'))

    os.close(fdout)
    os.rename(tmp, conffile)


#######################################################################


def _collab_get_compatible_apiurl_for_config(config, apiurl):
    if apiurl is None:
        return None

    if config.has_section(apiurl):
        return apiurl

    # first try adding/removing a trailing slash to the API url
    if apiurl.endswith('/'):
        apiurl = apiurl[:-1]
    else:
        apiurl = apiurl + '/'

    if config.has_section(apiurl):
        return apiurl

    # old osc (0.110) was adding the host to the tuple without the http
    # part, ie just the host
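    # eg, 'https://api.opensuse.org/' gives 'api.opensuse.org'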
    apiurl = urlparse(apiurl).netloc

    if apiurl and config.has_section(apiurl):
        return apiurl

    return None


def _collab_get_config_parser():
    global _osc_collab_config_parser
    global _osc_collab_osc_conffile

    if _osc_collab_config_parser is not None:
        return _osc_collab_config_parser

    conffile = _osc_collab_osc_conffile
    _osc_collab_config_parser = configparser.ConfigParser()
    _osc_collab_config_parser.read(conffile)
    return _osc_collab_config_parser


def _collab_get_config(apiurl, key, default = None):
    config = _collab_get_config_parser()
    if not config:
        return default

    apiurl = _collab_get_compatible_apiurl_for_config(config, apiurl)
    if apiurl and config.has_option(apiurl, key):
        return config.get(apiurl, key)
    elif config.has_option('general', key):
        return config.get('general', key)
    else:
        return default


def _collab_get_config_bool(apiurl, key, default = None):
    value = _collab_get_config(apiurl, key, default)
    if type(value) == bool:
        return value
    if value is None:
        return False

    if value.lower() in [ 'true', 'yes' ]:
        return True
    try:
        return int(value) != 0
    except ValueError:
        pass
    return False

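# Config list values are semicolon-separated strings: eg, 'i586;x86_64;'
# yields ['i586', 'x86_64'].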
def _collab_get_config_list(apiurl, key, default = None):
    def split_items(line):
        items = line.split(';')
        # remove all empty items
        while True:
            try:
                items.remove('')
            except ValueError:
                break
        return items

    line = _collab_get_config(apiurl, key, default)
    if line is None:
        return []

    items = split_items(line)
    if not items and default:
        if type(default) == str:
            items = split_items(default)
        else:
            items = default
    return items


#######################################################################


def _collab_migrate_gnome_config(apiurl):
    for key in [ 'archs', 'apiurl', 'email', 'projects' ]:
        if _collab_get_config(apiurl, 'collab_' + key) is not None:
            continue
        elif not ('gnome_' + key) in conf.config:
            continue
        _collab_add_config_option(apiurl, 'collab_' + key, conf.config['gnome_' + key])

    # migrate repo to repos
    if _collab_get_config(apiurl, 'collab_repos') is None and 'gnome_repo' in conf.config:
        _collab_add_config_option(apiurl, 'collab_repos', conf.config['gnome_repo'] + ';')


#######################################################################


def _collab_ensure_email(apiurl):
    email = _collab_get_config(apiurl, 'email')
    if email:
        return email
    email = _collab_get_config(apiurl, 'collab_email')
    if email:
        return email

    email = input('E-mail address to use for .changes entries: ')
    if email == '':
        return 'EMAIL@DOMAIN'

    _collab_add_config_option(apiurl, 'collab_email', email)

    return email


#######################################################################


def _collab_parse_arg_packages(packages):
    def remove_trailing_slash(s):
        if s.endswith('/'):
            return s[:-1]
        return s

    if type(packages) == str:
        return remove_trailing_slash(packages)
    elif type(packages) in [ list, tuple ]:
        return [ remove_trailing_slash(package) for package in packages ]
    else:
        return packages


#######################################################################


@cmdln.alias('gnome')
@cmdln.option('-A', '--apiurl', metavar='URL',
              dest='collapiurl',
              help='url to use to connect to the database (different from the build service server)')
@cmdln.option('--xs', '--exclude-submitted', action='store_true',
              dest='exclude_submitted',
              help='do not show submitted packages in the output')
@cmdln.option('--xr', '--exclude-reserved', action='store_true',
              dest='exclude_reserved',
              help='do not show reserved packages in the output')
@cmdln.option('--xc', '--exclude-commented', action='store_true',
              dest='exclude_commented',
              help='do not show commented packages in the output')
@cmdln.option('--xd', '--exclude-devel', action='store_true',
              dest='exclude_devel',
              help='do not show packages that are up-to-date in their development project in the output')
@cmdln.option('--ic', '--ignore-comments', action='store_true',
              dest='ignore_comments',
              help='ignore the comments')
@cmdln.option('--ir', '--ignore-reserved', action='store_true',
              dest='ignore_reserved',
              help='ignore the reservation state of the package if necessary')
@cmdln.option('--iu', '--include-upstream', action='store_true',
              dest='include_upstream',
              help='include reports about missing upstream data')
@cmdln.option('--nr', '--no-reserve', action='store_true',
              dest='no_reserve',
              help='do not reserve the package')
@cmdln.option('--ns', '--no-supersede', action='store_true',
              dest='no_supersede',
              help='do not supersede requests to the same package')
@cmdln.option('--nu', '--no-unreserve', action='store_true',
              dest='no_unreserve',
              help='do not unreserve the package')
@cmdln.option('--nodevelproject', action='store_true',
              dest='no_devel_project',
              help='do not use development project of the packages')
@cmdln.option('--nobranch', action='store_true',
              dest='no_branch',
              help='do not branch the package in your home project')
@cmdln.option('-m', '--message', metavar='TEXT',
              dest='msg',
              help='specify log message TEXT')
@cmdln.option('-f', '--forward', action='store_true',
              dest='forward',
              help='automatically forward to parent project if successful')
@cmdln.option('--no-details', action='store_true',
              dest='no_details',
              help='do not show more details')
@cmdln.option('--details', action='store_true',
              dest='details',
              help='show more details')
@cmdln.option('--project', metavar='PROJECT', action='append',
              dest='projects', default=[],
              help='project to work on (default: openSUSE:Factory)')
@cmdln.option('--repo', metavar='REPOSITORY', action='append',
              dest='repos', default=[],
              help='build repositories to build on (default: automatic detection)')
@cmdln.option('--arch', metavar='ARCH', action='append',
              dest='archs', default=[],
              help='architectures to build on (default: i586 and x86_64)')
@cmdln.option('--nc', '--no-cache', action='store_true',
              dest='no_cache',
              help='ignore data from the cache')
@cmdln.option('-v', '--version', action='store_true',
              dest='version',
              help='show version of the plugin')
def do_collab(self, subcmd, opts, *args):
    """${cmd_name}: Various commands to ease collaboration on the openSUSE Build Service.

    A tutorial and detailed documentation are available at:
      https://en.opensuse.org/openSUSE:Osc_Collab

    "todo" (or "t") will list the packages that need some action.

    "todoadmin" (or "ta") will list the packages from the project that need
    to be submitted to the parent project, and various other errors or tasks.

    "listreserved" (or "lr") will list the reserved packages.

    "isreserved" (or "ir") will look if a package is reserved.

    "reserve" (or "r") will reserve a package so other people know you're
    working on it.

    "unreserve" (or "u") will remove the reservation you had on a package.

    "listcommented" (or "lc") will list the commented packages.

    "comment" (or "c") will look if a package is commented.

    "commentset" (or "cs") will add to a package a comment you want to share
    with other people.

    "commentunset" (or "cu") will remove the comment you set on a package.

    "setup" (or "s") will prepare a package for work (possibly reservation,
    branch, checking out, etc.). The package will be checked out in the current
    directory.

    "update" (or "up") will prepare a package for update (possibly reservation,
    branch, checking out, download of the latest upstream tarball, .spec
    edition, etc.). The package will be checked out in the current directory.

    "forward" (or "f") will forward a request to the project to parent project.
    This includes the step of accepting the request first.

    "build" (or "b") will commit the local changes of the package in
    the current directory and wait for the build to succeed on the build
    service.

    "buildsubmit" (or "bs") will commit the local changes of the package in
    the current directory, wait for the build to succeed on the build service
    and if the build succeeds, submit the package to the development project.

    Usage:
        osc collab todo [--exclude-submitted|--xs] [--exclude-reserved|--xr] [--exclude-commented|--xc] [--exclude-devel|--xd] [--ignore-comments|--ic] [--details|--no-details] [--project=PROJECT]
        osc collab todoadmin [--include-upstream|--iu] [--project=PROJECT]

        osc collab listreserved
        osc collab isreserved [--nodevelproject] [--project=PROJECT] PKG [...]
        osc collab reserve [--nodevelproject] [--project=PROJECT] PKG [...]
        osc collab unreserve [--nodevelproject] [--project=PROJECT] PKG [...]

        osc collab listcommented
        osc collab comment [--nodevelproject] [--project=PROJECT] PKG [...]
        osc collab commentset [--nodevelproject] [--project=PROJECT] PKG COMMENT
        osc collab commentunset [--nodevelproject] [--project=PROJECT] PKG [...]

        osc collab setup [--ignore-reserved|--ir] [--ignore-comments|--ic] [--no-reserve|--nr] [--nodevelproject] [--nobranch] [--project=PROJECT] PKG
        osc collab update [--ignore-reserved|--ir] [--ignore-comments|--ic] [--no-reserve|--nr] [--nodevelproject] [--nobranch] [--project=PROJECT] PKG

        osc collab forward [--no-supersede|--ns] [--project=PROJECT] ID

        osc collab build [--message=TEXT|-m=TEXT] [--repo=REPOSITORY] [--arch=ARCH]
        osc collab buildsubmit [--forward|-f] [--no-supersede|--ns] [--no-unreserve|--nu] [--message=TEXT|-m=TEXT] [--repo=REPOSITORY] [--arch=ARCH]
    ${cmd_option_list}
    """

    # uncomment this when profiling is needed
    #self.ref = time.time()
    #print("%.3f - %s" % (time.time()-self.ref, 'start'))

    global _osc_collab_alias
    global _osc_collab_osc_conffile

    _osc_collab_alias = 'collab'

    if opts.version:
        print(OSC_COLLAB_VERSION)
        return

    cmds = ['todo', 't', 'todoadmin', 'ta', 'listreserved', 'lr', 'isreserved', 'ir', 'reserve', 'r', 'unreserve', 'u', 'listcommented', 'lc', 'comment', 'c', 'commentset', 'cs', 'commentunset', 'cu', 'setup', 's', 'update', 'up', 'forward', 'f', 'build', 'b', 'buildsubmit', 'bs']
    if not args or args[0] not in cmds:
        raise oscerr.WrongArgs('Unknown %s action. Choose one of %s.' % (_osc_collab_alias, ', '.join(cmds)))

    cmd = args[0]

    # Check arguments validity
    if cmd in ['listreserved', 'lr', 'listcommented', 'lc', 'todo', 't', 'todoadmin', 'ta', 'build', 'b', 'buildsubmit', 'bs']:
        min_args, max_args = 0, 0
    elif cmd in ['setup', 's', 'update', 'up', 'forward', 'f']:
        min_args, max_args = 1, 1
    elif cmd in ['commentset', 'cs']:
        min_args, max_args = 1, 2
    elif cmd in ['isreserved', 'ir', 'reserve', 'r', 'unreserve', 'u', 'comment', 'c', 'commentunset', 'cu']:
        min_args = 1
        max_args = sys.maxsize
    else:
        raise RuntimeError('Unknown command: %s' % cmd)

    if len(args) - 1 < min_args:
        raise oscerr.WrongArgs('Too few arguments.')
    if len(args) - 1 > max_args:
        raise oscerr.WrongArgs('Too many arguments.')

    if opts.details and opts.no_details:
        raise oscerr.WrongArgs('--details and --no-details cannot be used at the same time.')

    apiurl = self.get_api_url()
    user = conf.get_apiurl_usr(apiurl)
    userdata = core.get_user_data(apiurl, user, 'realname', 'email')
    realname = userdata[0]
    email = userdata[1]

    # See get_config() in osc/conf.py and postoptparse() in
    # osc/commandline.py
    if self.options.conffile:
        conffile = self.options.conffile
    else:
        conffile = conf.identify_conf()
    _osc_collab_osc_conffile = os.path.expanduser(conffile)

    if opts.collapiurl:
        collab_apiurl = opts.collapiurl
    else:
        collab_apiurl = _collab_get_config(apiurl, 'collab_apiurl')

    if len(opts.projects) != 0:
        projects = opts.projects
    else:
        projects = _collab_get_config_list(apiurl, 'collab_projects', 'openSUSE:Factory')

    if len(opts.repos) != 0:
        repos = opts.repos
    else:
        repos = _collab_get_config_list(apiurl, 'collab_repos', '!autodetect!')

    if len(opts.archs) != 0:
        archs = opts.archs
    else:
        archs = _collab_get_config_list(apiurl, 'collab_archs', 'i586;x86_64;')

    details = _collab_get_config_bool(apiurl, 'collab_details', False)
    if details and opts.no_details:
        details = False
    elif not details and opts.details:
        details = True

    OscCollabApi.init(collab_apiurl)
    OscCollabCache.init(opts.no_cache)
    OscCollabObs.init(apiurl)

    # Do the command
    if cmd in ['todo', 't']:
        _collab_todo(apiurl, projects, details, opts.ignore_comments, opts.exclude_commented, opts.exclude_reserved, opts.exclude_submitted, opts.exclude_devel)

    elif cmd in ['todoadmin', 'ta']:
        _collab_todoadmin(apiurl, projects, opts.include_upstream)

    elif cmd in ['listreserved', 'lr']:
        _collab_listreserved(projects)

    elif cmd in ['isreserved', 'ir']:
        packages = _collab_parse_arg_packages(args[1:])
        _collab_isreserved(projects, packages, no_devel_project = opts.no_devel_project)

    elif cmd in ['reserve', 'r']:
        packages = _collab_parse_arg_packages(args[1:])
        _collab_reserve(projects, packages, user, no_devel_project = opts.no_devel_project)

    elif cmd in ['unreserve', 'u']:
        packages = _collab_parse_arg_packages(args[1:])
        _collab_unreserve(projects, packages, user, no_devel_project = opts.no_devel_project)

    elif cmd in ['listcommented', 'lc']:
        _collab_listcommented(projects)

    elif cmd in ['comment', 'c']:
        packages = _collab_parse_arg_packages(args[1:])
        _collab_comment(projects, packages, no_devel_project = opts.no_devel_project)

    elif cmd in ['commentset', 'cs']:
        packages = _collab_parse_arg_packages(args[1])
        if len(args) - 1 == 1:
            comment = edit_message()
        else:
            comment = args[2]
        _collab_commentset(projects, packages, user, comment, no_devel_project = opts.no_devel_project)

    elif cmd in ['commentunset', 'cu']:
        packages = _collab_parse_arg_packages(args[1:])
        _collab_commentunset(projects, packages, user, no_devel_project = opts.no_devel_project)

    elif cmd in ['setup', 's']:
        package = _collab_parse_arg_packages(args[1])
        _collab_setup(apiurl, user, projects, package, ignore_reserved = opts.ignore_reserved, ignore_comment = opts.ignore_comments, no_reserve = opts.no_reserve, no_devel_project = opts.no_devel_project, no_branch = opts.no_branch)

    elif cmd in ['update', 'up']:
        package = _collab_parse_arg_packages(args[1])
        _collab_update(apiurl, user, realname, email, projects, package, ignore_reserved = opts.ignore_reserved, ignore_comment = opts.ignore_comments, no_reserve = opts.no_reserve, no_devel_project = opts.no_devel_project, no_branch = opts.no_branch)

    elif cmd in ['forward', 'f']:
        request_id = args[1]
        _collab_forward(apiurl, user, projects, request_id, no_supersede = opts.no_supersede)

    elif cmd in ['build', 'b']:
        _collab_build(apiurl, user, projects, opts.msg, repos, archs)

    elif cmd in ['buildsubmit', 'bs']:
        _collab_build_submit(apiurl, user, projects, opts.msg, repos, archs, forward = opts.forward, no_unreserve = opts.no_unreserve, no_supersede = opts.no_supersede)

    else:
        raise RuntimeError('Unknown command: %s' % cmd)
07070100000006000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000002200000000osc-plugin-collab-0.104+30/server07070100000007000081A40000000000000000000000016548EB8C00001283000000000000000000000000000000000000002900000000osc-plugin-collab-0.104+30/server/READMEThis directory contains the script used to create the server-side
database used by the osc collab plugin, as well as the web API used by
the plugin.

All data will be created in ./cache/ unless the cache-dir option
in a configuration file is set. You can choose the configuration file
via the -o command-line option, or via the OBS_OPTIONS environment
variable. The format of the file is the one described in
obs-db/data/defaults.conf.
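
For instance, to point the scripts to a custom configuration file (the
path below is only an example):

 OBS_OPTIONS=/path/to/my.conf ./obs-db/runme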

There are runme scripts that should help people get started. Ideally,
those scripts would be run in cron jobs:

 ./obs-db/runme: a good period should be between every 10 minutes and
                 every 30 minutes (it has to be tweaked)

 ./obs-db/runme-attributes: a good period should be every 30 minutes

 ./upstream/runme: a good period should be every 30 minutes

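As an illustration, assuming the checkout lives in /srv/osc-collab (an
arbitrary path), the crontab entries could look like:

 */20 * * * * cd /srv/osc-collab && ./obs-db/runme
 */30 * * * * cd /srv/osc-collab && ./obs-db/runme-attributes
 */30 * * * * cd /srv/osc-collab && ./upstream/runme
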

Note: if the script seems to be hanging forever, then it's likely the
following python bug: http://bugs.python.org/issue5103
See the analysis at https://bugzilla.novell.com/show_bug.cgi?id=525295
The solution is to edit ssl.py, as mentioned in the above bugs.


#######################################################################


obs-db/
    Contains a script that checks out the relevant data from the
    build service and creates a database for osc collab. It was
    designed to not be too expensive in resources for the build service,
    by using hermes to know what has changed in the past.

    The default configuration file data/defaults.conf is a good
    introduction to how to use this script (what to check out, eg). An
    example configuration, data/opensuse.conf, is available: it is the
    one used to create the osc collab openSUSE database, which means it
    will check out a lot of projects.

    All the data will be put in a cache subdirectory, unless the
    cache-dir option is set in a configuration file.


    Here is a quick overview of the structure of the code:

        data/defaults.conf:
        Configuration file documenting the configuration format and the
        default values.

        data/opensuse.conf:
        Configuration file used to create the database for the osc
        collab openSUSE database. It can be used with the --opensuse
        option of obs-db, or with the -s option of runme.

        buildservice.py:
        Checks out data from the build service.

        config.py:
        Handles the configuration.

        database.py:
        Creates a database based on the checkout and the upstream data.

        hermes.py:
        Looks at hermes feeds to know what has changed.

        obs-db:
        Executable script that will call shell.py.

        obs-manual-checkout:
        Executable script to manually download a project or package,
        in the same way as obs-db does.

        obs-upstream-attributes:
        Executable script to update the openSUSE:UpstreamVersion and
        openSUSE:UpstreamTarballURL attributes in the build service with the
        upstream data we have.

        osc_copy.py:
        Some convenience functions copied from osc.

        runme:
        Small helper to launch the script.

        runme-attributes:
        Small helper to launch the obs-upstream-attributes script.

        shell.py:
        Shell part of the script.

        shellutils.py:
        Miscellaneous functions to create an application using those modules.

        TODO:
        List of things to do :-)

        upstream.py:
        Creates a database about upstream data.

        util.py:
        Miscellaneous functions that make life easier.


openSUSE-setup/
    Contains a near real-life example of how the server is set up for
    osc-collab.opensuse.org. See openSUSE-setup/README for more
    information.


upstream/
    download-upstream-versions:
        Download information about the latest versions of various
        modules that are not hosted on ftp.gnome.org.

    runme:
        Small helper to update the upstream data.

    upstream-limits.txt:
        Information about limits for upstream versions. We might not
        want to look for unstable versions of a module, eg.

    upstream-packages-match.txt:
        Data file that matches upstream GNOME module names to openSUSE
        source package names.

    upstream-tarballs.txt:
        Information about where to find the latest upstream tarballs.


web/
    Contains the web API that the osc collab plugin connects to. To
    deploy it, simply install the files on a web server, copy
    libdissector/config.py.in to libdissector/config.py and edit it for
    the configuration.
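
    For instance:

        cp libdissector/config.py.in libdissector/config.py
        $EDITOR libdissector/config.py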

    Note that there is a .htaccess file in libdissector/. You will want
    to check it will not be ignored because of the apache configuration
    before deploying.
07070100000008000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000002900000000osc-plugin-collab-0.104+30/server/obs-db07070100000009000081A40000000000000000000000016548EB8C00000548000000000000000000000000000000000000002E00000000osc-plugin-collab-0.104+30/server/obs-db/TODOhermes:
 + when there's a commit for a package, we should create a fake commit event
   for all packages linking to this one, so that they get updated too.

buildservice:
 + if we queue something twice, then we will really do it twice. We should have
   a way to queue, and then strip the useless elements to minimize data.
 + a project configuration cache can be written twice (eg, when a project is
   devel project for two different projects). This should be avoided since we
   probably expect the first write to win (assuming the first write matches the
   first project in obs.conf).
 + in check_project(), we should probably also look at files that are checked
   out for a package and remove the ones that shouldn't be there.

database:
 + only do one update call if there was a commit and meta change for a package
 + provide the API needed by infoxml so that it doesn't need to do any SQL
   query?

upstream:
 + add a removed table: right now, obs-upstream-attributes has no way to know
   what upstream metadata got removed, which might result in stale attributes.

general:
!+ add a --deep-check mode. It would use the buildservice.py infrastructure to
   browse all packages of all projects, and check that the files all have the
   right md5. If not, it would queue packages for the next run. Or create tool
   called obs-offline-check.
0707010000000A000081A40000000000000000000000016548EB8C00009E60000000000000000000000000000000000000003900000000osc-plugin-collab-0.104+30/server/obs-db/buildservice.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import bisect
import errno
import hashlib
import optparse
import shutil
import socket
import tempfile
import time
from osc import core
import urllib.parse, urllib.error

import queue
import threading

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

import osc_copy
import util

# Timeout for sockets
SOCKET_TIMEOUT = 30

# Debug output?
USE_DEBUG = False
DEBUG_DIR = 'debug'


#######################################################################


def debug_thread(context, state, indent = '', use_remaining = False):
    global USE_DEBUG
    global DEBUG_DIR

    if not USE_DEBUG:
        return

    name = threading.current_thread().name

    if context == 'main':
        print('%s%s: %s' % (indent, name, state))
        return

    try:
        util.safe_mkdir_p(DEBUG_DIR)
        fout = open(os.path.join(DEBUG_DIR, 'buildservice-' + name), 'a')

        # ignore indent since we write in files
        fout.write('[%s] %s %s\n' % (context, time.strftime("%H:%M:%S", time.localtime()), state))

        if use_remaining:
            remaining = ''
            for i in threading.enumerate():
                remaining += i.name + ', '
            fout.write('Remaining: %s\n' % (remaining,))
        fout.close()
    except Exception as e:
        print('Exception in debug_thread: %s' % (e,), file=sys.stderr)


def socket_closer_thread_run(obs_checkout, empty_event):
    empty_is_set = empty_event.is_set

    while True:
        if empty_is_set():
            break

        obs_checkout.socket_timeouts_acquire()

        # Find the socket that is supposed to be closed the first, so we can
        # monitor it
        while True:
            if not len(obs_checkout.socket_timeouts):
                (max_time, current_socket) = (0, None)
                break

            (max_time, current_socket, url) = obs_checkout.socket_timeouts[0]

            if time.time() + SOCKET_TIMEOUT + 1 < max_time:
                debug_thread('monitor', 'closing socket for %s (too far)' % url)
                # close this socket: the max time is way too high
                current_socket.close()
                obs_checkout.socket_timeouts.remove((max_time, current_socket, url))
            else:
                break

        obs_checkout.socket_timeouts_release()

        # There's a socket to monitor, let's just do it
        if max_time > 0:

            while time.time() < max_time:
                time.sleep(1)
                # This is not thread-safe, but it can only go from False to
                # True.
                # If the value is still False, then we'll just try another
                # time (and worst case: we exit the loop because of the
                # timeout, but then we acquire the lock so the end will be
                # thread-safe: we won't close it twice).
                # If the value is True, then it's really closed anyway.
                if not current_socket.fp or current_socket.fp.closed:
                    break

            obs_checkout.socket_timeouts_acquire()
            if time.time() >= max_time:
                debug_thread('monitor', 'closing socket for %s (timed out)' % url)
                current_socket.close()
            if (max_time, current_socket, url) in obs_checkout.socket_timeouts:
                obs_checkout.socket_timeouts.remove((max_time, current_socket, url))
            obs_checkout.socket_timeouts_release()

        else:
            # There's no socket to monitor at the moment, so we wait for one to
            # appear or for the notification of the end of the work.
            # We use less than the socket timeout value as timeout so we are
            # sure to not start too late the monitoring of the next socket (so
            # we don't allow a socket to stay more than its timeout).
            empty_event.wait(SOCKET_TIMEOUT / 2)


def obs_checkout_thread_run(obs_checkout):
    try:
        while True:
            debug_thread('thread_loop', 'start loop', use_remaining = True)
            if obs_checkout.queue.empty():
                break

            debug_thread('thread_loop', 'getting work...')
            # we don't want to block: the queue is filled at the beginning and
            # once it's empty, then it means we're done. So we want the
            # exception to happen.
            (project, package, meta) = obs_checkout.queue.get(block = False)
            debug_thread('main', 'starting %s/%s (meta: %d)' % (project, package, meta))

            try:
                debug_thread('thread_loop', 'work = %s/%s (meta: %d)' % (project, package, meta))
                if not package:
                    if meta:
                        obs_checkout.checkout_project_pkgmeta(project)
                    else:
                        obs_checkout.check_project(project)
                else:
                    if meta:
                        obs_checkout.checkout_package_meta(project, package)
                    else:
                        obs_checkout.checkout_package(project, package)
                debug_thread('thread_loop', 'work done')
            except Exception as e:
                print('Exception in worker thread for %s/%s (meta: %d): %s' % (project, package, meta, e), file=sys.stderr)

            obs_checkout.queue.task_done()
            debug_thread('thread_loop', 'end loop', use_remaining = True)
    except queue.Empty:
        pass
    debug_thread('thread_loop', 'exit loop', use_remaining = True)


#######################################################################


class ObsCheckout:

    def __init__(self, conf, dest_dir):
        global USE_DEBUG
        global DEBUG_DIR
        global SOCKET_TIMEOUT

        USE_DEBUG = conf.debug
        DEBUG_DIR = os.path.join(conf.cache_dir, 'debug')
        SOCKET_TIMEOUT = conf.threads_sockettimeout

        self.conf = conf
        self.dest_dir = dest_dir

        self.queue = queue.Queue()
        self.queue2 = queue.Queue()
        self.error_queue = queue.Queue()
        self.errors = set()
        self.socket_timeouts = []
        self.socket_timeouts_lock = None


    def socket_timeouts_acquire(self):
        if self.socket_timeouts_lock:
            debug_thread('lock', 'acquiring lock')
            self.socket_timeouts_lock.acquire()
            debug_thread('lock', 'acquired lock')


    def socket_timeouts_release(self):
        if self.socket_timeouts_lock:
            self.socket_timeouts_lock.release()
            debug_thread('lock', 'released lock')


    def _download_url_to_file(self, url, file):
        """ Download url to file.

            Return the length of the downloaded file.

        """
        fin = None
        fout = None
        timeout = 0
        length = 0
        try:
            fin = core.http_GET(url)
            # write the raw bytes, so that the md5 of the file on disk matches
            # the one advertised by the build service
            fout = open(file, 'wb')

            data = fin.read()
            length = len(data)
            fout.write(data)
            fout.close()

            return length

        except Exception as e:
            debug_thread('url', 'exception: %s' % (e,), ' ')

            self.socket_timeouts_acquire()
            if (timeout, fin, url) in self.socket_timeouts:
                self.socket_timeouts.remove((timeout, fin, url))
            if fin:
                fin.close()
            self.socket_timeouts_release()

            if fout:
                fout.close()
            raise


    def _get_file(self, project, package, filename, size, revision = None, try_again = True):
        """ Download a file of a package. """
        package_dir = os.path.join(self.dest_dir, project, package)
        destfile = os.path.join(package_dir, filename)
        tmpdestfile = destfile + '.new'

        try:
            query = None
            if revision:
                query = { 'rev': revision }
            url = osc_copy.makeurl(self.conf.apiurl, ['public', 'source', project, package, urllib.request.pathname2url(filename)], query=query)
            length = self._download_url_to_file(url, tmpdestfile)

            # the size in the metadata is a string
            if size is not None and length != int(size):
                if try_again:
                    util.safe_unlink(tmpdestfile)
                    return self._get_file(project, package, filename, size, revision, False)

            os.rename(tmpdestfile, destfile)

        except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
            util.safe_unlink(tmpdestfile)

            if isinstance(e, urllib.error.HTTPError) and e.code == 404:
                print('File %s in package %s of project %s doesn\'t exist.' % (filename, package, project), file=sys.stderr)
            elif try_again:
                self._get_file(project, package, filename, size, revision, False)
            else:
                print('Cannot get file %s for %s from %s: %s (queueing for next run)' % (filename, package, project, e), file=sys.stderr)
                self.error_queue.put((project, package))

            return


    def _get_files_metadata(self, project, package, save_basename, revision = None, try_again = True):
        """ Download the file list of a package. """
        package_dir = os.path.join(self.dest_dir, project, package)
        filename = os.path.join(package_dir, save_basename)
        tmpfilename = filename + '.new'

        # download files metadata
        try:
            query = None
            if revision:
                query = { 'rev': revision }
            url = osc_copy.makeurl(self.conf.apiurl, ['public', 'source', project, package], query=query)
            length = self._download_url_to_file(url, tmpfilename)

            if length == 0:
                # metadata files should never be empty
                if try_again:
                    util.safe_unlink(tmpfilename)
                    return self._get_files_metadata(project, package, save_basename, revision, False)

            os.rename(tmpfilename, filename)

        except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
            util.safe_unlink(tmpfilename)

            if isinstance(e, urllib.error.HTTPError) and e.code == 404:
                print('Package %s doesn\'t exist in %s.' % (package, project), file=sys.stderr)
            elif try_again:
                return self._get_files_metadata(project, package, save_basename, revision, False)
            elif revision:
                print('Cannot download file list of %s from %s with specified revision: %s' % (package, project, e), file=sys.stderr)
            else:
                print('Cannot download file list of %s from %s: %s (queueing for next run)' % (package, project, e), file=sys.stderr)
                self.error_queue.put((project, package))

            return None

        try:
            return ET.parse(filename).getroot()
        except SyntaxError as e:
            if try_again:
                os.unlink(filename)
                return self._get_files_metadata(project, package, save_basename, revision, False)
            elif revision:
                print('Cannot parse file list of %s from %s with specified revision: %s' % (package, project, e), file=sys.stderr)
            else:
                print('Cannot parse file list of %s from %s: %s' % (package, project, e), file=sys.stderr)
            return None


    def _get_package_metadata_cache(self, project, package):
        """ Get the (md5, mtime) metadata from currently checkout data.

            We take the metadata from the expanded link first, and also loads
            the metadata from the non-expanded link (which overloads the
            previous one).

        """
        def add_metadata_from_file(file, cache):
            if not os.path.exists(file):
                return

            try:
                root = ET.parse(file).getroot()
            except SyntaxError:
                return

            # also get the md5 of the directory
            cache[os.path.basename(file)] = (root.get('srcmd5'), '')

            for node in root.findall('entry'):
                cache[node.get('name')] = (node.get('md5'), node.get('mtime'))

        package_dir = os.path.join(self.dest_dir, project, package)
        cache = {}

        files = os.path.join(package_dir, '_files-expanded')
        add_metadata_from_file(files, cache)
        files = os.path.join(package_dir, '_files')
        add_metadata_from_file(files, cache)

        return cache


    def _get_hash_from_file(self, algo, path):
        """ Return the hash of a file, using the specified algorithm. """
        if not os.path.exists(path):
            return None

        if algo not in [ 'md5' ]:
            print('Internal error: _get_hash_from_file called with unknown hash algorithm: %s' % algo, file=sys.stderr)
            return None

        digest = hashlib.new(algo)
        with open(path, 'rb') as f:
            while True:
                data = f.read(32768)
                if not data:
                    break
                digest.update(data)
        return digest.hexdigest()


    def _get_package_file_checked_out(self, project, package, filename, cache, md5, mtime):
        """ Tells if a file of the package is already checked out. """
        if filename not in cache:
            return False
        if cache[filename] != (md5, mtime):
            return False

        path = os.path.join(self.dest_dir, project, package, filename)
        file_md5 = self._get_hash_from_file('md5', path)
        return file_md5 is not None and file_md5 == md5


    def _cleanup_package_old_files(self, project, package, downloaded_files):
        """ Function to remove old files that should not be in a package
            checkout anymore.

            This should be called before all return statements in
            checkout_package. 

        """
        package_dir = os.path.join(self.dest_dir, project, package)
        for file in os.listdir(package_dir):
            if file in downloaded_files:
                continue
            os.unlink(os.path.join(package_dir, file))


    def checkout_package(self, project, package):
        """ Checks out a package.

            We use the files already checked out as a cache, to avoid
            downloading the same files again if possible.

            This means we need to make sure to remove all files that shouldn't
            be there when leaving this function. This is done with the calls to
            _cleanup_package_old_files().

        """
        if not package:
            print('Internal error: checkout_package called instead of checkout_project_pkgmeta', file=sys.stderr)
            self.checkout_project_pkgmeta(project)
            return

        package_dir = os.path.join(self.dest_dir, project, package)
        util.safe_mkdir_p(package_dir)

        # Never remove _meta files, since they're not handled by the checkout process
        downloaded_files = [ '_meta' ]

        metadata_cache = self._get_package_metadata_cache(project, package)

        # find files we're interested in from the metadata
        root = self._get_files_metadata(project, package, '_files')
        downloaded_files.append('_files')
        if root is None:
            self._cleanup_package_old_files(project, package, downloaded_files)
            return

        is_link = False
        link_error = False
        # revision to expand a link
        link_md5 = None

        # detect if the package is a link package
        linkinfos_nb = len(root.findall('linkinfo'))
        if linkinfos_nb == 1:
            link_node = root.find('linkinfo')
            # The logic is taken from islink() in osc/core.py
            is_link = link_node.get('xsrcmd5') not in [ None, '' ] or link_node.get('lsrcmd5') not in [ None, '' ]
            link_error = link_node.get('error') not in [ None, '' ]
            link_md5 = link_node.get('xsrcmd5')
        elif linkinfos_nb > 1:
            print('Ignoring link in %s from %s: more than one <linkinfo>' % (package, project), file=sys.stderr)

        if is_link:
            # download the _link file first. This makes it possible to know if
            # the project has a delta compared to the target of the link
            for node in root.findall('entry'):
                filename = node.get('name')
                md5 = node.get('md5')
                mtime = node.get('mtime')
                size = node.get('size')
                if filename == '_link':
                    if not self._get_package_file_checked_out(project, package, filename, metadata_cache, md5, mtime):
                        self._get_file(project, package, filename, size)
                    downloaded_files.append(filename)

            # if the link has an error, then we can't do anything else since we
            # won't be able to expand
            if link_error:
                self._cleanup_package_old_files(project, package, downloaded_files)
                return

            # look if we need to download the metadata of the expanded package
            if '_files-expanded' in metadata_cache and metadata_cache['_files-expanded'][0] == link_md5:
                files = os.path.join(self.dest_dir, project, package, '_files-expanded')
                try:
                    root = ET.parse(files).getroot()
                except SyntaxError:
                    root = None
            else:
                root = self._get_files_metadata(project, package, '_files-expanded', link_md5)

            if root is None:
                self._cleanup_package_old_files(project, package, downloaded_files)
                return

            downloaded_files.append('_files-expanded')

        # look at all files and download what might be interesting
        for node in root.findall('entry'):
            filename = node.get('name')
            md5 = node.get('md5')
            mtime = node.get('mtime')
            size = node.get('size')
            # download .spec files
            if filename.endswith('.spec'):
                if not self._get_package_file_checked_out(project, package, filename, metadata_cache, md5, mtime):
                    self._get_file(project, package, filename, size, link_md5)
                downloaded_files.append(filename)

        self._cleanup_package_old_files(project, package, downloaded_files)


    def checkout_package_meta(self, project, package, try_again = True):
        """ Checks out the metadata of a package.
        
            If we're interested in devel projects of this project, and the
            devel package is not in a checked out devel project, then we queue
            a checkout of this devel project.

        """
        package_dir = os.path.join(self.dest_dir, project, package)
        util.safe_mkdir_p(package_dir)

        filename = os.path.join(package_dir, '_meta')
        tmpfilename = filename + '.new'

        try:
            url = osc_copy.makeurl(self.conf.apiurl, ['public', 'source', project, package, '_meta'])
            length = self._download_url_to_file(url, tmpfilename)

            if length == 0:
                # metadata files should never be empty
                if try_again:
                    util.safe_unlink(tmpfilename)
                    return self.checkout_package_meta(project, package, False)

            os.rename(tmpfilename, filename)

        except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
            util.safe_unlink(tmpfilename)

            if isinstance(e, urllib.error.HTTPError) and e.code == 404:
                print('Package %s of project %s doesn\'t exist.' % (package, project), file=sys.stderr)
            elif try_again:
                self.checkout_package_meta(project, package, False)
            else:
                print('Cannot get metadata of package %s in %s: %s (queueing for next run)' % (package, project, e), file=sys.stderr)
                self.error_queue.put((project, package))

            return

        # Are we interested in devel projects of this project, and if yes,
        # should we check out the devel project if needed?
        if project not in self.conf.projects:
            return
        if not self.conf.projects[project].checkout_devel_projects:
            return

        try:
            package_node = ET.parse(filename).getroot()
        except SyntaxError:
            return
 
        devel_node = package_node.find('devel')
        if devel_node is None:
            return

        devel_project = devel_node.get('project')
        project_dir = os.path.join(self.dest_dir, devel_project)
        if not os.path.exists(project_dir):
            self.queue_checkout_project(devel_project, parent = project, primary = False)


    def check_project(self, project, try_again = True):
        """ Checks if the current checkout of a project is up-to-date, and queue task if necessary. """
        project_dir = os.path.join(self.dest_dir, project)
        util.safe_mkdir_p(project_dir)

        filename = os.path.join(project_dir, '_status')

        try:
            url = osc_copy.makeurl(self.conf.apiurl, ['status', 'project', project])
            length = self._download_url_to_file(url, filename)

            if length == 0:
                # metadata files should never be empty
                if try_again:
                    util.safe_unlink(filename)
                    return self.check_project(project, False)

        except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
            util.safe_unlink(filename)

            if isinstance(e, urllib.error.HTTPError):
                if e.code == 404:
                    print('Project %s doesn\'t exist.' % (project,), file=sys.stderr)
                elif e.code == 400:
                    # the status page doesn't always work :/
                    self.queue_checkout_project(project, primary = False, force_simple_checkout = True, no_config = True)
            elif try_again:
                self.check_project(project, False)
            else:
                print('Cannot get status of %s: %s' % (project, e), file=sys.stderr)

            return

        try:
            packages_node = ET.parse(filename).getroot()
        except SyntaxError as e:
            util.safe_unlink(filename)

            if try_again:
                return self.check_project(project, False)
            else:
                print('Cannot parse status of %s: %s' % (project, e), file=sys.stderr)

            return

        # We will have to remove all subdirectories that just don't belong to
        # this project anymore.
        subdirs_to_remove = [ file for file in os.listdir(project_dir) if os.path.isdir(os.path.join(project_dir, file)) ]

        # Here's what we check to know if a package needs to be checked out again:
        #  - if there's no subdir
        #  - if it's a link:
        #    - check that the md5 from the status is the xsrcmd5 from the file
        #      list
        #    - check that we have _files-expanded and that all spec files are
        #      checked out
        #  - if it's not a link: check that the md5 from the status is the
        #    srcmd5 from the file list
        for node in packages_node.findall('package'):
            name = node.get('name')
            srcmd5 = node.get('srcmd5')
            is_link = len(node.findall('link')) > 0

            try:
                subdirs_to_remove.remove(name)
            except ValueError:
                pass

            files = os.path.join(project_dir, name, '_files')
            if not os.path.exists(files):
                self.queue_checkout_package(project, name, primary = False)
                continue

            try:
                files_root = ET.parse(files).getroot()
            except SyntaxError:
                self.queue_checkout_package(project, name, primary = False)
                continue

            if is_link:
                previous_srcmd5 = files_root.get('xsrcmd5')
            else:
                previous_srcmd5 = files_root.get('srcmd5')

            if srcmd5 != previous_srcmd5:
                self.queue_checkout_package(project, name, primary = False)
                continue

            # make sure we have all spec files

            if is_link:
                # for links, we open the list of files when expanded
                files = os.path.join(project_dir, name, '_files-expanded')
                if not os.path.exists(files):
                    self.queue_checkout_package(project, name, primary = False)
                    continue

                try:
                    files_root = ET.parse(files).getroot()
                except SyntaxError:
                    self.queue_checkout_package(project, name, primary = False)
                    continue

            cont = False
            for entry in files_root.findall('entry'):
                filename = entry.get('name')
                if filename.endswith('.spec'):
                    specfile = os.path.join(project_dir, name, filename)
                    if not os.path.exists(specfile):
                        self.queue_checkout_package(project, name, primary = False)
                        cont = True
                        break
            if cont:
                continue

        # Remove useless subdirectories
        for subdir in subdirs_to_remove:
            shutil.rmtree(os.path.join(project_dir, subdir))

        util.safe_unlink(filename)


    def checkout_project_pkgmeta(self, project, try_again = True):
        """ Checks out the packages metadata of all packages in a project. """
        project_dir = os.path.join(self.dest_dir, project)
        util.safe_mkdir_p(project_dir)

        filename = os.path.join(project_dir, '_pkgmeta')
        tmpfilename = filename + '.new'

        try:
            url = osc_copy.makeurl(self.conf.apiurl, ['search', 'package'], ['match=%s' % urllib.parse.quote('@project=\'%s\'' % project)])
            length = self._download_url_to_file(url, tmpfilename)

            if length == 0:
                # metadata files should never be empty
                if try_again:
                    util.safe_unlink(tmpfilename)
                    return self.checkout_project_pkgmeta(project, False)

            os.rename(tmpfilename, filename)

        except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
            util.safe_unlink(tmpfilename)

            if isinstance(e, urllib.error.HTTPError) and e.code == 404:
                print('Project %s doesn\'t exist.' % (project,), file=sys.stderr)
            elif try_again:
                self.checkout_project_pkgmeta(project, False)
            else:
                print('Cannot get packages metadata of %s: %s' % (project, e), file=sys.stderr)

            return


    def _run_helper(self):
        if self.socket_timeouts:
            print('Internal error: list of socket timeouts is not empty before running', file=sys.stderr)
            return
        # queue is empty or does not exist: it could be that the requested
        # project does not exist
        if self.queue.empty():
            return

        debug_thread('main', 'queue has %d items' % self.queue.qsize())

        if self.conf.threads > 1:
            # Architecture with threads:
            #  + we fill a queue with all the tasks that have to be done
            #  + the main thread does nothing until the queue is empty
            #  + we create a bunch of threads that will take the tasks from the
            #    queue
            #  + we create a monitor thread that ensures that the socket
            #    connections from the other threads don't hang forever. The
            #    issue is that those threads use urllib2, and urllib2 will
            #    remove the timeout from the underlying socket. (see
            #    socket.makefile() documentation)
            #  + there's an event between the main thread and the monitor
            #    thread to announce to the monitor thread that the queue is
            #    empty and that it can leave.
            #  + once the queue is empty:
            #    - the helper threads all exit since there's nothing left to do
            #    - the main thread is woken up and sends an event to the
            #      monitor thread, then waits for it to exit.
            #    - the monitor thread receives the event and exits.
            #    - the main thread can continue towards the end of the process.

            # this is used to signal the monitor thread it can exit
            empty_event = threading.Event()
            # this is the lock for the data shared between the threads
            self.socket_timeouts_lock = threading.Lock()

            if SOCKET_TIMEOUT > 0:
                monitor = threading.Thread(target=socket_closer_thread_run, args=(self, empty_event))
                monitor.start()

            thread_args = (self,)
            for i in range(min(self.conf.threads, self.queue.qsize())):
                t = threading.Thread(target=obs_checkout_thread_run, args=thread_args)
                t.start()

            self.queue.join()
            # tell the monitor thread to quit and wait for it
            empty_event.set()
            if SOCKET_TIMEOUT > 0:
                monitor.join()
        else:
            try:
                while not self.queue.empty():
                    (project, package, meta) = self.queue.get(block = False)
                    debug_thread('main', 'starting %s/%s' % (project, package))
                    if not package:
                        if meta:
                            self.checkout_project_pkgmeta(project)
                        else:
                            self.check_project(project)
                    else:
                        if meta:
                            self.checkout_package_meta(project, package)
                        else:
                            self.checkout_package(project, package)
            except queue.Empty:
                pass

        # secondary queue is not empty, so we do a second run
        if not self.queue2.empty():
            debug_thread('main', 'Working on second queue')
            self.queue = self.queue2
            self.queue2 = queue.Queue()
            self._run_helper()


    def run(self):
        # we need a helper since the helper can call itself, and we want to
        # look at the error queue at the very end
        self._run_helper()

        self.errors.clear()
        while not self.error_queue.empty():
            (project, package) = self.error_queue.get()
            self.errors.add((project, package or ''))


    def _write_project_config(self, project):
        """ We need to write the project config to a file, because nothing
            remembers if a project is a devel project, and from which project
            it is, so it's impossible to know what settings should apply
            without such a file. """
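        # Resulting _obs-db-options file (illustrative values):
        #   parent=openSUSE:Factory
        #   branches=latest,fallback
        #   force-project-parent=0
        #   lenient-delta=0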
        if project not in self.conf.projects:
            return

        project_dir = os.path.join(self.dest_dir, project)
        util.safe_mkdir_p(project_dir)

        filename = os.path.join(project_dir, '_obs-db-options')

        fout = open(filename, 'w')
        fout.write('parent=%s\n' % self.conf.projects[project].parent)
        fout.write('branches=%s\n' % ','.join(self.conf.projects[project].branches))
        fout.write('force-project-parent=%d\n' % self.conf.projects[project].force_project_parent)
        fout.write('lenient-delta=%d\n' % self.conf.projects[project].lenient_delta)
        fout.close()


    def _copy_project_config(self, project, copy_from):
        from_file = os.path.join(self.dest_dir, copy_from, '_obs-db-options')
        if not os.path.exists(from_file):
            return

        project_dir = os.path.join(self.dest_dir, project)
        util.safe_mkdir_p(project_dir)

        filename = os.path.join(project_dir, '_obs-db-options')
        shutil.copy(from_file, filename)


    def _get_packages_in_project(self, project, try_again = True):
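        """ Return a (packages, error) tuple.

            Exactly one of the two items is None: on success, the list of
            package names in the project; on failure, an error message.

        """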
        project_dir = os.path.join(self.dest_dir, project)
        util.safe_mkdir_p(project_dir)

        filename = os.path.join(project_dir, '_pkglist')

        try:
            url = osc_copy.makeurl(self.conf.apiurl, ['public', 'source', project])
            length = self._download_url_to_file(url, filename)

            if length == 0:
                # metadata files should never be empty
                if try_again:
                    util.safe_unlink(filename)
                    return self._get_packages_in_project(project, False)

        except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
            util.safe_unlink(filename)

            if isinstance(e, urllib.error.HTTPError) and e.code == 404:
                return (None, 'Project %s doesn\'t exist.' % (project,))
            elif try_again:
                return self._get_packages_in_project(project, False)
            else:
                return (None, str(e))

        try:
            root = ET.parse(filename).getroot()
        except SyntaxError as e:
            util.safe_unlink(filename)

            if try_again:
                return self._get_packages_in_project(project, False)
            else:
                return (None, 'Cannot parse list of packages in %s: %s' % (project, e))

        packages = [ node.get('name') for node in root.findall('entry') ]
        util.safe_unlink(filename)

        return (packages, None)


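    # All the queue_* helpers below put (project, package, meta) tuples in a
    # work queue: package == '' means a project-level task, meta selects the
    # metadata variant of the task, and primary selects the first-run queue
    # (self.queue) over the second-run queue (self.queue2).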
    def queue_pkgmeta_project(self, project, primary = True):
        if primary:
            q = self.queue
        else:
            q = self.queue2

        q.put((project, '', True))


    def queue_check_project(self, project, primary = True):
        if primary:
            q = self.queue
        else:
            q = self.queue2

        q.put((project, '', False))


    def queue_checkout_package_meta(self, project, package, primary = True):
        if primary:
            q = self.queue
        else:
            q = self.queue2

        q.put((project, package, True))


    def queue_checkout_package(self, project, package, primary = True):
        if primary:
            q = self.queue
        else:
            q = self.queue2

        q.put((project, package, False))


    def queue_checkout_packages(self, project, packages, primary = True):
        if primary:
            q = self.queue
        else:
            q = self.queue2

        for package in packages:
            q.put((project, package, False))


    def queue_checkout_project(self, project, parent = None, primary = True, force_simple_checkout = False, no_config = False):
        """ Queue a checkout of a project.

            If there's already a checkout for this project, instead of a full
            checkout, a check of what is locally on disk and what should be
            there will be done to only update what is necessary.

            force_simple_checkout is used when what is needed is really just a
            checkout of this project, and nothing else (no metadata for all
            packages, and no devel projects).

        """
        project_dir = os.path.join(self.dest_dir, project)

        # Check now whether the directory exists, since we might create it
        # while creating the project config
        exists = os.path.exists(project_dir)

        if not no_config:
            if parent:
                self._copy_project_config(project, parent)
            else:
                self._write_project_config(project)

        if exists and not force_simple_checkout:
            debug_thread('main', 'Queuing check for %s' % (project,))
            self.queue_check_project(project, primary)
        else:
            debug_thread('main', 'Queuing packages of %s' % (project,))
            (packages, error) = self._get_packages_in_project(project)

            if error is not None:
                print('Ignoring project %s: %s' % (project, error), file=sys.stderr)
                return

            self.queue_checkout_packages(project, packages, primary)

        if not force_simple_checkout:
            if (project not in self.conf.projects or
                not self.conf.projects[project].checkout_devel_projects):
                self.queue_pkgmeta_project(project, primary)
            else:
                # no need to queue a pkgmeta task here: the pkgmeta of the
                # project is automatically downloaded when looking for devel
                # projects
                self._queue_checkout_devel_projects(project, primary)


    def _queue_checkout_devel_projects(self, project, primary = True):
        self.checkout_project_pkgmeta(project)
        pkgmeta_file = os.path.join(self.dest_dir, project, '_pkgmeta')
        if not os.path.exists(pkgmeta_file):
            print('Ignoring devel projects for project %s: no packages metadata' % (project,), file=sys.stderr)
            return

        devel_projects = set()

        try:
            collection = ET.parse(pkgmeta_file).getroot()
            package = collection.find('package')
            if package is None:
                print('Project %s doesn\'t exist.' % (project,), file=sys.stderr)
                return

            for package in collection.findall('package'):
                devel = package.find('devel')
                # "not devel" won't work (probably checks if devel.text is
                # empty)
                if devel == None:
                    continue
                devel_project = devel.get('project')
                if devel_project and devel_project != project:
                    devel_projects.add(devel_project)

        except SyntaxError as e:
            print('Ignoring devel projects for project %s: %s' % (project, e), file=sys.stderr)
            return

        for devel_project in devel_projects:
            self.queue_checkout_project(devel_project, parent = project, primary = primary)


    def remove_checkout_package(self, project, package):
        """ Remove the checkout of a package. """
        path = os.path.join(self.dest_dir, project, package)
        if os.path.exists(path):
            shutil.rmtree(path)

    def remove_checkout_project(self, project):
        """ Remove the checkout of a project. """
        path = os.path.join(self.dest_dir, project)
        if os.path.exists(path):
            shutil.rmtree(path)
0707010000000B000081A40000000000000000000000016548EB8C00002E5A000000000000000000000000000000000000003300000000osc-plugin-collab-0.104+30/server/obs-db/config.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import configparser
import io

from osc import conf as oscconf
from osc import oscerr

""" Example:
[General]
threads = 5

[Defaults]
branches = latest, fallback

[Project openSUSE:Factory]

[Project GNOME:STABLE:2.32]
branches = gnome-2-32
"""

#######################################################################


class ConfigException(Exception):
    pass


#######################################################################


class EasyConfigParser(configparser.ConfigParser):

    def safe_get(self, section, option, default):
        try:
            return self.get(section, option)
        except (configparser.Error, ValueError):
            return default


    def safe_getint(self, section, option, default):
        try:
            return self.getint(section, option)
        except (configparser.Error, ValueError):
            return default


    def safe_getboolean(self, section, option, default):
        try:
            return self.getboolean(section, option)
        except (configparser.Error, ValueError):
            return default


#######################################################################


class ConfigProject:

    default_checkout_devel_projects = False
    default_parent = ''
    default_branches = ''
    _default_branches_helper = []
    default_force_project_parent = False
    default_lenient_delta = False


    @classmethod
    def set_defaults(cls, cp, section):
        """ Set new default settings for projects. """
        cls.default_checkout_devel_projects = cp.safe_getboolean(section, 'checkout-devel-projects', cls.default_checkout_devel_projects)
        cls.default_parent = cp.safe_get(section, 'parent', cls.default_parent)
        cls._default_branches_helper = cp.safe_get(section, 'branches', cls._default_branches_helper)
        cls.default_force_project_parent = cp.safe_getboolean(section, 'force-project-parent', cls.default_force_project_parent)
        cls.default_lenient_delta = cp.safe_getboolean(section, 'lenient-delta', cls.default_lenient_delta)


    def __init__(self, cp, section, name):
        self.name = name

        self.checkout_devel_projects = cp.safe_getboolean(section, 'checkout-devel-projects', self.default_checkout_devel_projects)
        self.parent = cp.safe_get(section, 'parent', self.default_parent)
        self._branches_helper = cp.safe_get(section, 'branches', self._default_branches_helper)
        self.force_project_parent = cp.safe_getboolean(section, 'force-project-parent', self.default_force_project_parent)
        self.lenient_delta = cp.safe_getboolean(section, 'lenient-delta', self.default_lenient_delta)

        if self._branches_helper:
            self.branches = [ branch.strip() for branch in self._branches_helper.split(',') if branch ]
        else:
            # make sure the attribute always exists: _write_project_config()
            # joins it unconditionally
            self.branches = []


#######################################################################


class Config:

    def __init__(self, file = '', use_opensuse = False):
        """ Arguments:
            file -- configuration file to use

        """
        self.filename = file
        self.use_opensuse = use_opensuse
        self.apiurl = None
        self.hermes_baseurl = ''
        self.hermes_feeds = []
        self._hermes_feeds_helper = []

        self.cache_dir = os.path.realpath('cache')
        self.ignore_conf_mtime = False
        self.no_full_check = False
        self.allow_project_catchup = False
        self.threads = 10
        self.sockettimeout = 30
        self.threads_sockettimeout = 30

        self.debug = False
        self.mirror_only_new = False
        self.force_hermes = False
        self.force_upstream = False
        self.force_db = False
        self.force_xml = False
        self.skip_hermes = False
        self.skip_mirror = False
        self.skip_upstream = False
        self.skip_db = False
        self.skip_xml = False

        self.projects = {}

        if use_opensuse:
            self._parse_opensuse()

        self._parse()

        # Workaround to remove warning coming from osc.conf when we don't use
        # SSL checks
        buffer = io.StringIO()
        oldstderr = sys.stderr
        sys.stderr = buffer

        try:
            oscconf.get_config(override_apiurl = self.apiurl)
        except oscerr.NoConfigfile as e:
            sys.stderr = oldstderr
            buffer.close()
            raise ConfigException(e)
        except Exception as e:
            sys.stderr = oldstderr
            buffer.close()
            raise e

        # Workaround to remove warning coming from osc.conf when we don't use
        # SSL checks
        sys.stderr = oldstderr
        self._copy_stderr_without_ssl(buffer)
        buffer.close()

        # Make sure apiurl points to the right value
        self.apiurl = oscconf.config['apiurl']

        # M2Crypto and socket timeout are not friends. See
        # https://bugzilla.osafoundation.org/show_bug.cgi?id=2341
        if ('sslcertck' in oscconf.config['api_host_options'][self.apiurl] and
            oscconf.config['api_host_options'][self.apiurl]['sslcertck']):
            self.sockettimeout = 0

        # obviously has to be done after self.sockettimeout has been set to its
        # final value
        if self.threads_sockettimeout <= 0:
            self.threads_sockettimeout = self.sockettimeout


    def _copy_stderr_without_ssl(self, buffer):
        """ Copy the content of a string io to stderr, except for the SSL warning. """
        buffer.seek(0)
        ignore_empty = False
        while True:
            line = buffer.readline()
            if len(line) == 0:
                break
            if line == 'WARNING: SSL certificate checks disabled. Connection is insecure!\n':
                ignore_empty = True
                continue
            if line == '\n' and ignore_empty:
                ignore_empty = False
                continue
            ignore_empty = False
            print(line[:-1], file=sys.stderr)

    def _get_opensuse_conf_path(self):
        """ Return the path to the openSUSE configuration file. """
        return os.path.join(os.path.dirname(globals()['__file__']), 'data', 'opensuse.conf')

    def _parse_opensuse(self):
        """ Parse the openSUSE configuration file. """
        opensuse_conf = self._get_opensuse_conf_path()
        if os.path.exists(opensuse_conf):
            self._parse_file(opensuse_conf)
        else:
            raise ConfigException('openSUSE configuration file does not exist.')

    def _parse(self):
        """ Parse the configuration file. """
        if not self.filename:
            return

        if not os.path.exists(self.filename):
            raise ConfigException('Configuration file %s does not exist.' % self.filename)

        self._parse_file(self.filename)

    def _parse_file(self, filename):
        cp = EasyConfigParser()
        cp.read(filename)

        self._parse_general(cp)
        self._parse_debug(cp)
        self._parse_default_project(cp)
        self._parse_projects(cp)


    def _parse_general(self, cp):
        """ Parses the section about general settings. """
        if not cp.has_section('General'):
            return

        self.apiurl = cp.safe_get('General', 'apiurl', self.apiurl)
        self.hermes_baseurl = cp.safe_get('General', 'hermes-baseurl', self.hermes_baseurl)
        self._hermes_feeds_helper = cp.safe_get('General', 'hermes-feeds', self._hermes_feeds_helper)
        self.cache_dir = os.path.realpath(cp.safe_get('General', 'cache-dir', self.cache_dir))
        self.ignore_conf_mtime = cp.safe_getboolean('General', 'ignore-conf-mtime', self.ignore_conf_mtime)
        self.no_full_check = cp.safe_getboolean('General', 'no-full-check', self.no_full_check)
        self.allow_project_catchup = cp.safe_getboolean('General', 'allow-project-catchup', self.allow_project_catchup)
        self.threads = cp.safe_getint('General', 'threads', self.threads)
        self.sockettimeout = cp.safe_getint('General', 'sockettimeout', self.sockettimeout)
        self.threads_sockettimeout = cp.safe_getint('General', 'threads-sockettimeout', self.threads_sockettimeout)

        if self._hermes_feeds_helper:
            self.hermes_feeds = [ feed.strip() for feed in self._hermes_feeds_helper.split(',') ]


    def _parse_debug(self, cp):
        """ Parses the section about debug settings. """
        if not cp.has_section('Debug'):
            return

        self.debug = cp.safe_getboolean('Debug', 'debug', self.debug)
        self.mirror_only_new = cp.safe_getboolean('Debug', 'mirror-only-new', self.mirror_only_new)

        self.force_hermes = cp.safe_getboolean('Debug', 'force-hermes', self.force_hermes)
        self.force_upstream = cp.safe_getboolean('Debug', 'force-upstream', self.force_upstream)
        self.force_db = cp.safe_getboolean('Debug', 'force-db', self.force_db)
        self.force_xml = cp.safe_getboolean('Debug', 'force-xml', self.force_xml)

        self.skip_hermes = cp.safe_getboolean('Debug', 'skip-hermes', self.skip_hermes)
        self.skip_mirror = cp.safe_getboolean('Debug', 'skip-mirror', self.skip_mirror)
        self.skip_upstream = cp.safe_getboolean('Debug', 'skip-upstream', self.skip_upstream)
        self.skip_db = cp.safe_getboolean('Debug', 'skip-db', self.skip_db)
        self.skip_xml = cp.safe_getboolean('Debug', 'skip-xml', self.skip_xml)


    def _parse_default_project(self, cp):
        """ Parses the section about default settings for projects. """
        if not cp.has_section('Defaults'):
            return

        ConfigProject.set_defaults(cp, 'Defaults')


    def _parse_projects(self, cp):
        """ Parses the project sections. """
        for section in cp.sections():
            if not section.startswith('Project '):
                continue

            name = section[len('Project '):]
            if name in self.projects:
                raise ConfigException('More than one section for project %s in %s.' % (name, self.filename))

            project = ConfigProject(cp, section, name)
            self.projects[name] = project

    def get_opensuse_mtime(self):
        """ Return the mtime of the openSUSE configuration file. """
        stats = os.stat(self._get_opensuse_conf_path())
        return stats.st_mtime
0707010000000C000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000002E00000000osc-plugin-collab-0.104+30/server/obs-db/data0707010000000D000081A40000000000000000000000016548EB8C00001537000000000000000000000000000000000000003C00000000osc-plugin-collab-0.104+30/server/obs-db/data/defaults.conf####
# Configuration file.
# All the options are documented here.
####

[General]
####
# General settings, with default values
####
## API URL for the build service. Defaults to the default osc API URL.
# apiurl =
#
## Base URL for Hermes installation matching the build service instance used.
## If no base URL is provided, then more requests will have to be done to the
## server to update the data.
# hermes-baseurl =
#
## List of Hermes feed ids to monitor to know about the changes in the build
## service. Use a ',' to separate the different ids. If no id is provided, then
## more requests will have to be done to the server to update the data.
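## For example (the ids used in opensuse.conf):
##   hermes-feeds = 25545, 25547, 55386, 55387, 55388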
# hermes-feeds =
#
## Where to store all the data that will be created.
# cache-dir = ./cache
#
## Ignore the mtime changes for the configuration file. Usually, when the
## configuration file has a different mtime, we recheck everything. This might
## not be the intended behavior. Note that this does not apply to mtime changes
## to non-user configuration files (like opensuse.conf).
# ignore-conf-mtime = False
#
## Do not do any full check of the mirror checkout. This can be needed if you
## want to always manually handle the full check (instead of having this done
## automatically).
# no-full-check = False
#
## Allow the catchup mechanism to checkout full projects. This is disabled by
## default as this is quite expensive in terms of requests to the build
## service.
# allow-project-catchup = False
#
## Maximum number of threads to use. Set to 1 to disable threads.
# threads = 10
#
## Timeout for sockets (in seconds). Using a long timeout can slow things
## down, especially as the build service sometimes keeps connections hanging
## for no reason. Use 0 to leave the default timeout unchanged.
## Note: because of https://bugzilla.osafoundation.org/show_bug.cgi?id=2341 and
## the fact that osc uses M2Crypto, we can't set a low timeout without
## affecting security. To use this setting, you'll have to set sslcertck=0 for
## the appropriate build server in ~/.oscrc. Else, this setting will be
## ignored.
# sockettimeout = 30
#
## Timeout for sockets used by threads. For technical reasons, we can work
## around the above issue with M2Crypto for the threads checking out the files
## from the build service. Since the timeouts are most likely to happen there,
## having an easy to use workaround makes sense.
## Set to 0 to use sockettimeout.
# threads-sockettimeout = 30

[Debug]
####
# Debug settings, with default values
####
## Provide debug output.
# debug = False
#
## If the mirror step will check/checkout all projects, only process the ones
## that have no checkout at the moment. This is useful after changing the
## configuration to add new projects, if you want a fast update (instead of
## triggering a check for all projects).
# mirror-only-new = False
#
## If the mirror step would check/checkout all projects because of a
## configuration change, then make it use the hermes update. This also applies
## to the db step that would rebuild. This is useful after changing the
## configuration without changing projects, if you want a fast update (instead
## of triggering a check for all projects).
# force-hermes = False
#
## Force full rebuild of the upstream db.
# force-upstream = False
#
## Force full rebuild of the main db.
# force-db = False
#
## Force creation of the xml.
# force-xml = False
#
## Whether to pretend there's no change in hermes or not.
# skip-hermes = False
#
## Whether to skip the mirror step.
# skip-mirror = False
#
## Whether to skip the upstream db step.
# skip-upstream = False
#
## Whether to skip the main db step.
# skip-db = False
#
## Whether to skip the main xml step.
# skip-xml = False


[Defaults]
####
# Settings that will apply to all projects, unless overloaded
####
## Whether or not to also check out devel projects of this project. Note that
## this is not inherited by the devel projects.
# checkout-devel-projects = False
#
## Sets a default parent project, to know where packages should end up. For
## example, the parent project of GNOME:Factory is openSUSE:Factory (even though
## not all packages in GNOME:Factory really exist in openSUSE:Factory). This is
## generally used to know when a link is not set to the right parent, or to
## compare packages to parent packages, even if they're not links.
# parent = 
#
## Which branches to use for upstream versions. Use a ',' to separate the
## different branches.
# branches = 
#
## Whether to ignore the project/package a link points to and always use the
## configured parent project of this project as parent for the packages.
## This is useful for projects that are kept in sync with copypac instead of
## linkpac (and when the devel project links to another parent project).
## For example: parent is openSUSE:Published, but package is
## openSUSE:Devel/test and links to openSUSE:11.1/test
# force-project-parent = False
#
## Whether to ignore changes in .changes, or useless changes in .spec, when
## comparing non-link packages to find a delta.
# lenient-delta = False


####
# To specify a project to analyze (like home:vuntz), create a new section named
# 'Project home:vuntz'. Settings will be inherited from the Defaults section,
# but can be overloaded.
# Note that those settings are inherited by devel projects that will be checked
# out via the checkout-devel-projects option.
# Example:
# [Project home:vuntz]
# lenient-delta = True
####

0707010000000E000081A40000000000000000000000016548EB8C00000828000000000000000000000000000000000000003C00000000osc-plugin-collab-0.104+30/server/obs-db/data/opensuse.conf###
# See defaults.conf for documentation.
###

[General]
# Will result in one feed fetched: https://hermes.opensuse.org/feeds/25545,25547,55386,55387,55388.rdf
hermes-baseurl = https://hermes.opensuse.org/
hermes-feeds = 25545, 25547, 55386, 55387, 55388

[Defaults]
# Warning: when changing this, you'll have to manually update all the
# _obs-db-options files in the cache too to activate the change (unless you
# start a full check of the cache).
branches = latest, cpan, pypi, fallback

[Project openSUSE:Factory]
parent = openSUSE:Factory
checkout-devel-projects = True

[Project GNOME:Factory]
branches = gnome-stable, gnome-stable-extras

# For Factory + 1
[Project GNOME:Next]
branches = gnome-unstable, gnome-unstable-extras

# For Leap 15.4 / SLE15SP4
[Project GNOME:STABLE:41]
branches = gnome-41, gnome-41-extras

# For Leap 15.2 / SLE15SP2
[Project GNOME:STABLE:3.34]
branches = gnome-3.34, gnome-3.34-extras

# For the OpenStack Cloud project
[Project Cloud:OpenStack:Master]
branches = pypi

# For devel:cloverleaf:testing
[Project devel:cloverleaf:testing]

## Discontinued
##
## For 11.1
# [Project GNOME:STABLE:2.26]
# branches = gnome-2.26
#
## For 11.1 and 11.2
#[Project GNOME:STABLE:2.28]
#branches = gnome-2.28
#
## For 11.2 and 11.3
#[Project GNOME:STABLE:2.30]
#branches = gnome-2.30
#
## For 11.3
#[Project GNOME:STABLE:2.32]
#branches = gnome-2.32
#
## Contrib disappeared in June 2012
#[Project openSUSE:Factory:Contrib]
#parent = openSUSE:Factory:Contrib
#
#[Project GNOME:Contrib]
#parent = openSUSE:Factory:Contrib
#
## For 11.4
#[Project GNOME:STABLE:3.0]
#branches = gnome-3.0
#
## For 11.4 (and 12.1?)
#[Project GNOME:STABLE:3.2]
#branches = gnome-3.2
#
## For 12.1
#[Project GNOME:STABLE:3.4]
#branches = gnome-3.4
#
## For 12.2
#[Project GNOME:STABLE:3.6]
#branches = gnome-3.6, gnome-3.6-extras
#
## For 12.3
#[Project GNOME:STABLE:3.8]
#branches = gnome-3.8, gnome-3.8-extras

## For 13.1
#[Project GNOME:STABLE:3.12]
#branches = gnome-3.12, gnome-3.12-extras

## For Leap 42.2
#[Project GNOME:STABLE:3.20]
#branches = gnome-3.20, gnome-3.20-extras

0707010000000F000081A40000000000000000000000016548EB8C00015A0F000000000000000000000000000000000000003500000000osc-plugin-collab-0.104+30/server/obs-db/database.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import operator
import re
import sqlite3

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

import upstream
import util

# This script was originally written for use with autobuild. It has been
# adapted for the build service, but a few features might still need to be
# ported. TODO-BS or FIXME-BS markers indicate them.

# Files to just ignore in the file list of a package.
IGNORE_FILES = [ 'ready', 'MD5SUMS', 'MD5SUMS.meta' ]

# Would be nice to get the list of failed package builds. In autobuild, look
# at /work/built/info/failed/ TODO-BS

# In autobuild, it's easy to get access to the rpmlint reports. Keep this empty
# until we find an easy way to do the same for the build service (maybe just
# parse the output of the build log?)
RPMLINT_ERRORS_PATH = ''
#RPMLINT_ERRORS_PATH = os.path.join(OBS_DISSECTOR_DIR, 'tmp', 'rpmlint')

# Changing this means breaking compatibility with previous db
DB_MAJOR = 4
# Changing this means changing the db while keeping compatibility
# Increase when changing the db. Reset to 0 when changing DB_MAJOR.
DB_MINOR = 0
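
# A consumer of the database would typically compare these constants against
# the version stored in the database file to decide whether it can be reused.
# A minimal sketch of such a check (the 'db_version' table name here is just
# an illustrative assumption, not necessarily what this module creates):
#
#   cursor.execute('SELECT major, minor FROM db_version;')
#   (major, minor) = cursor.fetchone()
#   if major != DB_MAJOR:
#       pass  # incompatible schema: the database has to be rebuilt
#   elif minor != DB_MINOR:
#       pass  # compatible schema change: the database can be migrated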


#######################################################################

class ObsDbException(Exception):
    pass

#######################################################################

class Base:
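    """Common SQL plumbing shared by the database-backed objects below.

    Subclasses set sql_table and implement the various sql_* methods;
    sql_lastid caches the rowid of the row most recently inserted for the
    class (see _sql_update_last_id()).
    """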
    sql_table = 'undefined'
    sql_lastid = -1

    @classmethod
    def sql_setup(cls, cursor):
        pass

    def _sql_update_last_id(self, cursor):
        cursor.execute('''SELECT last_insert_rowid();''')
        self.sql_id = cursor.fetchone()[0]
        self.__class__.sql_lastid = self.sql_id

#######################################################################

class File(Base):
    sql_table = 'file'

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            filename TEXT,
            mtime INTEGER,
            srcpackage INTEGER
            );''' % cls.sql_table)

    @classmethod
    def sql_get_all(cls, cursor, srcpackage):
        files = []

        cursor.execute('''SELECT * FROM %s WHERE
            srcpackage = ?
            ;''' % cls.sql_table,
            (srcpackage.sql_id,))

        for row in cursor.fetchall():
            file = File(srcpackage, row['filename'], row['mtime'])
            file.sql_id = row['id']
            files.append(file)

        return files

    @classmethod
    def sql_remove_all(cls, cursor, ids):
        if isinstance(ids, list):
            where = ' OR '.join([ 'srcpackage = ?' for i in range(len(ids)) ])
            cursor.execute('''DELETE FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                ids)
        else:
            cursor.execute('''DELETE FROM %s WHERE
                srcpackage = ?
                ;''' % cls.sql_table,
                (ids,))

    def __init__(self, src, name, mtime):
        self.sql_id = -1

        self.filename = name
        self.src_package = src
        try:
            self.mtime = int(mtime)
        except (TypeError, ValueError) as e:
            print('Cannot parse %s as mtime for %s/%s: %s' % (mtime, src.name, name, e), file=sys.stderr)
            self.mtime = -1

    def sql_add(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when adding file %s.' % (self.src_package.name, self.filename))
        cursor.execute('''INSERT INTO %s VALUES (
            NULL, ?, ?, ?
            );''' % self.sql_table,
            (self.filename, self.mtime, self.src_package.sql_id))
        self._sql_update_last_id(cursor)

    def sql_update_from(self, cursor, new_file):
        if self.sql_id < 0:
            raise ObsDbException('File %s of %s used for update does not have a SQL id.' % (self.filename, self.src_package.name))
        cursor.execute('''UPDATE %s SET
            mtime = ?
            WHERE id = ?
            ;''' % self.sql_table,
            (new_file.mtime, self.sql_id))

    def sql_remove(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when removing file %s.' % (self.src_package.name, self.filename))
        cursor.execute('''DELETE FROM %s WHERE
            filename = ? AND
            srcpackage = ?
            ;''' % self.sql_table,
            (self.filename, self.src_package.sql_id))

    def __ne__(self, other):
        if (self.filename != other.filename or
            self.mtime != other.mtime or
            self.src_package.name != other.src_package.name or
            self.src_package.project.name != other.src_package.project.name):
            return True
        return False

    def __eq__(self, other):
        return not self.__ne__(other)

#######################################################################

class Source(Base):
    sql_table = 'source'

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            filename TEXT,
            srcpackage INTEGER,
            nb_in_pack INTEGER
            );''' % cls.sql_table)

    @classmethod
    def sql_get_all(cls, cursor, srcpackage):
        sources = []

        cursor.execute('''SELECT * FROM %s WHERE
            srcpackage = ?
            ;''' % cls.sql_table,
            (srcpackage.sql_id,))

        for row in cursor.fetchall():
            source = Source(srcpackage, row['filename'], row['nb_in_pack'])
            source.sql_id = row['id']
            sources.append(source)

        return sources

    @classmethod
    def sql_remove_all(cls, cursor, ids):
        if isinstance(ids, list):
            where = ' OR '.join([ 'srcpackage = ?' for i in range(len(ids)) ])
            cursor.execute('''DELETE FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                ids)
        else:
            cursor.execute('''DELETE FROM %s WHERE
                srcpackage = ?
                ;''' % cls.sql_table,
                (ids,))

    def __init__(self, src, name, i):
        self.sql_id = -1

        self.filename = name
        self.src_package = src
        self.number = i

    def sql_add(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when adding source %s.' % (self.src_package.name, self.filename))
        cursor.execute('''INSERT INTO %s VALUES (
            NULL, ?, ?, ?
            );''' % self.sql_table,
            (self.filename, self.src_package.sql_id, self.number))
        self._sql_update_last_id(cursor)

    def sql_update_from(self, cursor, new_source):
        if self.sql_id < 0:
            raise ObsDbException('Source %s of %s used for update does not have a SQL id.' % (self.filename, self.src_package.name))
        cursor.execute('''UPDATE %s SET
            nb_in_pack = ?
            WHERE id = ?
            ;''' % self.sql_table,
            (new_source.number, self.sql_id))

    def sql_remove(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when removing source %s.' % (self.src_package.name, self.filename))
        cursor.execute('''DELETE FROM %s WHERE
            filename = ? AND
            srcpackage = ? AND
            nb_in_pack = ?
            ;''' % self.sql_table,
            (self.filename, self.src_package.sql_id, self.number))

    def __ne__(self, other):
        if (self.filename != other.filename or
            self.number != other.number or
            self.src_package.name != other.src_package.name or
            self.src_package.project.name != other.src_package.project.name):
            return True
        return False

    def __eq__(self, other):
        return not self.__ne__(other)

#######################################################################

class Patch(Base):
    sql_table = 'patch'

    # Format of the tag is: "# PATCH-{FIX|FEATURE}-{OPENSUSE|SLED|UPSTREAM} name-of-file.patch bnc#<bugzilla.novell.com number> bgo#<bugzilla.gnome.org number> [email protected] -- this patch..."
    # PATCH-NEEDS-REBASE is also a known tag
    # We remove trailing ':' for tags too...
    re_strip_comment = re.compile(r'^#[#\s]*([\S]*[^:\s]):?\s*(.*)$', re.UNICODE)
    # anything that looks like something.diff or something.patch
    re_get_filename = re.compile(r'^\s*(\S+\.(?:diff|patch))\s*(.*)$')
    # anything that looks like word123 or word#123
    re_get_bug_number = re.compile(r'^\s*([a-zA-Z]+)#?(\d+)\s*(.*)$')
    # anything that looks like a@a
    re_get_email = re.compile(r'^\s*(\S+@\S+)\s*(.*)$')
    # remove "--" if it's leading the string
    re_get_short_descr = re.compile(r'^\s*(?:--\s*)?(.*)$')
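
    # Example of what set_tag() below extracts from a tag line (the values
    # here are purely illustrative):
    #   "# PATCH-FIX-UPSTREAM foo-no-crash.patch bnc#123456 [email protected] -- fix a crash"
    #   -> tag = 'PATCH-FIX-UPSTREAM', tag_filename = 'foo-no-crash.patch',
    #      bnc = 123456, short_descr = 'fix a crash'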

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            filename TEXT,
            srcpackage INTEGER,
            nb_in_pack INTEGER,
            apply_order INTEGER,
            disabled INTEGER,
            tag TEXT,
            tag_filename TEXT,
            short_descr TEXT,
            descr TEXT,
            bnc INTEGER,
            bgo INTEGER,
            bmo INTEGER,
            bln INTEGER,
            brc INTEGER,
            fate INTEGER,
            cve INTEGER
            );''' % cls.sql_table)

    @classmethod
    def sql_get_all(cls, cursor, srcpackage):
        patches = []

        cursor.execute('''SELECT * FROM %s WHERE
            srcpackage = ?
            ;''' % cls.sql_table,
            (srcpackage.sql_id,))

        for row in cursor.fetchall():
            patch = Patch(srcpackage, row['filename'], row['nb_in_pack'], row['disabled'])
            patch.sql_id = row['id']
            patch.apply_order = row['apply_order']
            patch.tag = row['tag']
            patch.tag_filename = row['tag_filename']
            patch.bnc = row['bnc']
            patch.bgo = row['bgo']
            patch.bmo = row['bmo']
            patch.bln = row['bln']
            patch.brc = row['brc']
            patch.fate = row['fate']
            patch.cve = row['cve']
            patch.short_descr = row['short_descr']
            patch.descr = row['descr']
            patches.append(patch)

        return patches

    @classmethod
    def sql_remove_all(cls, cursor, ids):
        if isinstance(ids, list):
            where = ' OR '.join([ 'srcpackage = ?' for i in range(len(ids)) ])
            cursor.execute('''DELETE FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                ids)
        else:
            cursor.execute('''DELETE FROM %s WHERE
                srcpackage = ?
                ;''' % cls.sql_table,
                (ids,))

    def __init__(self, src, name, i, disabled=True):
        self.sql_id = -1

        self.filename = name
        self.number = i
        self.apply_order = -1
        if disabled:
            self.disabled = 1
        else:
            self.disabled = 0
        self.src_package = src
        self.tag = ''
        self.tag_filename = ''
        self.bnc = 0
        self.bgo = 0
        self.bmo = 0
        self.bln = 0
        self.brc = 0
        self.fate = 0
        self.cve = 0
        self.short_descr = ''
#FIXME read the header from the patch itself
        self.descr = ''

    def set_tag(self, tag_line):
        match = Patch.re_strip_comment.match(tag_line)
        if not match:
            return
        self.tag = match.group(1)
        buf = match.group(2)

        match = Patch.re_get_filename.match(buf)
        if match:
            self.tag_filename = match.group(1)
            buf = match.group(2)

        while True:
            match = Patch.re_get_bug_number.match(buf)
            if not match:
                break

            buf = match.group(3)

            if match.group(1) == 'bnc':
                self.bnc = int(match.group(2))
            elif match.group(1) == 'bgo':
                self.bgo = int(match.group(2))
            elif match.group(1) == 'bmo':
                self.bmo = int(match.group(2))
            elif match.group(1) == 'bln':
                self.bln = int(match.group(2))
            elif match.group(1) == 'brc':
                self.brc = int(match.group(2))
            elif match.group(1) == 'fate':
                self.fate = int(match.group(2))
            elif match.group(1) == 'cve':
                self.cve = int(match.group(2))

        match = Patch.re_get_email.match(buf)
        if match:
#FIXME what to do with match.group(1)
            buf = match.group(2)

        match = Patch.re_get_short_descr.match(buf)
        if match:
            self.short_descr = match.group(1)
        else:
            print('Weird error with patch tag analysis on %s' % tag_line, file=sys.stderr)
            self.short_descr = buf

    def set_apply_order(self, order):
        self.apply_order = order

    def set_disabled(self, disabled):
        if disabled:
            self.disabled = 1
        else:
            self.disabled = 0

    def sql_add(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when adding patch %s.' % (self.src_package.name, self.filename))
        cursor.execute('''INSERT INTO %s VALUES (
            NULL, ?, ?, ?, ?, ?,
            ?, ?, ?, ?,
            ?, ?, ?, ?, ?, ?, ?
            );''' % self.sql_table,
            (self.filename, self.src_package.sql_id, self.number, self.apply_order, self.disabled,
             self.tag, self.tag_filename, self.short_descr, self.descr,
             self.bnc, self.bgo, self.bmo, self.bln, self.brc, self.fate, self.cve))
        self._sql_update_last_id(cursor)

    def sql_update_from(self, cursor, new_patch):
        if self.sql_id < 0:
            raise ObsDbException('Patch %s of %s used for update does not have a SQL id.' % (self.filename, self.src_package.name))
        cursor.execute('''UPDATE %s SET
            nb_in_pack = ?,
            apply_order = ?,
            disabled = ?,
            tag = ?,
            tag_filename = ?,
            short_descr = ?,
            descr = ?,
            bnc = ?,
            bgo = ?,
            bmo = ?,
            bln = ?,
            brc = ?,
            fate = ?,
            cve = ?
            WHERE id = ?
            ;''' % self.sql_table,
            (new_patch.number, new_patch.apply_order, new_patch.disabled,
             new_patch.tag, new_patch.tag_filename, new_patch.short_descr, new_patch.descr,
             new_patch.bnc, new_patch.bgo, new_patch.bmo, new_patch.bln, new_patch.brc, new_patch.fate, new_patch.cve,
             self.sql_id))

    def sql_remove(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when removing patch %s.' % (self.src_package.name, self.filename))
        cursor.execute('''DELETE FROM %s WHERE
            filename = ? AND
            srcpackage = ? AND
            nb_in_pack = ?
            ;''' % self.sql_table,
            (self.filename, self.src_package.sql_id, self.number))

    def __ne__(self, other):
        if (self.filename != other.filename or
            self.number != other.number or
            self.apply_order != other.apply_order or
            self.disabled != other.disabled or
            self.tag != other.tag or
            self.tag_filename != other.tag_filename or
            self.bnc != other.bnc or
            self.bgo != other.bgo or
            self.bmo != other.bmo or
            self.bln != other.bln or
            self.brc != other.brc or
            self.fate != other.fate or
            self.cve != other.cve or
            self.short_descr != other.short_descr or
            self.descr != other.descr or
            self.src_package.name != other.src_package.name or
            self.src_package.project.name != other.src_package.project.name):
            return True
        return False

    def __eq__(self, other):
        return not self.__ne__(other)

#######################################################################

class RpmlintReport(Base):
    sql_table = 'rpmlint'
    re_rpmlint = re.compile(r'\s*(.+):\s+(.):\s+(\S+)\s+(\S*)(?:\s+.*)?')
    re_rpmlint_summary = re.compile(r'\s*\d+\s+packages\s+and\s+\d+\s+spec\s*files\s+checked\s*;')
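
    # The report lines matched by re_rpmlint look like (illustrative):
    #   foo.src: W: no-cleaning-of-buildroot %install
    # and the summary line matched by re_rpmlint_summary looks like:
    #   2 packages and 1 specfiles checked; 0 errors, 4 warnings.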

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            srcpackage INTEGER,
            level TEXT,
            type TEXT,
            detail TEXT,
            descr TEXT
            );''' % cls.sql_table)

    @classmethod
    def sql_get_all(cls, cursor, srcpackage):
        rpmlints = []

        cursor.execute('''SELECT * FROM %s WHERE
            srcpackage = ?
            ;''' % cls.sql_table,
            (srcpackage.sql_id,))

        for row in cursor.fetchall():
            rpmlint = RpmlintReport(srcpackage, row['level'], row['type'], row['detail'])
            rpmlint.sql_id = row['id']
            rpmlint.descr = row['descr']
            rpmlints.append(rpmlint)

        return rpmlints

    @classmethod
    def sql_remove_all(cls, cursor, ids):
        if isinstance(ids, list):
            where = ' OR '.join([ 'srcpackage = ?' for i in range(len(ids)) ])
            cursor.execute('''DELETE FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                ids)
        else:
            cursor.execute('''DELETE FROM %s WHERE
                srcpackage = ?
                ;''' % cls.sql_table,
                (ids,))

    @classmethod
    def analyze(cls, srcpackage, filepath):
        rpmlints = []

        file = open(filepath)

        # read everything until we see the rpmlint report header
        while True:
            line = file.readline()
            if line == '':
                break

            # this is not the beginning of the header
            if line[:-1] != 'RPMLINT report:':
                continue

            # we've found the beginning of the header, so let's read the whole
            # header
            position = file.tell()
            line = file.readline()
            # note: we remove spurious spaces because the build service
            # sometimes adds some
            if line[:-1].replace(' ', '') != '===============':
                # oops, this is not what we expected, so go back.
                # (a relative seek with a non-zero offset is not allowed on
                # text files in Python 3, so remember the position and seek
                # to it instead)
                file.seek(position)

            break

        rpmlints_without_descr = []
        descr = None
        separator = True

        # now let's analyze the real important lines
        while True:
            line = file.readline()
            if line == '':
                break

            # empty line: this is either the separator between two series of
            # entries of the same type, or just an empty line.
            # in the former case, this means we'll be able to set the
            # description of the former series and save the series; we just
            # need to be sure we're starting a new series
            if line[:-1] == '':
                separator = True
                continue

            # let's see if this is the end of the rpmlint report, and stop
            # reading if this is the case
            match = cls.re_rpmlint_summary.match(line)
            if match:
                break

            # is this a new entry?
            match = cls.re_rpmlint.match(line)
            if match:
                # we had an old series, so save it
                if separator:
                    if len(rpmlints_without_descr) > 0:
                        for rpmlint in rpmlints_without_descr:
                            rpmlint.descr = descr
                        rpmlints.extend(rpmlints_without_descr)
                        # reset state
                        rpmlints_without_descr = []
                        descr = None
                    separator = False

                package = match.group(1)
                src = package.find('.src:')
                if src > 0:
                    line = package.rstrip()[src + len('.src:'):]
                    try:
                        line = int(line)
                    except ValueError:
                        print('Cannot parse source package line in rpmlint line from %s (%s): %s' % (srcpackage.name, srcpackage.project.name, package), file=sys.stderr)
                        line = None
                else:
                    line = None

                level = match.group(2)
                type = match.group(3)
                detail = match.group(4).strip()
                if line is not None:
                    if detail == '':
                        detail = 'line %d' % line
                    else:
                        detail = detail + ' (line %d)' % line

                rpmlints_without_descr.append(RpmlintReport(srcpackage, level, type, detail))
                continue

            # this is not a new entry and not an empty line, so this is the
            # description for the past few rpmlint entries. This is only
            # expected if we had some entries before
            if len(rpmlints_without_descr) == 0:
                print('Unexpected rpmlint line from %s (%s): %s' % (srcpackage.name, srcpackage.project.name, line[:-1]), file=sys.stderr)
                continue

            if descr:
                descr = descr + ' ' + line[:-1]
            else:
                descr = line[:-1]


        if len(rpmlints_without_descr) > 0:
            rpmlints.extend(rpmlints_without_descr)

        file.close()

        return rpmlints

    def __init__(self, src_package, level, type, detail):
        self.sql_id = -1

        self.src_package = src_package
        self.level = level
        self.type = type
        self.detail = detail
        self.descr = None

    def sql_add(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when adding rpmlint report.' % (self.src_package.name,))
        cursor.execute('''INSERT INTO %s VALUES (
            NULL, ?,
            ?, ?, ?, ?
            );''' % self.sql_table,
            (self.src_package.sql_id,
             self.level, self.type, self.detail, self.descr))
        self._sql_update_last_id(cursor)

    def sql_update_from(self, cursor, new_report):
        raise ObsDbException('Rpmlint reports cannot be updated since they do not change with time (they get added or removed).')

    def sql_remove(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when removing rpmlint report.' % (self.src_package.name,))
        cursor.execute('''DELETE FROM %s WHERE
            srcpackage = ? AND
            level = ? AND
            type = ? AND
            detail = ? AND
            descr = ?
            ;''' % self.sql_table,
            (self.src_package.sql_id, self.level, self.type, self.detail, self.descr))

    def __ne__(self, other):
        if (self.level != other.level or
            self.type != other.type or
            self.detail != other.detail or
            self.descr != other.descr or
            self.src_package.name != other.src_package.name or
            self.src_package.project.name != other.src_package.project.name):
            return True
        return False

    def __eq__(self, other):
        return not self.__ne__(other)

#######################################################################

class Package(Base):
    sql_table = 'package'

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            name TEXT,
            srcpackage INTEGER,
            summary TEXT,
            description TEXT
            );''' % cls.sql_table)

    @classmethod
    def sql_get_all(cls, cursor, srcpackage):
        packages = []

        cursor.execute('''SELECT * FROM %s WHERE
            srcpackage = ?
            ;''' % cls.sql_table,
            (srcpackage.sql_id,))

        for row in cursor.fetchall():
            package = Package(srcpackage, row['name'])
            package.sql_id = row['id']
            package.summary = row['summary']
            package.description = row['description']
            packages.append(package)

        return packages

    @classmethod
    def sql_remove_all(cls, cursor, ids):
        if isinstance(ids, list):
            where = ' OR '.join([ 'srcpackage = ?' for i in range(len(ids)) ])
            cursor.execute('''DELETE FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                ids)
        else:
            cursor.execute('''DELETE FROM %s WHERE
                srcpackage = ?
                ;''' % cls.sql_table,
                (ids,))

    def __init__(self, src, name):
        self.sql_id = -1

        self.name = name
        self.src_package = src
        self.summary = ''
#FIXME we don't parse the descriptions right now
        self.description = ''

    def sql_add(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when adding package %s.' % (self.src_package.name, self.name))
        cursor.execute('''INSERT INTO %s VALUES (
            NULL, ?, ?,
            ?, ?
            );''' % self.sql_table,
            (self.name, self.src_package.sql_id,
             self.summary, self.description))
        self._sql_update_last_id(cursor)

    def sql_update_from(self, cursor, new_package):
        if self.sql_id < 0:
            raise ObsDbException('Package %s of %s used for update does not have a SQL id.' % (self.name, self.src_package.name))
        cursor.execute('''UPDATE %s SET
            summary = ?,
            description = ?
            WHERE id = ?
            ;''' % self.sql_table,
            (new_package.summary, new_package.description, self.sql_id))

    def sql_remove(self, cursor):
        if self.src_package.sql_id == -1:
            raise ObsDbException('No SQL id for %s when removing package %s.' % (self.src_package.name, self.name))
        cursor.execute('''DELETE FROM %s WHERE
            name = ? AND
            srcpackage = ?
            ;''' % self.sql_table,
            (self.name, self.src_package.sql_id))

    def set_summary(self, summary):
        # sqlite3 stores Python 3 str values natively; the old encode/decode
        # dance was a Python 2 leftover and would actually break here, since
        # str.encode() returns bytes
        self.summary = summary

    def set_description(self, description):
        # see comments in set_summary()
        self.description = description

    def __ne__(self, other):
        if (self.name != other.name or
            self.summary != other.summary or
            self.description != other.description or
            self.src_package.name != other.src_package.name or
            self.src_package.project.name != other.src_package.project.name):
            return True
        return False

    def __eq__(self, other):
        return not self.__ne__(other)

#######################################################################

class SrcPackage(Base):
    sql_table = 'srcpackage'

    re_spec_define = re.compile(r'^%define\s+(\S*)\s+(\S*)', re.IGNORECASE)
    re_spec_name = re.compile(r'^Name:\s*(\S*)', re.IGNORECASE)
    re_spec_version = re.compile(r'^Version:\s*(\S*)', re.IGNORECASE)
    re_spec_summary = re.compile(r'^Summary:\s*(.*)', re.IGNORECASE)
    re_spec_source = re.compile(r'^Source(\d*):\s*(\S*)', re.IGNORECASE)
    re_spec_patch = re.compile(r'^((?:#[#\s]*)?)Patch(\d*):\s*(\S*)', re.IGNORECASE)
    re_spec_package = re.compile(r'^%package\s*(\S.*)', re.IGNORECASE)
    re_spec_package2 = re.compile(r'^-n\s*(\S*)', re.IGNORECASE)
    re_spec_lang_package = re.compile(r'^%lang_package', re.IGNORECASE)
    re_spec_prep = re.compile(r'^%prep', re.IGNORECASE)
    re_spec_build = re.compile(r'^%build', re.IGNORECASE)
    re_spec_apply_patch = re.compile(r'^((?:#[#\s]*)?)%patch(\d*)', re.IGNORECASE)
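
    # As an illustration of how these get used below: a line like
    # "Source0: %{name}-%{version}.tar.bz2" is matched by re_spec_source with
    # group(1) = '0' and group(2) the still-macro-using file name, which
    # _analyze_spec() then expands with subst_defines()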

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            name TEXT,
            project INTEGER,
            srcmd5 TEXT,
            version TEXT,
            link_project TEXT,
            link_package TEXT,
            devel_project TEXT,
            devel_package TEXT,
            upstream_name TEXT,
            upstream_version TEXT,
            upstream_url TEXT,
            is_obs_link INTEGER,
            obs_link_has_delta INTEGER,
            obs_error TEXT,
            obs_error_details TEXT
            );''' % cls.sql_table)

    def _sql_fill(self, cursor):
        self.files = File.sql_get_all(cursor, self)
        self.sources = Source.sql_get_all(cursor, self)
        self.patches = Patch.sql_get_all(cursor, self)
        self.rpmlint_reports = RpmlintReport.sql_get_all(cursor, self)
        self.packages = Package.sql_get_all(cursor, self)

    @classmethod
    def _sql_get_from_row(cls, cursor, project, row, recursive = False):
        pkg_object = SrcPackage(row['name'], project)
        pkg_object.sql_id = row['id']
        pkg_object.project = project
        pkg_object.srcmd5 = row['srcmd5']
        pkg_object.version = row['version']
        pkg_object.link_project = row['link_project']
        pkg_object.link_package = row['link_package']
        pkg_object.devel_project = row['devel_project']
        pkg_object.devel_package = row['devel_package']
        pkg_object.upstream_name = row['upstream_name']
        pkg_object.upstream_version = row['upstream_version']
        pkg_object.upstream_url = row['upstream_url']
        pkg_object.is_link = row['is_obs_link'] != 0
        pkg_object.has_delta = row['obs_link_has_delta'] != 0
        pkg_object.error = row['obs_error']
        pkg_object.error_details = row['obs_error_details']

        if recursive:
            pkg_object._sql_fill(cursor)

        return pkg_object

    @classmethod
    def sql_get(cls, cursor, project, name, recursive = False):
        cursor.execute('''SELECT * FROM %s WHERE
            name = ? AND
            project = ?
            ;''' % cls.sql_table,
            (name, project.sql_id))

        rows = cursor.fetchall()
        length = len(rows)

        if length == 0:
            return None
        elif length > 1:
            raise ObsDbException('More than one source package named %s for project %s in database.' % (name, project.name))

        return cls._sql_get_from_row(cursor, project, rows[0], recursive)

    @classmethod
    def sql_get_all(cls, cursor, project, recursive = False):
        srcpackages = []

        cursor.execute('''SELECT * FROM %s WHERE
            project = ?
            ;''' % cls.sql_table,
            (project.sql_id,))

        for row in cursor.fetchall():
            srcpackage = cls._sql_get_from_row(cursor, project, row, False)
            srcpackages.append(srcpackage)

        if recursive:
            # we do a second loop so we can use only one cursor, that shouldn't
            # matter much since the loop is not the slow part
            for srcpackage in srcpackages:
                srcpackage._sql_fill(cursor)

        return srcpackages

    @classmethod
    def sql_remove_all(cls, cursor, project_ids):
        if isinstance(project_ids, list):
            where = ' OR '.join([ 'project = ?' for i in range(len(project_ids)) ])
            cursor.execute('''SELECT id FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                project_ids)
        else:
            cursor.execute('''SELECT id FROM %s WHERE
                project = ?
                ;''' % cls.sql_table,
                (project_ids,))

        ids = [ id for (id,) in cursor.fetchall() ]
        if not ids:
            return

        Package.sql_remove_all(cursor, ids)
        RpmlintReport.sql_remove_all(cursor, ids)
        Source.sql_remove_all(cursor, ids)
        Patch.sql_remove_all(cursor, ids)
        File.sql_remove_all(cursor, ids)

        if isinstance(project_ids, list):
            where = ' OR '.join([ 'project = ?' for i in range(len(project_ids)) ])
            cursor.execute('''DELETE FROM %s WHERE
                %s;''' % (cls.sql_table, where),
                project_ids)
        else:
            cursor.execute('''DELETE FROM %s WHERE
                project = ?
                ;''' % cls.sql_table,
                (project_ids,))

    @classmethod
    def sql_simple_remove(cls, cursor, project, package):
        cursor.execute('''SELECT A.id FROM %s as A, %s as B WHERE
            A.project = B.id AND
            B.name = ? AND
            A.name = ?
            ;''' % (cls.sql_table, Project.sql_table),
            (project, package))

        ids = [ id for (id,) in cursor.fetchall() ]
        if not ids:
            return

        Package.sql_remove_all(cursor, ids)
        RpmlintReport.sql_remove_all(cursor, ids)
        Source.sql_remove_all(cursor, ids)
        Patch.sql_remove_all(cursor, ids)
        File.sql_remove_all(cursor, ids)

        where = ' OR '.join([ 'id = ?' for i in range(len(ids)) ])
        cursor.execute('''DELETE FROM %s WHERE
            %s;''' % (cls.sql_table, where),
            ids)

    def __init__(self, name, project):
        self.sql_id = -1

        self.name = name
        self.project = project
        self.srcmd5 = ''
        self.version = ''

        self.upstream_name = ''
        self.upstream_version = ''
        self.upstream_url = ''

        self.packages = []
        self.sources = []
        self.patches = []
        self.files = []
        self.rpmlint_reports = []

        self.link_project = ''
        self.link_package = ''
        self.devel_project = ''
        self.devel_package = ''

        # not booleans, since sqlite doesn't support this
        self.is_link = 0
        # 1 means link delta, 2 means delta but without link so a human being
        # has to look on how to synchronize this
        self.has_delta = 0
        self.error = ''
        self.error_details = ''

        # the package is a link using the branch mechanism
        self.has_branch = 0
        # there's a local _meta file for this package
        self.has_meta = False

        self._ready_for_sql = False
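        # (read_from_disk() flips this once the object has been filled from
        # the on-disk checkout and can safely be stored in the database)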

    def sql_add(self, cursor):
        if not self._ready_for_sql:
            raise ObsDbException('Source package %s is a shim object, not to be put in database.' % (self.name,))

        if self.project.sql_id == -1:
            raise ObsDbException('No SQL id for %s when adding source package %s.' % (self.project.name, self.name))
        cursor.execute('''INSERT INTO %s VALUES (
            NULL,
            ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?
            );''' % self.sql_table,
            (self.name, self.project.sql_id, self.srcmd5, self.version, self.link_project, self.link_package, self.devel_project, self.devel_package, self.upstream_name, self.upstream_version, self.upstream_url, self.is_link, self.has_delta, self.error, self.error_details))
        self._sql_update_last_id(cursor)

        for package in self.packages:
            package.sql_add(cursor)

        for rpmlint in self.rpmlint_reports:
            rpmlint.sql_add(cursor)

        for source in self.sources:
            source.sql_add(cursor)

        for patch in self.patches:
            patch.sql_add(cursor)

        for file in self.files:
            file.sql_add(cursor)

    def sql_update_from(self, cursor, new_srcpackage):
        if not new_srcpackage._ready_for_sql:
            raise ObsDbException('Source package %s used for update is a shim object, not to be put in database.' % (new_srcpackage.name,))
        if self.sql_id < 0:
            raise ObsDbException('Source package %s used for update does not have a SQL id.' % (self.name,))

        # might be needed by objects like files that we'll add to the database
        # if they were not present before
        new_srcpackage.sql_id = self.sql_id

        # we obviously don't need to update the id, the name or the project
        cursor.execute('''UPDATE %s SET
            srcmd5 = ?,
            version = ?,
            link_project = ?,
            link_package = ?,
            devel_project = ?,
            devel_package = ?,
            upstream_name = ?,
            upstream_version = ?,
            upstream_url = ?,
            is_obs_link = ?,
            obs_link_has_delta = ?,
            obs_error = ?,
            obs_error_details = ?
            WHERE id = ?
            ;''' % self.sql_table,
            (new_srcpackage.srcmd5, new_srcpackage.version, new_srcpackage.link_project, new_srcpackage.link_package, new_srcpackage.devel_project, new_srcpackage.devel_package, new_srcpackage.upstream_name, new_srcpackage.upstream_version, new_srcpackage.upstream_url, new_srcpackage.is_link, new_srcpackage.has_delta, new_srcpackage.error, new_srcpackage.error_details, self.sql_id))

        def pop_first(items):
            try:
                return items.pop(0)
            except IndexError:
                return None

        def update_list(cursor, oldlist, newlist, attr):
            """ Generic function to update list of objects like files, patches, etc.

                This requires that the lists are sortable by an attribute
                (attr) and that __ne__ and sql_update_from methods exists for
                the objects.

            """
            oldlist.sort(key=operator.attrgetter(attr))
            newlist.sort(key=operator.attrgetter(attr))
            # copy the new list to not edit it
            copylist = list(newlist)
            newitem = pop_first(copylist)
            for olditem in oldlist:
                if not newitem:
                    olditem.sql_remove(cursor)
                    continue
                oldattr = getattr(olditem, attr)
                newattr = getattr(newitem, attr)
                if oldattr < newattr:
                    olditem.sql_remove(cursor)
                else:
                    if oldattr > newattr:
                        while newitem and oldattr > newattr:
                            newitem.sql_add(cursor)
                            newitem = pop_first(copylist)
                            if newitem:
                                newattr = getattr(newitem, attr)

                    # not an 'else' since we do another loop above that
                    # can change newattr
                    if oldattr == newattr:
                        if olditem != newitem:
                            olditem.sql_update_from(cursor, newitem)
                        newitem = pop_first(copylist)

            # add remaining items
            while newitem:
                newitem.sql_add(cursor)
                newitem = pop_first(copylist)

        update_list(cursor, self.packages, new_srcpackage.packages, 'name')
        update_list(cursor, self.sources,  new_srcpackage.sources,  'filename')
        update_list(cursor, self.patches,  new_srcpackage.patches,  'filename')
        update_list(cursor, self.files,    new_srcpackage.files,    'filename')

        # Rpmlint warnings can only get added/removed, not updated
        for rpmlint in self.rpmlint_reports:
            if not rpmlint in new_srcpackage.rpmlint_reports:
                rpmlint.sql_remove(cursor)
        for rpmlint in new_srcpackage.rpmlint_reports:
            if not rpmlint in self.rpmlint_reports:
                rpmlint.sql_add(cursor)

    def sql_remove(self, cursor):
        if self.project.sql_id == -1:
            raise ObsDbException('No SQL id for %s when removing source package %s.' % (self.project.name, self.name))

        if self.sql_id == -1:
            cursor.execute('''SELECT id FROM %s WHERE
                name = ? AND
                project = ?
                ;''' % self.sql_table,
                (self.name, self.project.sql_id))
            self.sql_id = cursor.fetchone()[0]

        Package.sql_remove_all(cursor, self.sql_id)
        RpmlintReport.sql_remove_all(cursor, self.sql_id)
        Source.sql_remove_all(cursor, self.sql_id)
        Patch.sql_remove_all(cursor, self.sql_id)
        File.sql_remove_all(cursor, self.sql_id)

        cursor.execute('''DELETE FROM %s WHERE
            id = ?
            ;''' % self.sql_table,
            (self.sql_id,))

    def read_from_disk(self, project_directory, upstream_db):
        srcpackage_dir = os.path.join(project_directory, self.name)

        self._analyze_files(srcpackage_dir)
        self._analyze_specs(srcpackage_dir)
        self._analyze_meta(srcpackage_dir)
        self._get_rpmlint_errors()

        if upstream_db and self.project.branches:
            (self.upstream_name, self.upstream_version, self.upstream_url) = upstream_db.get_upstream_data(self.project.branches, self.name)

        if self.project.parent and self.project.parent != self.project.name and not self.is_link and not self.error:
            self.error = 'not-link'

        self._ready_for_sql = True

    def _analyze_files(self, srcpackage_dir):
        linkfile = os.path.join(srcpackage_dir, '_link')
        if os.path.exists(linkfile):
            self.is_link = 1

            try:
                root = ET.parse(linkfile).getroot()
            except SyntaxError as e:
                print('Cannot parse %s: %s' % (linkfile, e), file=sys.stderr)
            else:
                node = root.find('patches')
                if node is not None:
                    if node.find('delete') is not None or node.find('apply') is not None:
                        self.has_delta = 1
                    if node.find('branch') is not None:
                        self.has_branch = 1

        root = None
        files = os.path.join(srcpackage_dir, '_files-expanded')
        if not os.path.exists(files):
            files = os.path.join(srcpackage_dir, '_files')
        if os.path.exists(files):
            try:
                root = ET.parse(files).getroot()
            except SyntaxError as e:
                print('Cannot parse %s: %s' % (files, e), file=sys.stderr)
            else:
                self.srcmd5 = root.get('srcmd5')
                linkinfo = root.find('linkinfo')
                if linkinfo is not None:
                    link_project = linkinfo.get('project')
                    if link_project:
                        self.link_project = link_project
                    link_package = linkinfo.get('package')
                    if link_package:
                        self.link_package = link_package

                    error = linkinfo.get('error')
                    if error:
                        if error.find('does not exist in project') != -1:
                            self.error = 'not-in-parent'
                        elif error.find('could not apply patch') != -1:
                            self.error = 'need-merge-with-parent'
                        elif error.find('conflict in file') != -1:
                            self.error = 'need-merge-with-parent'
                        else:
                            self.error = 'unknown-error'
                        self.error_details = error

                    if self.error:
                        self.has_delta = 1

                for node in root.findall('entry'):
                    filename = node.get('name')
                    if filename in IGNORE_FILES:
                        continue
                    mtime = node.get('mtime')
                    self.files.append(File(self, filename, mtime))

        # if we want to force the parent to the project parent, then we do it
        # only if the package is a link and there's no error in the link
        # package
        if self.project.force_project_parent and self.is_link and not self.error and (self.link_project not in [ self.project.parent, self.project.name ] or self.link_package != self.name):
            self.is_link = 0
            self.has_delta = 0
            self.link_project = None
            self.link_package = None

        if not self.is_link and root is not None:
            self._compare_raw_files_with_parent(srcpackage_dir, root)

    def _compare_raw_files_with_parent(self, srcpackage_dir, root):
        '''
            Compare the content of two source packages by looking at the
            present files, and their md5sum.
        '''
        if root is None:
            return
        if not self.project.parent or self.project.parent == self.project.name:
            return

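        # the mirror checkout is expected to be laid out as
        # <dir>/<project>/<package>, so the same package in the parent
        # project lives two levels up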
        parent_package_dir = os.path.join(srcpackage_dir, '..', '..', self.project.parent, self.name)
        files = os.path.join(parent_package_dir, '_files-expanded')
        if not os.path.exists(files):
            files = os.path.join(parent_package_dir, '_files')

        if not os.path.exists(files):
            return

        try:
            parent_root = ET.parse(files).getroot()
        except SyntaxError as e:
            print('Cannot parse %s: %s' % (files, e), file=sys.stderr)
            return

        parent_files = {}
        for node in parent_root.findall('entry'):
            filename = node.get('name')
            if filename in IGNORE_FILES:
                continue
            md5 = node.get('md5')
            parent_files[filename] = md5

        for node in root.findall('entry'):
            filename = node.get('name')
            if filename in IGNORE_FILES:
                continue
            md5 = node.get('md5')
            if filename not in parent_files:
                self.has_delta = 2
                break
            elif md5 != parent_files[filename]:
                if self.project.lenient_delta:
                    # we don't really care about .changes here
                    if filename[-8:] == '.changes':
                        continue
                    # for spec files, we try to ignore the irrelevant stuff
                    elif filename[-5:] == '.spec':
                        spec = os.path.join(srcpackage_dir, filename)
                        parent_spec = os.path.join(parent_package_dir, filename)
                        if self._specs_are_different_lenient(spec, parent_spec):
                            self.has_delta = 2
                            break
                else:
                    self.has_delta = 2
                    break
            del parent_files[filename]

    def _specs_are_different_lenient(self, spec_a, spec_b):
        '''
            Compare two spec files, but ignore some useless changes:
             - ignore space changes
             - ignore blank lines
             - ignore comments
             - ignore Release tag
             - ignore %changelog
        '''

        def strip_useless_spaces(s):
            return ' '.join(s.split())

        def get_next_line(file):
            while True:
                line = file.readline()
                if len(line) == 0:
                    return None
                line = line[:-1]
                line = strip_useless_spaces(line)
                if not line:
                    continue
                if line[0] == '#':
                    continue
                if line.startswith('Release:'):
                    continue
                if line == '%changelog':
                    return None
                return line

        if not os.path.exists(spec_a) or not os.path.exists(spec_b):
            return True

        file_a = open(spec_a)
        file_b = open(spec_b)

        diff = False

        while True:
            line_a = get_next_line(file_a)
            line_b = get_next_line(file_b)
            if line_a is None:
                if line_b is not None:
                    diff = True
                break
            if line_b is None:
                diff = True
                break
            if line_a != line_b:
                diff = True
                break

        file_a.close()
        file_b.close()

        return diff

    def _analyze_specs(self, srcpackage_dir):
        # If there's an error, then nothing to do: the package is broken anyway
        if self.is_link and self.error:
            return

        # Only look at one spec file, since the build service works this way.
        # By default, we take the spec file with the same name as the source
        # package; if it doesn't exist, we take the first one.
        bestfile = None
        specname = self.name + '.spec'

        def _name_is_perfect_match(specname, filename):
            # if the file has a prefix, it has to be '_service:.*' to be
            # considered as perfect candidate
            return filename == specname or (filename.startswith('_service:') and filename.endswith(':' + specname))

        for file in self.files:
            if file.filename[-5:] == '.spec':
                if _name_is_perfect_match(specname, file.filename):
                    if not bestfile or bestfile.mtime < file.mtime:
                        bestfile = file
                else:
                    if not bestfile:
                        bestfile = file
                    elif not _name_is_perfect_match(specname, bestfile.filename) and bestfile.mtime < file.mtime:
                        # the current best file does not have a perfect name,
                        # so we just take the best one based on the mtime
                        bestfile = file

        if bestfile:
            self._analyze_spec(os.path.join(srcpackage_dir, bestfile.filename))

    def _analyze_spec(self, filename):
        '''Analyze a spec file and extract the relevant data from there'''
        if not os.path.exists(filename):
            print('Spec file %s of %s/%s does not exist' % (os.path.basename(filename), self.project.name, self.name), file=sys.stderr)
            return

        spec = open(filename)

        current_package = None
        defines = {}
        defines['name'] = self.name

        def subst_defines(s, defines):
            '''Replace macros like %{version} and %{name} in strings. Useful
               for sources and patches '''
            for key in list(defines.keys()):
                if s.find(key) != -1:
                    value = defines[key]
                    s = s.replace('%%{%s}' % key, value)
                    s = s.replace('%%%s' % key, value)
            return s
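
        # e.g. (illustrative): with defines == {'name': 'foo', 'version': '1.0'},
        # subst_defines('%{name}-%{version}.tar.bz2', defines) returns
        # 'foo-1.0.tar.bz2'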

        # to help if Summary is defined before Name
        early_summary = False

        line = 'empty'
        while True:
            # we need to remember the previous line for patch tags
#FIXME: some packages have comments on two lines...
            previous_line = line
            line = spec.readline()
            if line == '':
                break

            match = SrcPackage.re_spec_prep.match(line)
            if match:
                break

            match = SrcPackage.re_spec_define.match(line)
            if match:
                value = subst_defines(match.group(2), defines)
                defines[match.group(1)] = value
                continue

            match = SrcPackage.re_spec_name.match(line)
            if match:
                name = match.group(1)
                defines['name'] = name
                current_package = Package(self, match.group(1))
                if early_summary:
                    # if we had a summary before the name, then use it now
                    current_package.set_summary(early_summary)
                    early_summary = None
                self.packages.append(current_package)
                continue

            match = SrcPackage.re_spec_lang_package.match(line)
            if match:
                current_package = Package(self, defines['name'] + '-lang')
                self.packages.append(current_package)
                continue

            match = SrcPackage.re_spec_package.match(line)
            if match:
                pack_line = subst_defines(match.group(1), defines)
                match = SrcPackage.re_spec_package2.match(pack_line)
                if match:
                    current_package = Package(self, match.group(1))
                else:
                    current_package = Package(self, defines['name'] + '-' + pack_line)
                self.packages.append(current_package)
                continue

            match = SrcPackage.re_spec_version.match(line)
            if match:
                # Ignore version if it's redefined for a second package.
                # Test case: MozillaThunderbird.spec, where the main package
                # has a version, and the enigmail subpackage has another
                # version.
                if self.version and len(self.packages) > 1:
                    continue

                self.version = subst_defines(match.group(1), defines)
                defines['version'] = self.version
                continue

            match = SrcPackage.re_spec_summary.match(line)
            if match:
                if not current_package:
                    # save the summary for later
                    early_summary = match.group(1)
                    continue
                current_package.set_summary(match.group(1))
                continue

            match = SrcPackage.re_spec_source.match(line)
            if match:
                if match.group(1) == '':
                    nb = '0'
                else:
                    nb = match.group(1)
                buf = subst_defines(match.group(2), defines)
                source = Source(self, buf, nb)
                self.sources.append(source)
                continue

            match = SrcPackage.re_spec_patch.match(line)
            if match:
                # we don't need it here: we'll explicitly mark the patches as
                # applied later
                disabled = (match.group(1) != '')
                if match.group(2) == '':
                    nb = '0'
                else:
                    nb = match.group(2)
                buf = subst_defines(match.group(3), defines)
                patch = Patch(self, buf, nb)
                patch.set_tag(previous_line)
                self.patches.append(patch)
                continue

        order = 0
        while True:
            line = spec.readline()
            if line == '':
                break

            match = SrcPackage.re_spec_build.match(line)
            if match:
                break

            match = SrcPackage.re_spec_apply_patch.match(line)
            if match:
                disabled = (match.group(1) != '')
                if match.group(2) == '':
                    nb = '0'
                else:
                    nb = match.group(2)
                for patch in self.patches:
                    if patch.number == nb:
                        patch.set_disabled(disabled)
                        patch.set_apply_order(order)
                        break
                order = order + 1
                continue

        spec.close()

    def _analyze_meta(self, srcpackage_dir):
        meta_file = os.path.join(srcpackage_dir, '_meta')
        if not os.path.exists(meta_file):
            return

        try:
            package = ET.parse(meta_file).getroot()
        except SyntaxError as e:
            print('Cannot parse %s: %s' % (meta_file, e), file=sys.stderr)
            return

        self.has_meta = True

        devel = package.find('devel')
        # "not devel" won't work (probably checks if devel.text is empty)
        if devel == None:
            return

        self.devel_project = devel.get('project', '')
        if not self.devel_project:
            return
        self.devel_package = devel.get('package', '')

    def _get_rpmlint_errors(self):
        if not RPMLINT_ERRORS_PATH:
            return

        filepath = os.path.join(os.sep, RPMLINT_ERRORS_PATH, self.project.name, self.name + '.log')
        if not os.path.exists(filepath):
            return

        self.rpmlint_reports = RpmlintReport.analyze(self, filepath)

#######################################################################

class Project(Base):
    sql_table = 'project'

    @classmethod
    def sql_setup(cls, cursor):
        cursor.execute('''CREATE TABLE %s (
            id INTEGER PRIMARY KEY,
            name TEXT,
            parent TEXT,
            ignore_upstream INTEGER
            );''' % cls.sql_table)

    @classmethod
    def _sql_get_from_row(cls, cursor, row):
        prj_object = Project(row['name'])
        prj_object.sql_id = row['id']
        prj_object.parent = row['parent']
        prj_object.ignore_upstream = row['ignore_upstream'] != 0

        return prj_object

    @classmethod
    def sql_get(cls, cursor, name, recursive = False):
        cursor.execute('''SELECT * FROM %s WHERE
            name = ?
            ;''' % cls.sql_table,
            (name,))

        rows = cursor.fetchall()
        length = len(rows)

        if length == 0:
            return None
        elif length > 1:
            raise ObsDbException('More than one project named %s in database.' % name)

        row = rows[0]

        prj_object = cls._sql_get_from_row(cursor, row)

        if recursive:
            prj_object.srcpackages = SrcPackage.sql_get_all(cursor, prj_object, recursive)

        return prj_object

    @classmethod
    def sql_get_all(cls, cursor, recursive = False):
        projects = []

        cursor.execute('''SELECT * FROM %s;''' % cls.sql_table)

        for row in cursor.fetchall():
            project = cls._sql_get_from_row(cursor, row)
            projects.append(project)

        if recursive:
            # we do a second loop so we can use only one cursor, that shouldn't
            # matter much since the loop is not the slow part
            for project in projects:
                project.srcpackages = SrcPackage.sql_get_all(cursor, project, recursive)

        return projects

    @classmethod
    def sql_simple_remove(cls, cursor, project):
        cursor.execute('''SELECT id FROM %s WHERE
            name = ?
            ;''' % cls.sql_table,
            (project,))

        ids = [ id for (id,) in cursor.fetchall() ]
        if not ids:
            return

        SrcPackage.sql_remove_all(cursor, ids)

        where = ' OR '.join([ 'id = ?' for i in range(len(ids)) ])
        cursor.execute('''DELETE FROM %s WHERE
            %s;''' % (cls.sql_table, where),
            ids)

    def __init__(self, name):
        self.sql_id = -1

        self.name = name
        self.srcpackages = []

        # Various options set for this project
        self.parent = ''
        self.branches = []
        # Should we ignore the project/package a link points to and always use
        # the configured parent project of this project as parent for the
        # packages?
        # This is useful for projects that are kept in sync with copypac
        # instead of linkpac (and when the devel project links to another
        # parent project). Eg: parent is openSUSE:Published, but package is
        # openSUSE:Devel/test and links to openSUSE:11.1/test
        self.force_project_parent = False
        # When comparing non-link packages to find a delta, should we ignore
        # changes in .changes or useless changes in .spec?
        self.lenient_delta = False

        self._ready_for_sql = False

    def sql_add(self, cursor):
        if not self._ready_for_sql:
            raise ObsDbException('Project %s is a shim object, not to be put in database.' % (self.name,))

        cursor.execute('''INSERT INTO %s VALUES (
            NULL, ?, ?, ?
            );''' % self.sql_table,
            (self.name, self.parent, not self.branches))
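        # the third value stored is the ignore_upstream column: a project
        # with no upstream branches configured has its upstream data ignored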
        self._sql_update_last_id(cursor)

        for srcpackage in self.srcpackages:
            srcpackage.sql_add(cursor)

    def sql_remove(self, cursor):
        if self.sql_id == -1:
            cursor.execute('''SELECT id FROM %s WHERE
                name = ?
                ;''' % self.sql_table,
                (self.name,))
            self.sql_id = cursor.fetchone()[0]

        SrcPackage.sql_remove_all(cursor, self.sql_id)

        cursor.execute('''DELETE FROM %s WHERE
            id = ?
            ;''' % self.sql_table,
            (self.sql_id,))

    def _sync_config(self, projects_config, override_project_name = None):
        """
            When override_project_name is not None, it means we are using the
            configuration of the parent project.

        """
        if not projects_config:
            return False

        name = override_project_name or self.name

        if name not in projects_config:
            if not override_project_name and self.parent:
                return self._sync_config(projects_config, override_project_name = self.parent)

            return False

        project_config = projects_config[name]

        if not override_project_name and project_config.parent != self.name:
            self.parent = project_config.parent
        self.branches = project_config.branches
        self.force_project_parent = project_config.force_project_parent
        self.lenient_delta = project_config.lenient_delta

        return True

    def read_config(self, projects_config, parent_directory):
        """ Gets the config option for this project, saved in the _obs-db-options file. """
        # We first try to get the project configuration from the global
        # configuration
        if self._sync_config(projects_config):
            return

        # We failed, so let's use the special configuration cache
        config_file = os.path.join(parent_directory, self.name, '_obs-db-options')

        if not os.path.exists(config_file):
            return

        with open(config_file) as config:
            lines = config.readlines()

        for line in lines:
            line = line.strip()

            if not line or line.startswith('#'):
                continue

            elif line.startswith('parent='):
                parent = line[len('parent='):]
                if parent == self.name:
                    parent = ''
                self.parent = parent

            elif line.startswith('branches='):
                branches = line[len('branches='):]
                if not branches:
                    self.branches = []
                    continue
                self.branches = [ branch for branch in branches.split(',') if branch ]

            elif line.startswith('force-project-parent='):
                force_project_parent = line[len('force-project-parent='):]
                self.force_project_parent = force_project_parent.lower() in [ '1', 'true' ]

            elif line.startswith('lenient-delta='):
                lenient_delta = line[len('lenient-delta='):]
                self.lenient_delta = lenient_delta.lower() in [ '1', 'true' ]

            else:
                raise ObsDbException('Unknown project config option for %s: %s' % (self.name, line))

    def get_meta(self, parent_directory, package_name):
        """ Get the devel package for a specific package. """
        meta_file = os.path.join(parent_directory, self.name, '_pkgmeta')
        if not os.path.exists(meta_file):
            return ('', '')

        try:
            collection = ET.parse(meta_file).getroot()
        except SyntaxError as e:
            print('Cannot parse %s: %s' % (meta_file, e), file=sys.stderr)
            return ('', '')

        for package in collection.findall('package'):
            name = package.get('name')
            if name != package_name:
                continue

            devel = package.find('devel')
            # "not devel" won't work (probably checks if devel.text is empty)
            if devel == None:
                return ('', '')

            devel_project = devel.get('project', '')
            if not devel_project:
                return ('', '')
            devel_package = devel.get('package', '')

            return (devel_project, devel_package)

        return ('', '')

    def _read_meta(self, project_dir):
        meta_devel = {}

        meta_file = os.path.join(project_dir, '_pkgmeta')
        if not os.path.exists(meta_file):
            return meta_devel

        try:
            collection = ET.parse(meta_file).getroot()
        except SyntaxError as e:
            print('Cannot parse %s: %s' % (meta_file, e), file=sys.stderr)
            return meta_devel

        for package in collection.findall('package'):
            name = package.get('name')
            if not name:
                continue

            devel = package.find('devel')
            # "not devel" won't work (probably checks if devel.text is empty)
            if devel == None:
                continue

            devel_project = devel.get('project', '')
            if not devel_project:
                continue
            devel_package = devel.get('package', '')

            meta_devel[name] = (devel_project, devel_package)

        return meta_devel

    def read_from_disk(self, parent_directory, upstream_db):
        """
            Note: read_config() has to be called before.

        """
        project_dir = os.path.join(parent_directory, self.name)
        if not os.path.exists(project_dir):
            return

        meta_devel = self._read_meta(project_dir)

        for file in os.listdir(project_dir):
            if file in ['_pkgmeta']:
                continue

            if not os.path.isdir(os.path.join(project_dir, file)):
                continue

            srcpackage = SrcPackage(file, self)
            srcpackage.read_from_disk(project_dir, upstream_db)

            if not srcpackage.has_meta and srcpackage.name in meta_devel:
                (srcpackage.devel_project, srcpackage.devel_package) = meta_devel[srcpackage.name]

            self.srcpackages.append(srcpackage)

        self._ready_for_sql = True

#######################################################################

class ObsDb:

    def __init__(self, conf, db_dir, mirror_dir, upstream):
        self.conf = conf
        self.db_dir = db_dir
        self.mirror_dir = mirror_dir
        self.upstream = upstream

        self._filename = os.path.join(self.db_dir, 'obs.db')
        self._dbconn = None
        self._cursor = None

    def _debug_print(self, s):
        """ Print s if debug is enabled. """
        if self.conf.debug:
            print('ObsDb: %s' % s)

    def __del__(self):
        # needed for the commit
        self._close_db()

    def get_cursor(self):
        """ Return a cursor to the database. """
        self._open_existing_db_if_necessary()
        return self._dbconn.cursor()

    def exists(self):
        """ Return True if a database already exists. """
        if not os.path.exists(self._filename):
            return False

        try:
            self._open_existing_db_if_necessary()

            # make sure we have the same version of the format, else it's
            # better to start from scratch
            self._cursor.execute('''SELECT major, minor FROM db_version;''')
            (major, minor) = self._cursor.fetchone()
            if major != DB_MAJOR or minor != DB_MINOR:
                return False

            # just check there are some projects there, to be sure it's valid
            self._cursor.execute('''SELECT id FROM %s;''' % Project.sql_table)
            if len(self._cursor.fetchall()) <= 0:
                return False
        except Exception:
            return False

        return True

    def _open_db(self, filename):
        """ Open a database file, and sets up everything. """
        if self._dbconn:
            self._close_db()
        self._dbconn = sqlite3.connect(filename)
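        # sqlite3.Row allows accessing columns by name, which the rest of the
        # code relies on (e.g. row['name'])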
        self._dbconn.row_factory = sqlite3.Row
        # sqlite3.OptimizedUnicode was removed in Python 3.12; on Python 3 it
        # was a mere alias for str anyway
        self._dbconn.text_factory = str
        self._cursor = self._dbconn.cursor()

    def _close_db(self):
        """ Closes the currently open database. """
        if self._cursor:
            self._cursor.close()
            self._cursor = None
        if self._dbconn:
            self._dbconn.commit()
            self._dbconn.close()
            self._dbconn = None

    def _open_existing_db_if_necessary(self):
        """ Opens the database if it's not already opened. """
        if self._dbconn:
            return
        if not os.path.exists(self._filename):
            raise ObsDbException('Database file %s does not exist.' % self._filename)
        self._open_db(self._filename)

    def _create_tables(self):
        self._cursor.execute('''CREATE TABLE db_version (
            major INTEGER,
            minor INTEGER
            );''')
        self._cursor.execute('''INSERT INTO db_version VALUES (
            ?, ?
            );''', (DB_MAJOR, DB_MINOR))

        Project.sql_setup(self._cursor)
        SrcPackage.sql_setup(self._cursor)
        Package.sql_setup(self._cursor)
        Source.sql_setup(self._cursor)
        Patch.sql_setup(self._cursor)
        File.sql_setup(self._cursor)
        RpmlintReport.sql_setup(self._cursor)

        self._dbconn.commit()

    def rebuild(self):
        """ Rebuild the database from scratch. """
        # We rebuild in a temporary file in case there's a bug in the script :-)
        tmpfilename = self._filename + '.new'
        if os.path.exists(tmpfilename):
            os.unlink(tmpfilename)

        util.safe_mkdir_p(self.db_dir)

        self._debug_print('Rebuilding the database')

        try:
            self._open_db(tmpfilename)
            self._create_tables()

            for file in os.listdir(self.mirror_dir):
                if not os.path.isdir(os.path.join(self.mirror_dir, file)):
                    continue
                self.add_project(file)

            self._close_db()
            os.rename(tmpfilename, self._filename)
        except Exception:
            if os.path.exists(tmpfilename):
                os.unlink(tmpfilename)
            raise

    def add_project(self, project):
        """ Add data of all packages from project in the database. """
        self._open_existing_db_if_necessary()

        self._debug_print('Adding project %s' % project)

        prj_object = Project(project)
        prj_object.read_config(self.conf.projects, self.mirror_dir)
        prj_object.read_from_disk(self.mirror_dir, self.upstream)

        prj_object.sql_add(self._cursor)
        # Committing after each project is apparently not needed to keep a
        # low-memory profile, and committing slows things down.
        # self._dbconn.commit()

    def update_project(self, project):
        """ Update data of all packages from project in the database. """
        self._open_existing_db_if_necessary()

        # It's simpler to just remove all packages and add them again
        self.remove_project(project)
        self.add_project(project)

    def remove_project(self, project):
        """ Remove the project from the database. """
        self._open_existing_db_if_necessary()

        self._debug_print('Removing project %s' % project)

        Project.sql_simple_remove(self._cursor, project)

    def _add_package_internal(self, prj_object, package):
        """ Internal helper to add a package. """
        self._debug_print('Adding %s/%s' % (prj_object.name, package))

        project_dir = os.path.join(self.mirror_dir, prj_object.name)
        srcpackage_dir = os.path.join(project_dir, package)
        if not os.path.exists(srcpackage_dir):
            print('Added package %s in %s does not exist in mirror.' % (package, prj_object.name), file=sys.stderr)
            return

        pkg_object = SrcPackage(package, prj_object)
        pkg_object.read_from_disk(project_dir, self.upstream)
        if not pkg_object.has_meta:
            # In theory, this shouldn't be needed since added packages
            # should have a _meta file. Since it's unlikely to happen, it's
            # okay to parse a big project-wide file.
            self._debug_print('No meta during addition of %s/%s' % (prj_object.name, package))
            (pkg_object.devel_project, pkg_object.devel_package) = prj_object.get_meta(self.mirror_dir, package)

        pkg_object.sql_add(self._cursor)

        # Make sure we also have the devel project if we're interested in that
        if pkg_object.has_meta and pkg_object.devel_project and prj_object.name in self.conf.projects and self.conf.projects[prj_object.name].checkout_devel_projects:
            devel_prj_object = Project.sql_get(self._cursor, pkg_object.devel_project)
            if not devel_prj_object:
                self.add_project(pkg_object.devel_project)

    def _update_package_internal(self, prj_object, package, oldpkg_object):
        """ Internal helper to update a package. """
        self._debug_print('Updating %s/%s' % (prj_object.name, package))

        project_dir = os.path.join(self.mirror_dir, prj_object.name)
        srcpackage_dir = os.path.join(project_dir, package)
        if not os.path.exists(srcpackage_dir):
            print('Updated package %s in %s does not exist in mirror.' % (package, prj_object.name), file=sys.stderr)
            return

        update_children = False

        pkg_object = SrcPackage(package, prj_object)
        pkg_object.read_from_disk(project_dir, self.upstream)
        if not pkg_object.has_meta:
            # If the metadata was updated, we should have a _meta file for the
            # package. If this is not the case, then the metadata was not
            # updated, and then it's okay to keep the old metadata (instead of
            # parsing a big project-wide file).
            pkg_object.devel_project = oldpkg_object.devel_project
            pkg_object.devel_package = oldpkg_object.devel_package
        else:
            if (pkg_object.devel_project != oldpkg_object.devel_project or
                pkg_object.devel_package != oldpkg_object.devel_package):
                update_children = True

        oldpkg_object.sql_update_from(self._cursor, pkg_object)

        # If the devel package has changed, then "children" packages might have
        # a different error now. See _not_real_devel_package().
        if update_children:
            self._cursor.execute('''SELECT A.name, B.name
                                    FROM %s AS A, %s AS B
                                    WHERE B.project = A.id AND B.link_project = ? AND (B.link_package = ? OR B.name = ?)
                                    ;''' % (Project.sql_table, SrcPackage.sql_table),
                                    (prj_object.name, package, package))
            children = [ (child_project, child_package) for (child_project, child_package) in self._cursor ]
            for (child_project, child_package) in children:
                self.update_package(child_project, child_package)

        # Make sure we also have the devel project if we're interested in that
        if pkg_object.has_meta and pkg_object.devel_project and prj_object.name in self.conf.projects and self.conf.projects[prj_object.name].checkout_devel_projects:
            self._debug_print('Looking at meta during update of %s/%s' % (prj_object.name, package))
            devel_prj_object = Project.sql_get(self._cursor, pkg_object.devel_project)
            if not devel_prj_object:
                self.add_project(pkg_object.devel_project)

    def add_package(self, project, package):
        """ Add the package data in the database from the mirror. """
        self._open_existing_db_if_necessary()

        self._debug_print('Trying to add/update %s/%s' % (project, package))

        prj_object = Project.sql_get(self._cursor, project)
        if not prj_object:
            self.add_project(project)
            return

        prj_object.read_config(self.conf.projects, self.mirror_dir)

        pkg_object = SrcPackage.sql_get(self._cursor, prj_object, package, True)
        if pkg_object:
            self._update_package_internal(prj_object, package, pkg_object)
        else:
            self._add_package_internal(prj_object, package)

    def update_package(self, project, package):
        """ Update the package data in the database from the mirror. """
        # We actually share the code to be more robust
        self.add_package(project, package)

    def remove_package(self, project, package):
        """ Remove the package from the database. """
        self._open_existing_db_if_necessary()

        self._debug_print('Removing %s/%s' % (project, package))

        SrcPackage.sql_simple_remove(self._cursor, project, package)

    def get_devel_projects(self, project):
        """ Return the list of devel projects used by packages in project. """
        self._open_existing_db_if_necessary()

        self._cursor.execute('''SELECT A.devel_project FROM %s as A, %s AS B
                                WHERE A.project = B.id AND B.name = ?
                                GROUP BY devel_project
                                ;''' % (SrcPackage.sql_table, Project.sql_table),
                                (project,))
        return [ devel_project for (devel_project,) in self._cursor.fetchall() if devel_project ]

    def get_projects(self):
        """ Return the list of projects in the database. """
        self._open_existing_db_if_necessary()

        self._cursor.execute('''SELECT name FROM %s;''' % Project.sql_table)
        return [ name for (name,) in self._cursor.fetchall() ]

    def upstream_changes(self, upstream_mtime):
        """ Updates the upstream data that has changed since last time.
        
            Return a list of projects that have been updated.
        
        """
        branches = self.upstream.get_changed_packages(upstream_mtime)

        if not branches:
            return []

        self._open_existing_db_if_necessary()

        # Get all projects, with their config, and update the necessary
        # packages if needed
        projects = Project.sql_get_all(self._cursor, recursive = False)
        for project in projects:
            project.read_config(self.conf.projects, self.mirror_dir)

        updated_projects = set()

        for project in projects:
            for branch in list(branches.keys()):
                if branch != upstream.MATCH_CHANGE_NAME and branch not in project.branches:
                    continue

                branches_before = []
                if branch in project.branches:
                    branches_before = project.branches[:project.branches.index(branch)]
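                # branches listed first in project.branches take precedence:
                # packages that also exist in one of the earlier branches are
                # skipped below (exists_in_branches), since their upstream
                # data comes from there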

                self._cursor.execute('''SELECT name FROM %s WHERE project = ?;''' % SrcPackage.sql_table, (project.sql_id,))
                srcpackages = [ name for (name,) in self._cursor ]

                # so we're only interested in the intersection of the two sets
                # (in the project, and in the changed entries)
                affected_srcpackages = set(branches[branch]).intersection(srcpackages)

                if not affected_srcpackages:
                    continue

                updated_projects.add(project.name)

                self._debug_print('Upstream changes: %s -- %s' % (project.name, affected_srcpackages))

                for srcpackage in affected_srcpackages:
                    if self.upstream.exists_in_branches(branches_before, srcpackage):
                        continue

                    (upstream_name, upstream_version, upstream_url) = self.upstream.get_upstream_data(project.branches, srcpackage)
                    self._cursor.execute('''UPDATE %s SET
                            upstream_name = ?, upstream_version = ?, upstream_url = ?
                            WHERE name = ? AND project = ?;''' % SrcPackage.sql_table,
                            (upstream_name, upstream_version, upstream_url, srcpackage, project.sql_id))

        return list(updated_projects)

    def get_packages_with_upstream_change(self, upstream_mtime):
        """ Get the list of packages that are affected by upstream changes.

            Return a dictionary mapping project names to dictionaries, which
            in turn map package names to (upstream_version, upstream_url)
            tuples.

        """
        branches = self.upstream.get_changed_packages(upstream_mtime)

        if not branches:
            return {}

        self._open_existing_db_if_necessary()

        # Get all projects, with their config, and update the necessary
        # packages if needed
        projects = Project.sql_get_all(self._cursor, recursive = False)
        for project in projects:
            project.read_config(self.conf.projects, self.mirror_dir)

        result = {}

        for project in projects:
            for branch in list(branches.keys()):
                if branch != upstream.MATCH_CHANGE_NAME and branch not in project.branches:
                    continue

                branches_before = []
                if branch in project.branches:
                    branches_before = project.branches[:project.branches.index(branch)]
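                # same precedence rule as in upstream_changes(): packages
                # existing in an earlier branch are skipped below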

                self._cursor.execute('''SELECT name FROM %s WHERE project = ?;''' % SrcPackage.sql_table, (project.sql_id,))
                srcpackages = [ name for (name,) in self._cursor ]

                # so we're only interested in the intersection of the two sets
                # (in the project, and in the changed entries)
                affected_srcpackages = set(branches[branch]).intersection(srcpackages)

                if not affected_srcpackages:
                    continue

                if project.name not in result:
                    result[project.name] = {}

                self._debug_print('Upstream changes: %s -- %s' % (project.name, affected_srcpackages))

                for srcpackage in affected_srcpackages:
                    if self.upstream.exists_in_branches(branches_before, srcpackage):
                        continue

                    (upstream_name, upstream_version, upstream_url) = self.upstream.get_upstream_data(project.branches, srcpackage)
                    result[project.name][srcpackage] = (upstream_version, upstream_url)

        return result

    def post_analyze(self):
        """
            Do some post-commit analysis on the db, to find new errors now that
            we have all the data.
        """
        self._open_existing_db_if_necessary()

        self._debug_print('Post analysis')

        def _not_link_and_not_in_parent(devel_package_cache, cursor_helper, row):
            """
                Check if this is not a link and if it doesn't exist in the
                potential parent. In that case, the error is that maybe it
                should exist there
            """
            # Note: if the package was changed in any way, we won't have
            # the 'not-link-not-in-parent' error (since it's added only here).
            # So if we have it, it means the package hasn't been updated and is
            # therefore still a link. But the parent might have been created in
            # the meantime, so it's possible to go back to 'not-link'.

            if row['obs_error'] not in [ 'not-link', 'not-link-not-in-parent' ]:
                return False

            project_parent = row['project_parent']
            if not project_parent:
                return False

            try:
                devel_package_cache[project_parent][row['name']]
                error = 'not-link'
            except KeyError:
                error = 'not-link-not-in-parent'

            if row['obs_error'] != error:
                details = ''
                cursor_helper.execute('''UPDATE %s SET obs_error = ?, obs_error_details = ? WHERE id = ?;''' % SrcPackage.sql_table, (error, details, row['id']))
                return True

            return False

        def _not_real_devel_package(devel_package_cache, cursor_helper, row):
            """
                Look if the link package should really exist there (ie, is it
                the devel package of the parent?)
            """
            # Note: the errors created here can disappear when the devel
            # package of the link package changes, without the current package
            # changing. This is handled in _update_package_internal().

            # the errors here are not relevant to toplevel projects (ie,
            # projects without a parent)
            if row['project_parent'] == '':
                return False

            link_project = row['link_project']
            link_package = row['link_package'] or row['name']

            # internal link inside a project (to build another spec file)
            if link_project == row['project']:
                return False

            try:
                (devel_project, devel_package) = devel_package_cache[link_project][link_package]
                if devel_project != row['project'] or devel_package != row['name']:
                    if devel_project:
                        error = 'not-real-devel'
                        details = 'development project is %s' % devel_project
                    else:
                        error = 'parent-without-devel'
                        details = ''
                    cursor_helper.execute('''UPDATE %s SET obs_error = ?, obs_error_details = ? WHERE id = ?;''' % SrcPackage.sql_table, (error, details, row['id']))
                    return True

            except KeyError:
                # this happens when the parent package doesn't exist; link will
                # be broken, so we already have an error
                pass

            return False


        devel_package_cache = {}
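        # maps project name -> package name -> (devel_project, devel_package);
        # this lets the helpers above use plain dict lookups instead of one
        # SQL query per package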
        cursor_helper = self._dbconn.cursor()

        self._cursor.execute('''SELECT name FROM %s;''' % Project.sql_table)
        for row in self._cursor:
            devel_package_cache[row['name']] = {}

        self._cursor.execute('''SELECT A.name, A.devel_project, A.devel_package, B.name AS project FROM %s AS A, %s AS B WHERE A.project = B.id;''' % (SrcPackage.sql_table, Project.sql_table))
        for row in self._cursor:
            devel_package = row['devel_package'] or row['name']
            devel_package_cache[row['project']][row['name']] = (row['devel_project'], devel_package)

        self._cursor.execute('''SELECT A.id, A.name, A.obs_error, A.link_project, A.link_package, B.name AS project, B.parent AS project_parent FROM %s AS A, %s AS B WHERE A.project = B.id;''' % (SrcPackage.sql_table, Project.sql_table))
        for row in self._cursor:
            if _not_link_and_not_in_parent(devel_package_cache, cursor_helper, row):
                continue

            if _not_real_devel_package(devel_package_cache, cursor_helper, row):
                continue

        cursor_helper.close()

07070100000010000081A40000000000000000000000016548EB8C0000482B000000000000000000000000000000000000003300000000osc-plugin-collab-0.104+30/server/obs-db/hermes.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import re
import urllib.parse

import feedparser

#######################################################################


class HermesException(Exception):
    pass


#######################################################################


# Note: subclassing object explicitly is a Python 2 leftover (a new-style
# class was needed there for super())
class HermesEvent(object):

    regexp = None
    raw_type = None

    def __init__(self, id, title, summary):
        self.id = id
        self.project = None
        self.package = None
        self.raw = False

        if self.raw_type:
            if title == 'Notification %s arrived!' % self.raw_type:
                self.raw = True
                for line in summary.split('\n'):
                    if not self.project and line.startswith('   project = '):
                        self.project = line[len('   project = '):]
                    elif not self.package and line.startswith('   package = '):
                        self.package = line[len('   package = '):]


    @classmethod
    def is_type_for_title(cls, title):
        """ Determines if a feed entry belongs to the event class.

            The match is based on the title of the feed entry, that is passed
            through a regular expression.

        """
        if cls.raw_type:
            if title == 'Notification %s arrived!' % cls.raw_type:
                return True

        if not cls.regexp:
            return False

        match = cls.regexp.match(title)
        return match is not None


    def is_project_event(self):
        """ Return True if the event is for a project and not a package. """
        return False


    def is_package_event(self):
        """ Return True if the event is for a package and not a project. """
        return False


#######################################################################


class HermesEventCommit(HermesEvent):

    regexp = re.compile(r'OBS ([^/\s]*)/([^/\s]*) r\d* commited')
    raw_type = 'obs_srcsrv_commit'

    def __init__(self, id, title, summary):
        HermesEvent.__init__(self, id, title, summary)
        if self.raw:
            return

        match = self.regexp.match(title)

        # for some reason, not using str() sometimes makes our requests to
        # the build service that use those variables fail. I have absolutely
        # no idea why. It fails with "X-Error-Info: Request Line did not
        # contain request URI. The request that was received does not appear
        # to be a valid HTTP request. Please verify that your application
        # uses HTTP"
        self.project = str(match.group(1))
        self.package = str(match.group(2))


    def is_package_event(self):
        return True


#######################################################################


class HermesEventProjectDeleted(HermesEvent):

    regexp = re.compile(r'\[obs del\] Project ([^/\s]*) deleted')
    raw_type = 'OBS_SRCSRV_DELETE_PROJECT'

    def __init__(self, id, title, summary):
        HermesEvent.__init__(self, id, title, summary)
        if self.raw:
            return

        match = self.regexp.match(title)

        self.project = str(match.group(1))


    def is_project_event(self):
        return True


#######################################################################


class HermesEventPackageMeta(HermesEvent):

    regexp = re.compile(r'\[obs update\] Package ([^/\s]*) in ([^/\s]*) updated')
    raw_type = 'OBS_SRCSRV_UPDATE_PACKAGE'

    def __init__(self, id, title, summary):
        HermesEvent.__init__(self, id, title, summary)
        if self.raw:
            return

        match = self.regexp.match(title)

        self.project = str(match.group(2))
        self.package = str(match.group(1))


    def is_package_event(self):
        return True


#######################################################################


class HermesEventPackageAdded(HermesEvent):

    regexp = re.compile(r'\[obs new\] New Package ([^/\s]*) ([^/\s]*)')
    # Workaround against buggy messages
    workaround_regexp = re.compile(r'\[obs new\] New Package\s*$')
    raw_type = 'OBS_SRCSRV_CREATE_PACKAGE'

    @classmethod
    def is_type_for_title(cls, title):
        if super(HermesEventPackageAdded, cls).is_type_for_title(title):
            return True
        else:
            match = cls.workaround_regexp.match(title)
            return match is not None

    def __init__(self, id, title, summary):
        HermesEvent.__init__(self, id, title, summary)
        if self.raw:
            return

        match = self.regexp.match(title)

        if match:
            # Hermes previously said "Package $PKG in $PRJ"
            if str(match.group(2)) == 'in':
                raise HermesException('Old format of hermes message detected: %s' % title)

            self.project = str(match.group(2))
            self.package = str(match.group(1))
        else:
            match = self.workaround_regexp.match(title)
            if match is not None:
                self.project = ''
                self.package = ''
            else:
                raise HermesException('Event should not be in PackageAdded: %s' % title)


    def is_package_event(self):
        return True


#######################################################################


class HermesEventPackageDeleted(HermesEvent):

    regexp = re.compile(r'\[obs del\] Package ([^/\s]*) from ([^/\s]*) deleted')
    raw_type = 'OBS_SRCSRV_DELETE_PACKAGE'

    def __init__(self, id, title, summary):
        HermesEvent.__init__(self, id, title, summary)
        if self.raw:
            return

        match = self.regexp.match(title)

        self.project = str(match.group(2))
        self.package = str(match.group(1))


    def is_package_event(self):
        return True


#######################################################################


class HermesReader:

    types = [ HermesEventCommit, HermesEventProjectDeleted, HermesEventPackageMeta, HermesEventPackageAdded, HermesEventPackageDeleted ]


    def __init__(self, last_known_id, base_url, feeds, conf):
        """ Arguments:
            last_known_id -- id of the last known event, so the hermes reader
                             can know where to stop.
            base_url -- the base url for the hermes server.
            feeds -- a list of feed ids. They will be used to get a merged feed
                     from the hermes server.
            conf -- configuration object

        """
        self._events = []
        self.last_known_id = last_known_id

        self._previous_last_known_id = int(last_known_id)
        self._conf = conf

        if not base_url or not feeds:
            self._feed = None
            self._debug_print('No defined feed')
        else:
            resource = '/feeds/' + ','.join(feeds) + '.rdf'
            self._feed = urllib.parse.urljoin(base_url, resource)
            self._debug_print('Feed to be used: %s' % self._feed)

        self._last_parsed_id = -1


    def _debug_print(self, s):
        """ Print s if debug is enabled. """
        if self._conf.debug:
            print('HermesReader: %s' % s)


    def _get_entry_id(self, entry):
        """ Gets the hermes id of the event.
        
            This is an integer that we can compare with other ids.

        """
        entry_id = entry['id']
        id = os.path.basename(entry_id)

        try:
            return int(id)
        except ValueError:
            raise HermesException('Cannot get event id from: %s' % entry_id)


    def _parse_entry(self, id, entry):
        """ Return an event object based on the entry. """
        title = entry['title']

        for type in self.types:
            if type.is_type_for_title(title):
                return type(id, title, entry['summary'])

        # work around some weird hermes bug
        if title in [ 'Notification  arrived!', 'Notification unknown type arrived!' ]:
            return None

        raise HermesException('Cannot get event type from message %d: "%s"' % (id, title))


    def _parse_feed(self, url):
        """ Parses the feed to get events that are somehow relevant.

            This function ignores entries older than the previous last known id.

            Return True if the feed was empty.

        """
        feed = feedparser.parse(url)

        if len(feed['entries']) == 0:
            return True

        for entry in feed['entries']:
            error_encoded = False

            id = self._get_entry_id(entry)
            if id <= self._previous_last_known_id:
                continue
            if id > self._last_parsed_id:
                self._last_parsed_id = id

            try:
                event = self._parse_entry(id, entry)
            except UnicodeEncodeError as e:
                error_encoded = True
                event = None
                print('Cannot convert hermes message %d to str: %s' % (id, e), file=sys.stderr)

            # Note that hermes can be buggy and give events without the proper
            # project/package. If it's '' and not None, then it means it has
            # been changed to something empty (and therefore it's a bug from
            # hermes).
            if (event and
                event.project != '' and
                not (event.is_package_event() and event.package == '')):
                # put the id in the tuple so we can sort the list later
                self._events.append((id, event))
            # in case of UnicodeEncodeError, we already output a message
            elif not error_encoded:
                print('Buggy hermes message %d (%s): "%s".' % (id, entry['updated'], entry['title']), file=sys.stderr)
                print('----------', file=sys.stderr)
                for line in entry['summary'].split('\n'):
                    print('> %s' % line, file=sys.stderr)
                print('----------', file=sys.stderr)

            if id > self.last_known_id:
                self.last_known_id = id

        return False


    def _append_data_to_url(self, url, data):
        """ Append data to the query arguments passed to url. """
        if url.find('?') != -1:
            return '%s&%s' % (url, data)
        else:
            return '%s?%s' % (url, data)


    def fetch_last_known_id(self):
        """ Read the first feed just to get a last known id. """
        self._debug_print('Fetching new last known id')

        # we ignore self._conf.skip_hermes if we don't have a last known id
        # yet: skipping would only harm us by forcing a later check of all
        # projects on the build service, which is expensive
        if self._conf.skip_hermes and self.last_known_id != -1:
            return

        if not self._feed:
            return

        feed = feedparser.parse(self._feed)
        for entry in feed['entries']:
            id = self._get_entry_id(entry)
            if id > self.last_known_id:
                self.last_known_id = id


    def _read_feed(self, feed_url):
        """ Read events from hermes, and populates the events item. """
        self._last_parsed_id = -1
        page = 1
        if self._previous_last_known_id > 0:
            url = self._append_data_to_url(feed_url, 'last_id=%d' % self._previous_last_known_id)
        else:
            raise HermesException('Internal error: trying to parse feeds while there is no last known id')

        if self._conf.skip_hermes:
            return

        while True:
            if page > 100:
                raise HermesException('Parsing too many pages: last parsed id is %d, last known id is %d' % (self._last_parsed_id, self._previous_last_known_id))

            self._debug_print('Parsing %s' % url)

            old_last_parsed_id = self._last_parsed_id

            empty_feed = self._parse_feed(url)
            if empty_feed:
                break
            elif old_last_parsed_id >= self._last_parsed_id:
                # this should never happen: if the feed was not empty, we
                # must have made progress
                raise HermesException('No progress when parsing pages: last parsed id is %d, last known id is %d' % (self._last_parsed_id, self._previous_last_known_id))

            page += 1
            url = self._append_data_to_url(feed_url, 'last_id=%d' % self._last_parsed_id)


    def read(self):
        """ Read events from hermes, and populates the events item. """
        # Make sure we don't append events to some old values
        self._events = []

        if self._feed:
            self._read_feed(self._feed)

        # Sort to make sure events are in the reverse chronological order
        self._events.sort(reverse = True)

        self._debug_print('Number of events: %d' % len(self._events))
        if len(self._events) == 0:
            return

        self._debug_print('Events (reverse sorted): %s' % [ id for (id, event) in self._events ])

        self._strip()

        self._debug_print('Number of events after strip: %d' % len(self._events))


    def _strip(self):
        """ Strips events that we can safely ignore.

            For example, we can ignore multiple commits, or commits that were
            done before a deletion.

        """
        meta_changed = []
        changed = []
        deleted = []
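        # since we walk the events from the most recent to the oldest, these
        # lists record what happened *after* the event being examined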

        new_events = []

        # Note: the event list has the most recent event first
        # FIXME: we should do a first pass in the reverse order to know which
        # packages were added, and then later removed, so we can also strip the
        # remove event below.

        for (id, event) in self._events:
            # Ignore event if the project was deleted after this event
            if (event.project, None) in deleted:
                continue
            # Ignore event if the package was deleted after this event
            if event.package and (event.project, event.package) in deleted:
                continue

            if isinstance(event, HermesEventCommit):
                # Ignore commit event if the package was re-committed
                # afterwards
                if (event.project, event.package) in changed:
                    continue
                changed.append((event.project, event.package))
                new_events.append((id, event))

            elif isinstance(event, HermesEventProjectDeleted):
                deleted.append((event.project, None))
                new_events.append((id, event))

            elif isinstance(event, HermesEventPackageMeta):
                # Ignore meta event if the meta of the package was changed
                # afterwards
                if (event.project, event.package) in meta_changed:
                    continue
                meta_changed.append((event.project, event.package))
                new_events.append((id, event))

            elif isinstance(event, HermesEventPackageAdded):
                # Ignore added event if the package was re-committed
                # afterwards and meta was changed
                if (event.project, event.package) in meta_changed and (event.project, event.package) in changed:
                    continue
                changed.append((event.project, event.package))
                meta_changed.append((event.project, event.package))
                new_events.append((id, event))

            elif isinstance(event, HermesEventPackageDeleted):
                # Ignore deleted event if the package was re-committed
                # afterwards (or meta was changed)
                if (event.project, event.package) in meta_changed:
                    continue
                if (event.project, event.package) in changed:
                    continue
                deleted.append((event.project, event.package))
                new_events.append((id, event))

        self._events = new_events


    def get_events(self, last_known_id = -1, reverse = False):
        """ Return the list of events that are more recent than last_known_id. """
        result = []

        for (id, event) in self._events:
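            # self._events is sorted with the most recent event first (see
            # read()), so we can stop at the first event that is too old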
            if id <= last_known_id:
                break
            result.append(event)

        if reverse:
            result.reverse()

        return result

#######################################################################


def main(args):
    class Conf:
        def __init__(self):
            self.debug = True
            self.skip_hermes = False

    feeds = [ '25545', '25547', '55386', '55387', '55388' ]
    last_known_id = 10011643

    reader = HermesReader(last_known_id, 'https://hermes.opensuse.org/', feeds, Conf())
    reader.read()

    print('Number of events: %d' % len(reader.get_events(2094133)))
    print('Last known event: %d' % reader.last_known_id)


if __name__ == '__main__':
    try:
        main(sys.argv)
    except KeyboardInterrupt:
        pass
07070100000011000081A40000000000000000000000016548EB8C00003745000000000000000000000000000000000000003400000000osc-plugin-collab-0.104+30/server/obs-db/infoxml.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import errno
import filecmp

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

import database
import util


#######################################################################


# Create a global dictionary that will contain the names of the SQL tables,
# for easier use
SQL_TABLES = {}
for attrname in list(database.__dict__.keys()):
    attr = getattr(database, attrname)
    if hasattr(attr, 'sql_table'):
        SQL_TABLES[attrname] = attr.sql_table
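# e.g. SQL_TABLES['Project'] == 'project'; the queries below use this mapping
# through '%(Project)s'-style substitutions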


#######################################################################


class InfoXmlException(Exception):
    pass


#######################################################################


class InfoXml:

    def __init__(self, dest_dir, debug = False):
        self.dest_dir = dest_dir
        self._debug = debug

        self._version_cache = None

    def _debug_print(self, s):
        """ Print s if debug is enabled. """
        if self._debug:
            print('XML: %s' % s)

    def _get_version(self, project, package):
        """ Gets the version of a package, in a safe way. """
        try:
            return self._version_cache[project][package]
        except KeyError:
            return None

    def _get_package_node_from_row(self, row, ignore_upstream, default_parent_project):
        """ Get the XML node for the package defined in row. """
        name = row['name']
        version = row['version']
        link_project = row['link_project']
        link_package = row['link_package']
        devel_project = row['devel_project']
        devel_package = row['devel_package']
        upstream_version = row['upstream_version']
        upstream_url = row['upstream_url']
        is_link = row['is_obs_link']
        has_delta = row['obs_link_has_delta']
        error = row['obs_error']
        error_details = row['obs_error_details']

        parent_version = None
        devel_version = None

        package = ET.Element('package')
        package.set('name', name)

        if link_project:
            if (link_project != default_parent_project) or (link_package and link_package != name):
                node = ET.SubElement(package, 'parent')
                node.set('project', link_project)
                if link_package and link_package != name:
                    node.set('package', link_package)
            parent_version = self._get_version(link_project, link_package or name)
        elif default_parent_project:
            parent_version = self._get_version(default_parent_project, name)

        if devel_project:
            node = ET.SubElement(package, 'devel')
            node.set('project', devel_project)
            if devel_package and devel_package != name:
                node.set('package', devel_package)
            devel_version = self._get_version(devel_project, devel_package or name)

        if version or upstream_version or parent_version or devel_version:
            node = ET.SubElement(package, 'version')
            if version:
                node.set('current', version)
            if upstream_version:
                node.set('upstream', upstream_version)
            if parent_version:
                node.set('parent', parent_version)
            if devel_version:
                node.set('devel', devel_version)

        if upstream_url:
            upstream = ET.SubElement(package, 'upstream')
            node = ET.SubElement(upstream, 'url')
            node.text = upstream_url

        if is_link:
            node = ET.SubElement(package, 'link')
            if has_delta:
                node.set('delta', 'true')
            else:
                node.set('delta', 'false')
        # deep delta (ie, delta in non-link packages)
        elif has_delta:
            node = ET.SubElement(package, 'delta')

        if error:
            node = ET.SubElement(package, 'error')
            node.set('type', error)
            if error_details:
                node.text = error_details

        return package

    def _get_project_node(self, cursor, project):
        """ Get the XML node for project. """
        cursor.execute('''SELECT * FROM %(Project)s WHERE name = ?;''' % SQL_TABLES, (project,))
        row = cursor.fetchone()

        if not row:
            raise InfoXmlException('Non-existing project: %s' % project)

        if project not in self._version_cache:
            raise InfoXmlException('Version cache was not created correctly: %s is not in the cache' % project)

        project_id = row['id']
        parent_project = row['parent']
        ignore_upstream = row['ignore_upstream']

        prj_node = ET.Element('project')
        prj_node.set('name', project)
        if parent_project:
            prj_node.set('parent', parent_project)
        if ignore_upstream:
            prj_node.set('ignore_upstream', 'true')

        should_exist = {}
        cursor.execute('''SELECT A.name AS parent_project, B.name AS parent_package, B.devel_package
                          FROM %(Project)s AS A, %(SrcPackage)s AS B
                          WHERE A.id = B.project AND devel_project = ?
                          ORDER BY A.name, B.name;''' % SQL_TABLES, (project,))
        for row in cursor:
            should_parent_project = row['parent_project']
            should_parent_package = row['parent_package']
            should_devel_package = row['devel_package'] or should_parent_package
            should_exist[should_devel_package] = (should_parent_project, should_parent_package)

        cursor.execute('''SELECT * FROM %(SrcPackage)s
                          WHERE project = ?
                          ORDER BY name;''' % SQL_TABLES, (project_id,))
        for row in cursor:
            pkg_node = self._get_package_node_from_row(row, ignore_upstream, parent_project)
            prj_node.append(pkg_node)
            try:
                del should_exist[row['name']]
            except KeyError:
                pass

        if len(should_exist) > 0:
            missing_node = ET.Element('missing')
            for (should_package_name, (should_parent_project, should_parent_package)) in should_exist.items():
                missing_pkg_node = ET.Element('package')

                missing_pkg_node.set('name', should_package_name)
                missing_pkg_node.set('parent_project', should_parent_project)
                if should_package_name != should_parent_package:
                    missing_pkg_node.set('parent_package', should_parent_package)

                missing_node.append(missing_pkg_node)

            prj_node.append(missing_node)

        return prj_node

    def _create_version_cache(self, cursor, projects = None):
        """ Creates a cache containing version of all packages. """
        # This helps us avoid doing many small SQL queries, which is really
        # slow.
        #
        # The main difference is that we do one SQL query + many hash accesses,
        # vs 2*(total number of packages in the database) SQL queries. On a
        # test run, the difference results in ~1min15s vs ~5s. That's a 15x
        # time win.
        self._version_cache = {}

        if not projects:
            cursor.execute('''SELECT name FROM %(Project)s;''' % SQL_TABLES)
            projects = [ row['name'] for row in cursor ]

        for project in projects:
            self._version_cache[project] = {}

        cursor.execute('''SELECT A.name, A.version, B.name AS project
                          FROM %(SrcPackage)s AS A, %(Project)s AS B
                          WHERE A.project = B.id;''' % SQL_TABLES)

        for row in cursor:
            self._version_cache[row['project']][row['name']] = row['version']

    def _write_xml_for_project(self, cursor, project):
        """ Writes the XML file for a project.

            Note that we don't touch the old file if the result is the same;
            this keeps HTTP caching of the file effective.

        """
        node = self._get_project_node(cursor, project)

        filename = os.path.join(self.dest_dir, project + '.xml')
        tmpfilename = filename + '.tmp'

        tree = ET.ElementTree(node)

        try:
            tree.write(tmpfilename)

            # keep the old file if there's no change (so that web clients
            # don't re-download an unchanged file)
            if os.path.exists(filename):
                if filecmp.cmp(filename, tmpfilename, shallow = False):
                    self._debug_print('XML for %s did not change' % project)
                    os.unlink(tmpfilename)
                    return

            os.rename(tmpfilename, filename)
        except Exception:
            if os.path.exists(tmpfilename):
                os.unlink(tmpfilename)
            raise

    def run(self, cursor, changed_projects = None):
        """ Creates the XML files for all projects.

            changed_projects -- The list of projects for which we need to
                                generate an XML file. "None" means all projects.

        """
        if not cursor:
            raise InfoXmlException('Database needed to create XML files is not available.')

        util.safe_mkdir_p(self.dest_dir)

        cursor.execute('''SELECT name FROM %(Project)s;''' % SQL_TABLES)
        projects = [ row['name'] for row in cursor ]

        self._create_version_cache(cursor, projects)

        if changed_projects is not None:
            # We have a specific list of projects for which we need to create
            # the XML. Note that None and [] don't have the same meaning.
            if not changed_projects:
                return

            # Get the list of projects containing a package which links to a
            # changed project, or which has a devel project that has changed
            where = ' OR '.join([ 'B.link_project = ? OR B.devel_project = ?' for i in range(len(changed_projects)) ])
            where_args = []
            for changed_project in changed_projects:
                where_args.append(changed_project)
                where_args.append(changed_project)
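            # Each changed project is bound twice, matching the pair of
            # placeholders (link_project, devel_project) in the WHERE clause
            # built above.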

            mapping = SQL_TABLES.copy()
            mapping['where'] = where

            cursor.execute('''SELECT A.name FROM %(Project)s as A, %(SrcPackage)s as B
                              WHERE A.id = B.project AND (%(where)s)
                              GROUP BY A.name
                              ;''' % mapping, where_args)

            changed_projects = set(changed_projects)
            for (project,) in cursor:
                changed_projects.add(project)

            projects = changed_projects

        for project in projects:
            self._debug_print('Writing XML for %s' % project)
            self._write_xml_for_project(cursor, project)

    def remove_project(self, project):
        filename = os.path.join(self.dest_dir, project + '.xml')
        if os.path.exists(filename):
            os.unlink(filename)


#######################################################################


def main(args):
    import sqlite3

    if len(args) != 3:
        print('Usage: %s dbfile project' % args[0], file=sys.stderr)
        sys.exit(1)

    filename = args[1]
    project = args[2]

    if not os.path.exists(filename):
        print('%s does not exist.' % filename, file=sys.stderr)
        sys.exit(1)

    try:
        db = sqlite3.connect(filename)
    except sqlite3.OperationalError as e:
        print('Error while opening %s: %s' % (filename, e), file=sys.stderr)
        sys.exit(1)

    db.row_factory = sqlite3.Row
    # sqlite3.OptimizedUnicode is a deprecated alias for str on Python 3
    db.text_factory = str
    cursor = db.cursor()

    info = InfoXml('.', True)

    try:
        info._create_version_cache(cursor)
        node = info._get_project_node(cursor, project)
    except InfoXmlException as e:
        print('Error while creating the XML for %s: %s' % (project, e), file=sys.stderr)
        sys.exit(1)

    tree = ET.ElementTree(node)
    try:
        # pretty_print is lxml-only; tostring() returns bytes, so decode it
        print(ET.tostring(tree, pretty_print = True).decode('utf-8'))
    except TypeError:
        # pretty_print only works with lxml; plain ElementTree needs
        # encoding='unicode' to write str to a text stream
        tree.write(sys.stdout, encoding='unicode')

    cursor.close()
    db.close()


if __name__ == '__main__':
    try:
        main(sys.argv)
    except KeyboardInterrupt:
        pass
    except IOError as e:
        # only ignore broken pipes (e.g. when the output is piped to head)
        if e.errno != errno.EPIPE:
            raise
07070100000012000081ED0000000000000000000000016548EB8C0000071D000000000000000000000000000000000000003000000000osc-plugin-collab-0.104+30/server/obs-db/obs-db#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import sys

import shell

try:
    ret = shell.main(sys.argv)
    sys.exit(ret)
except KeyboardInterrupt:
    pass
07070100000013000081ED0000000000000000000000016548EB8C00000AF1000000000000000000000000000000000000003D00000000osc-plugin-collab-0.104+30/server/obs-db/obs-manual-checkout#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import traceback

import buildservice
import shellutils


def main(args):
    (args, options, conf) = shellutils.get_conf(args)
    if not conf:
        return 1

    if len(args) not in [1, 2]:
        print("Please specify a project or a package.", file=sys.stderr)
        return 1

    project = args[0]
    if len(args) == 2:
        package = args[1]
    else:
        package = None

    if not shellutils.lock_run(conf):
        return 1

    retval = 1

    try:
        mirror_dir = os.path.join(conf.cache_dir, 'obs-mirror')
        obs = buildservice.ObsCheckout(conf, mirror_dir)

        if not package:
            obs.queue_checkout_project(project)
        else:
            obs.queue_checkout_package(project, package)
            obs.queue_checkout_package_meta(project, package)

        obs.run()

        retval = 0
    except Exception:
        traceback.print_exc()

    shellutils.unlock_run(conf)

    return retval


if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
07070100000014000081ED0000000000000000000000016548EB8C00003783000000000000000000000000000000000000004100000000osc-plugin-collab-0.104+30/server/obs-db/obs-upstream-attributes#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import optparse
import socket
import traceback
import urllib.request, urllib.error, urllib.parse

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

from osc import core

import config
import database
import shellutils
import upstream


#######################################################################


class UpstreamAttributesException(Exception):
    pass


#######################################################################


def package_check_attribute_valid_arguments(project, package, namespace, name):
    if not project:
        raise UpstreamAttributesException('Internal error: no project defined')

    if not package:
        raise UpstreamAttributesException('Internal error: no package defined')

    if not namespace:
        raise UpstreamAttributesException('Internal error: no namespace defined')

    if not name:
        raise UpstreamAttributesException('Internal error: no name defined')


#######################################################################


def package_get_attribute(apiurl, project, package, namespace, name, try_again = True):
    package_check_attribute_valid_arguments(project, package, namespace, name)

    attribute_name = '%s:%s' % (namespace, name)
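    # e.g. 'openSUSE:UpstreamVersion'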
    url = core.makeurl(apiurl, ['source', project, package, '_attribute', attribute_name])

    try:
        fin = core.http_GET(url)
    except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
        if isinstance(e, urllib.error.HTTPError) and e.code == 404:
            print('Package %s in project %s doesn\'t exist.' % (package, project), file=sys.stderr)
        elif try_again:
            return package_get_attribute(apiurl, project, package, namespace, name, False)
        else:
            raise UpstreamAttributesException('Cannot look for attribute %s for package %s in project %s: %s' % (attribute_name, package, project, e))

        return None

    try:
        attributes_node = ET.parse(fin).getroot()
    except SyntaxError as e:
        fin.close()
        raise UpstreamAttributesException('Cannot look for attribute %s for package %s in project %s: %s' % (attribute_name, package, project, e))

    fin.close()
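    # The reply parsed below is expected to look like this (illustrative
    # values):
    #   <attributes>
    #     <attribute namespace="openSUSE" name="UpstreamVersion">
    #       <value>3.38.1</value>
    #     </attribute>
    #   </attributes>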

    for attribute_node in attributes_node.findall('attribute'):
        if attribute_node.get('namespace') == namespace and attribute_node.get('name') == name:
            value = []
            for value_node in attribute_node.findall('value'):
                value.append(value_node.text)
            if not value:
                return ''
            elif len(value) == 1:
                return value[0]
            else:
                return value

    return None


#######################################################################


def package_has_attribute(apiurl, project, package, namespace, name):
    return package_get_attribute(apiurl, project, package, namespace, name) is not None


#######################################################################


def package_set_attribute_handle_reply(fin, error_str):
    try:
        node = ET.parse(fin).getroot()
    except SyntaxError as e:
        fin.close()
        raise UpstreamAttributesException('%s: %s' % (error_str, e))

    fin.close()

    if node.get('code') != 'ok':
        try:
            summary = node.find('summary').text
        except AttributeError:
            summary = 'Unknown error'

        try:
            details = node.find('details').text
        except AttributeError:
            details = ''

        if details:
            raise UpstreamAttributesException('%s: %s (%s)' % (error_str, summary, details))
        else:
            raise UpstreamAttributesException('%s: %s' % (error_str, summary))


def get_xml_for_attributes(attributes_values):
    if len(attributes_values) == 0:
        return None

    attributes_node = ET.Element('attributes')

    for (namespace, name, value) in attributes_values:
        attribute_node = ET.SubElement(attributes_node, 'attribute')
        attribute_node.set('namespace', namespace)
        attribute_node.set('name', name)
        value_node = ET.SubElement(attribute_node, 'value')
        value_node.text = value

    return ET.tostring(attributes_node)
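
# Illustrative example of the payload built above (ET.tostring() returns
# bytes on Python 3):
#   get_xml_for_attributes([('openSUSE', 'UpstreamVersion', '3.38.1')])
#   -> b'<attributes><attribute namespace="openSUSE" name="UpstreamVersion">
#      <value>3.38.1</value></attribute></attributes>'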


#######################################################################


def package_unset_attribute(apiurl, project, package, namespace, name, try_again = True):
    package_check_attribute_valid_arguments(project, package, namespace, name)

    attribute_name = '%s:%s' % (namespace, name)
    error_str = 'Cannot unset attribute %s for package %s in project %s' % (attribute_name, package, project)

    if not package_has_attribute(apiurl, project, package, namespace, name):
        return

    url = core.makeurl(apiurl, ['source', project, package, '_attribute', attribute_name])

    try:
        fin = core.http_DELETE(url)
    except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
        if isinstance(e, urllib.error.HTTPError) and e.code == 404:
            print('Package %s in project %s doesn\'t exist.' % (package, project), file=sys.stderr)
        elif try_again:
            package_unset_attribute(apiurl, project, package, namespace, name, False)
        else:
            raise UpstreamAttributesException('%s: %s' % (error_str, e))

        return

    package_set_attribute_handle_reply(fin, error_str)


#######################################################################


def package_set_attributes(apiurl, project, package, attributes, try_again = True):
    for (namespace, name, value) in attributes:
        package_check_attribute_valid_arguments(project, package, namespace, name)
        if value is None:
            raise UpstreamAttributesException('Internal error: no value defined')

    if len(attributes) == 1:
        # namespace/name are set because of the above loop
        attribute_name = '%s:%s' % (namespace, name)
        error_str = 'Cannot set attribute %s for package %s in project %s' % (attribute_name, package, project)
    else:
        error_str = 'Cannot set attributes for package %s in project %s' % (package, project)

    xml = get_xml_for_attributes(attributes)
    url = core.makeurl(apiurl, ['source', project, package, '_attribute'])

    try:
        fin = core.http_POST(url, data = xml)
    except (urllib.error.HTTPError, urllib.error.URLError, socket.error) as e:
        if isinstance(e, urllib.error.HTTPError) and e.code == 404:
            print('Package %s in project %s doesn\'t exist.' % (package, project), file=sys.stderr)
        elif try_again:
            package_set_attributes(apiurl, project, package, attributes, False)
        else:
            raise UpstreamAttributesException('%s: %s' % (error_str, e))

        return

    package_set_attribute_handle_reply(fin, error_str)


def package_set_attribute(apiurl, project, package, namespace, name, value):
    attributes = [ (namespace, name, value) ]
    package_set_attributes(apiurl, project, package, attributes)

#######################################################################


def package_set_upstream_attributes(apiurl, project, package, upstream_version, upstream_url, ignore_empty = False):
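    # When both values are available, send them in a single request; otherwise
    # set or unset each attribute individually (unsetting is skipped when
    # ignore_empty is set, e.g. on the initial run).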
    if upstream_version and upstream_url:
        attributes = [ ('openSUSE', 'UpstreamVersion', upstream_version), ('openSUSE', 'UpstreamTarballURL', upstream_url) ]
        package_set_attributes(apiurl, project, package, attributes)
        return

    if upstream_version:
        package_set_attribute(apiurl, project, package, 'openSUSE', 'UpstreamVersion', upstream_version)
    elif not ignore_empty:
        package_unset_attribute(apiurl, project, package, 'openSUSE', 'UpstreamVersion')

    if upstream_url:
        package_set_attribute(apiurl, project, package, 'openSUSE', 'UpstreamTarballURL', upstream_url)
    elif not ignore_empty:
        package_unset_attribute(apiurl, project, package, 'openSUSE', 'UpstreamTarballURL')


#######################################################################


def run(conf, do_projects = None, initial = False):
    status_file = os.path.join(conf.cache_dir, 'status', 'attributes')
    failed_file = os.path.join(conf.cache_dir, 'status', 'attributes-failed')
    db_dir = os.path.join(conf.cache_dir, 'db')
    mirror_dir = os.path.join(conf.cache_dir, 'obs-mirror')

    status = {}
    status['upstream-mtime'] = -1

    status = shellutils.read_status(status_file, status)

    # Get packages that we had to update before, but where we failed
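    # Each line of the failed file is
    # "project|package|upstream_version|upstream_url", matching what gets
    # written back at the end of this run.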
    old_failed = []
    if os.path.exists(failed_file):
        failed_f = open(failed_file)
        lines = failed_f.readlines()
        failed_f.close()
        for line in lines:
            line = line[:-1]
            try:
                (project, package, upstream_version, upstream_url) = line.split('|', 3)
            except ValueError:
                raise UpstreamAttributesException('Invalid failed attribute line: %s' % line)

            old_failed.append((project, package, upstream_version, upstream_url))

    # Get packages we need to update
    upstreamdb = upstream.UpstreamDb(None, db_dir, conf.debug)
    new_upstream_mtime = upstreamdb.get_mtime()
    db = database.ObsDb(conf, db_dir, mirror_dir, upstreamdb)

    projects = db.get_packages_with_upstream_change(status['upstream-mtime'])

    # close the databases as soon as we don't need them anymore
    del db
    del upstreamdb

    failed = []

    for (project, package, upstream_version, upstream_url) in old_failed:
        try:
            if conf.debug:
                print('UpstreamAttributes: %s/%s' % (project, package))
            package_set_upstream_attributes(conf.apiurl, project, package, upstream_version, upstream_url)
        except UpstreamAttributesException as e:
            print(e, file=sys.stderr)
            failed.append((project, package, upstream_version, upstream_url))

    # Remove the failed file as soon as we know it was handled
    if os.path.exists(failed_file):
        os.unlink(failed_file)

    for (project, packages) in list(projects.items()):
        if do_projects and project not in do_projects:
            continue

        for (package, (upstream_version, upstream_url)) in list(packages.items()):
            try:
                if conf.debug:
                    print('UpstreamAttributes: %s/%s' % (project, package))
                package_set_upstream_attributes(conf.apiurl, project, package, upstream_version, upstream_url, ignore_empty = initial)
            except UpstreamAttributesException as e:
                print(e, file=sys.stderr)
                failed.append((project, package, upstream_version, upstream_url))

    # Save the failed packages for next run
    if len(failed) > 0:
        failed_f = open(failed_file, 'w')
        for (project, package, upstream_version, upstream_url) in failed:
            failed_f.write('|'.join((project, package, upstream_version, upstream_url)) + '\n')
        failed_f.close()

    # Save the status time last (saving it last ensures that everything has
    # been really handled)
    status['upstream-mtime'] = new_upstream_mtime
    shellutils.write_status(status_file, status)


#######################################################################


def main(args):
    parser = optparse.OptionParser()

    parser.add_option('--initial', dest='initial',
                      action='store_true',
                      help='initial setting of attributes (will ignore empty data, instead of deleting attributes)')
    parser.add_option('--project', dest='projects',
                      action='append', default = [],
                      metavar='PROJECT',
                      help='project to work on (default: all)')

    (args, options, conf) = shellutils.get_conf(args, parser)
    if not conf:
        return 1

    if not shellutils.lock_run(conf, 'attributes'):
        return 1

    retval = 1

    try:
        run(conf, options.projects, options.initial)
        retval = 0
    except Exception as e:
        if isinstance(e, (UpstreamAttributesException, shellutils.ShellException, config.ConfigException, database.ObsDbException)):
            print(e, file=sys.stderr)
        else:
            traceback.print_exc()

    shellutils.unlock_run(conf, 'attributes')

    return retval


if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
07070100000015000081A40000000000000000000000016548EB8C00000491000000000000000000000000000000000000003500000000osc-plugin-collab-0.104+30/server/obs-db/osc_copy.py# vim: sw=4 et

# Copyright (C) 2006 Novell Inc.  All rights reserved.
# This program is free software; it may be used, copied, modified
# and distributed under the terms of the GNU General Public Licence,
# either version 2, or version 3 (at your option).

# This file contains copy of some trivial functions from osc that we want to
# use. It is copied here to avoid importing large python modules.

from urllib.parse import urlencode
from urllib.parse import urlsplit, urlunsplit

def makeurl(baseurl, l, query=[]):
    """Given a list of path compoments, construct a complete URL.

    Optional parameters for a query string can be given as a list, as a
    dictionary, or as an already assembled string.
    In case of a dictionary, the parameters will be urlencoded by this
    function. In case of a list they will not be -- this is to stay
    backwards compatible.
    """

    #print 'makeurl:', baseurl, l, query

    if isinstance(query, list):
        query = '&'.join(query)
    elif isinstance(query, dict):
        query = urlencode(query)

    scheme, netloc = urlsplit(baseurl)[0:2]
    return urlunsplit((scheme, netloc, '/'.join(l), query, ''))
07070100000016000081ED0000000000000000000000016548EB8C0000117D000000000000000000000000000000000000002F00000000osc-plugin-collab-0.104+30/server/obs-db/runme#!/bin/sh
# vim: set ts=4 sw=4 et:

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

basedir=`dirname $0`

## Options
# Do the rpmlint stuff
OBS_DO_RPMLINT=0

## Basic setup

CACHE_DIR=./cache
USE_OPENSUSE=
CONFIG_FILE=
LOG_FILE=

usage() {
    echo "Usage: $0 [-o CONF-FILE] [-l LOG-FILE] [-s]"
    echo ""
    echo "Options:"
    echo "   -o CONF-FILE     Use CONF-FILE as configuration file"
    echo "   -l LOG-FILE      Use LOG-FILE to log errors"
    echo "   -s               Use the openSUSE configuration file as a basis"
}

while getopts o:l:sh option; do
    case $option in
    o) CONFIG_FILE=$OPTARG;;
    l) LOG_FILE=$OPTARG;;
    s) USE_OPENSUSE=--opensuse;;
    h|help) usage; exit 0;;
    *) usage; exit 1;;
    esac
done

if test "x$CONFIG_FILE" != "x"; then
    if test ! -f $CONFIG_FILE; then
        echo >&2 "Configuration file $CONFIG_FILE does not exit."
        exit 1
    else
        OBS_OPTIONS_CACHE_DIR=`grep "^ *cache-dir =" $CONFIG_FILE | sed "s/.*= *\(.*\) *$/\1/g" | tail -n 1`
        test "x$OBS_OPTIONS_CACHE_DIR" != "x" && CACHE_DIR=$OBS_OPTIONS_CACHE_DIR
    fi
fi

mkdir -p $CACHE_DIR


##############################################################
# Copy the upstream name / package name match database

mkdir -p $CACHE_DIR/upstream

cmp --quiet $basedir/../upstream/upstream-packages-match.txt $CACHE_DIR/upstream/upstream-packages-match.txt
if test $? -ne 0; then
    cp -a $basedir/../upstream/upstream-packages-match.txt $CACHE_DIR/upstream/
fi


##############################################################
# Get the rpmlint data
# GNOME:Factory only

# We download the rpmlint data. We keep the old version around until we're sure
# the new version is fine.
get_rpmlint () {
    PROJECT=$1
    if test "x$1" = "x"; then
        return
    fi

    if test -f rpmlint.tar.bz2; then
        rm -f rpmlint.tar.bz2
    fi
    wget -q ftp://ftp.coolice.org/rpmlint/$PROJECT/rpmlint.tar.bz2
    if test $? -eq 0; then
        if test -d $PROJECT; then
            mv $PROJECT $PROJECT.old
        fi
        tar jxf rpmlint.tar.bz2
        if test $? -ne 0 -a -d $PROJECT.old; then
            mv $PROJECT.old $PROJECT
        fi
        if test -d $PROJECT.old; then
            rm -rf $PROJECT.old
        fi

        rm -f rpmlint.tar.bz2
    fi
}

if test "x$OBS_DO_RPMLINT" = "x1"; then
    mkdir -p $CACHE_DIR/rpmlint
    pushd $CACHE_DIR/rpmlint > /dev/null
    get_rpmlint openSUSE:Factory
    get_rpmlint GNOME:Factory
    get_rpmlint GNOME:Contrib
    popd > /dev/null
fi


##############################################################
# Check out everything and create the databases

CONFIG_OPTION=
if test "x$CONFIG_FILE" != "x"; then
    CONFIG_OPTION="--config $CONFIG_FILE"
fi

LOG_OPTION=
if test "x$LOG_FILE" != "x"; then
    LOG_OPTION="--log $LOG_FILE"
fi

$basedir/obs-db $CONFIG_OPTION $LOG_OPTION $USE_OPENSUSE
07070100000017000081ED0000000000000000000000016548EB8C00000BDB000000000000000000000000000000000000003A00000000osc-plugin-collab-0.104+30/server/obs-db/runme-attributes#!/bin/sh
# vim: set ts=4 sw=4 et:

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

basedir=`dirname $0`

## Basic setup

CACHE_DIR=./cache
USE_OPENSUSE=
CONFIG_FILE=
LOG_FILE=

usage() {
    echo "Usage: $0 [-o CONF-FILE] [-l LOG-FILE] [-s]"
    echo ""
    echo "Options:"
    echo "   -o CONF-FILE     Use CONF-FILE as configuration file"
    echo "   -l LOG-FILE      Use LOG-FILE to log errors"
    echo "   -s               Use the openSUSE configuration file as a basis"
}

while getopts o:l:sh option; do
    case $option in
    o) CONFIG_FILE=$OPTARG;;
    l) LOG_FILE=$OPTARG;;
    s) USE_OPENSUSE=--opensuse;;
    h|help) usage; exit 0;;
    *) usage; exit 1;;
    esac
done

if test "x$CONFIG_FILE" != "x"; then
    if test ! -f $CONFIG_FILE; then
        echo >&2 "Configuration file $CONFIG_FILE does not exit."
        exit 1
    else
        OBS_OPTIONS_CACHE_DIR=`grep "^ *cache-dir =" $CONFIG_FILE | sed "s/.*= *\(.*\) *$/\1/g" | tail -n 1`
        test "x$OBS_OPTIONS_CACHE_DIR" != "x" && CACHE_DIR=$OBS_OPTIONS_CACHE_DIR
    fi
fi

mkdir -p $CACHE_DIR


##############################################################
# Update attributes in the build service

CONFIG_OPTION=
if test "x$CONFIG_FILE" != "x"; then
    CONFIG_OPTION="--config $CONFIG_FILE"
fi

LOG_OPTION=
if test "x$LOG_FILE" != "x"; then
    LOG_OPTION="--log $LOG_FILE"
fi

$basedir/obs-upstream-attributes $CONFIG_OPTION $LOG_OPTION $USE_OPENSUSE
07070100000018000081A40000000000000000000000016548EB8C00005E55000000000000000000000000000000000000003200000000osc-plugin-collab-0.104+30/server/obs-db/shell.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import errno
import optparse
import socket
import traceback

import buildservice
import config
import database
import hermes
import infoxml
import shellutils
import upstream
import util


#######################################################################


class RunnerException(Exception):
    pass


#######################################################################


class Runner:

    def __init__(self, conf):
        """ Arguments:
            conf -- a config object

        """
        self.conf = conf
        self.hermes = None
        self.obs = None
        self.upstream = None
        self.db = None
        self.xml = None

        self._status_file = os.path.join(self.conf.cache_dir, 'status', 'last')
        self._status_catchup = os.path.join(self.conf.cache_dir, 'status', 'catchup')
        self._mirror_error = os.path.join(self.conf.cache_dir, 'status', 'mirror-error')
        self._mirror_dir = os.path.join(self.conf.cache_dir, 'obs-mirror')
        self._upstream_dir = os.path.join(self.conf.cache_dir, 'upstream')
        self._db_dir = os.path.join(self.conf.cache_dir, 'db')
        self._xml_dir = os.path.join(self.conf.cache_dir, 'xml')

        self._status = {}
        # Last hermes event handled by mirror
        self._status['mirror'] = -1
        # Last hermes event handled by db
        self._status['db'] = -1
        # Last hermes event recorded in xml (it cannot be greater than the db one)
        self._status['xml'] = -1
        # mtime of the configuration that was last known
        self._status['conf-mtime'] = -1
        # mtime of the openSUSE configuration that was last known
        self._status['opensuse-mtime'] = -1
        # mtime of the upstream database
        self._status['upstream-mtime'] = -1

        self._catchup = []


    def _debug_print(self, s):
        """ Print s if debug is enabled. """
        if self.conf.debug:
            print('Main: %s' % s)


    def _read_status(self):
        """ Read the last known status of the script. """
        self._status = shellutils.read_status(self._status_file, self._status)


    def _write_status(self):
        """ Save the last known status of the script. """
        shellutils.write_status(self._status_file, self._status)


    def _setup_catchup(self):
        """ Gets a list of packages that we need to update because of some
        reason: missing hermes message, bug somewhere, etc."""
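        # Each line of the catchup file names either a whole project or a
        # single package, e.g. (illustrative):
        #   GNOME:Factory
        #   openSUSE:Factory/gtk3
        # Project-only lines are only honored when allow_project_catchup is
        # set in the configuration.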
        if not os.path.exists(self._status_catchup):
            return

        file = open(self._status_catchup)
        lines = file.readlines()

        catchup = set()

        for line in lines:
            line = line[:-1]
            s = line.split('/')
            if len(s) not in [1, 2]:
                print('Cannot handle catchup line: %s' % line, file=sys.stderr)
                continue

            if len(s) == 1:
                if not self.conf.allow_project_catchup:
                    print('Cannot handle catchup line: %s (per config, projects are ignored)' % line, file=sys.stderr)
                    continue

                (project,) = s
                catchup.add((project, None))

            elif len(s) == 2:
                (project, package) = s
                if not project:
                    print('Cannot handle catchup line: %s' % line, file=sys.stderr)
                    continue
                if not package and not self.conf.allow_project_catchup:
                    print('Cannot handle catchup line: %s (per config, projects are ignored)' % line, file=sys.stderr)
                    continue

                catchup.add((project, package))

        self._catchup = list(catchup)

        file.close()


    def _empty_catchup(self):
        # do not remove the catchup file if something still needs it on next
        # run
        if self.conf.skip_mirror or self.conf.skip_db or self.conf.skip_xml:
            return

        if os.path.exists(self._status_catchup):
            try:
                os.unlink(self._status_catchup)
            except Exception as e:
                print('Cannot remove catchup file: %s' % e, file=sys.stderr)


    def _had_mirror_error(self, project, package):
        """ Check if we had an error on mirror for this package. """
        if (project, package) in self.obs.errors:
            return True
        # just to be on the safe side, for project checks, we check with both
        # None and '' as package.
        if package is None and (project, '') in self.obs.errors:
            return True
        return False


    def _write_mirror_error(self):
        if len(self.obs.errors) == 0:
            return

        # note that we append
        fout = open(self._mirror_error, 'a')
        for (project, package) in self.obs.errors:
            if package:
                fout.write('%s/%s\n' % (project, package))
            else:
                fout.write('%s\n' % project)
        fout.close()


    def _move_mirror_error_to_catchup(self):
        if not os.path.exists(self._mirror_error):
            return

        # we don't do a simple cp, but copy the content since we want to append
        # if the catchup file already exists

        fin = open(self._mirror_error)
        lines = fin.readlines()
        fin.close()

        fout = open(self._status_catchup, 'a')
        fout.writelines(lines)
        fout.close()

        try:
            os.unlink(self._mirror_error)
        except Exception as e:
            print('Cannot remove mirror-error file: %s' % e, file=sys.stderr)


    def _run_mirror(self, conf_changed):
        if not os.path.exists(self._mirror_dir):
            os.makedirs(self._mirror_dir)

        if self.conf.skip_mirror:
            return

        # keep this boolean expression in sync with the one used for no_full_check
        if not self.conf.force_hermes and (self._status['mirror'] == -1 or conf_changed):
            # we don't know how old our mirror is, or the configuration has
            # changed

            # get a max id from hermes feeds
            self.hermes.fetch_last_known_id()

            # checkout the projects (or look if we need to update them)
            for name in list(self.conf.projects.keys()):
                if self.conf.mirror_only_new:
                    if os.path.exists(os.path.join(self._mirror_dir, name)):
                        continue

                self.obs.queue_checkout_project(name)

        else:
            # update the relevant part of the mirror

            # get events from hermes
            self.hermes.read()

            # reverse to have chronological order
            events = self.hermes.get_events(self._status['mirror'], reverse = True)

            for event in events:
                # ignore events that belong to a project we do not monitor
                # (ie, there's no checkout)
                project_dir = os.path.join(self._mirror_dir, event.project)
                if not os.path.exists(project_dir):
                    continue

                if isinstance(event, hermes.HermesEventCommit):
                    self.obs.queue_checkout_package(event.project, event.package)

                elif isinstance(event, hermes.HermesEventProjectDeleted):
                    # Even if there's a later commit to the same project (which
                    # is unlikely), we wouldn't know which packages are still
                    # relevant, so it's better to remove the project so that we
                    # don't keep nonexistent packages in the database. The
                    # unlikely case will eat a bit more resources, but it's
                    # really unlikely to happen anyway.
                    self.obs.remove_checkout_project(event.project)

                elif isinstance(event, hermes.HermesEventPackageMeta):
                    # Note that the ObsCheckout object will automatically check out
                    # devel projects that have appeared via metadata change, if
                    # necessary.
                    self.obs.queue_checkout_package_meta(event.project, event.package)

                elif isinstance(event, hermes.HermesEventPackageAdded):
                    # The pkgmeta file of the project won't have anything about
                    # this package, so we need to download the metadata too.
                    self.obs.queue_checkout_package(event.project, event.package)
                    self.obs.queue_checkout_package_meta(event.project, event.package)

                elif isinstance(event, hermes.HermesEventPackageDeleted):
                    self.obs.remove_checkout_package(event.project, event.package)

                else:
                    raise RunnerException('Unhandled Hermes event type by mirror: %s' % event.__class__.__name__)

            for (project, package) in self._catchup:
                if package:
                    self.obs.queue_checkout_package(project, package)
                    self.obs.queue_checkout_package_meta(project, package)
                elif self.conf.allow_project_catchup:
                    self.obs.queue_checkout_project(project, force_simple_checkout=True)

        self.obs.run()


    def _run_db(self, conf_changed):
        """ Return if a full rebuild was done, and if anything has been updated. """
        if self.conf.skip_db:
            return (False, False)

        if (self.conf.force_db or not self.db.exists() or
            (not self.conf.force_hermes and (conf_changed or self._status['db'] == -1))):
            # The database doesn't exist, the configuration has changed, or
            # we don't have the whole list of events that have happened since
            # the last database update. So we just rebuild it from scratch.
            self.db.rebuild()

            return (True, True)
        else:
            # update the relevant parts of the db

            # reverse to have chronological order
            events = self.hermes.get_events(self._status['db'], reverse = True)

            if len(events) == 0 and len(self._catchup) == 0:
                return (False, False)

            changed = False

            for event in events:
                # ignore events that belong to a project we do not monitor
                # (ie, there's no checkout)
                project_dir = os.path.join(self._mirror_dir, event.project)
                if not os.path.exists(project_dir):
                    continue

                # do not handle packages that had an issue while mirroring
                if self._had_mirror_error(event.project, event.package):
                    continue

                changed = True

                if isinstance(event, hermes.HermesEventCommit):
                    self.db.update_package(event.project, event.package)

                elif isinstance(event, hermes.HermesEventProjectDeleted):
                    self.db.remove_project(event.project)

                elif isinstance(event, hermes.HermesEventPackageMeta):
                    # Note that the ObsDb object will automatically add the
                    # devel projects to the database, if necessary.
                    self.db.update_package(event.project, event.package)

                elif isinstance(event, hermes.HermesEventPackageAdded):
                    self.db.add_package(event.project, event.package)

                elif isinstance(event, hermes.HermesEventPackageDeleted):
                    self.db.remove_package(event.project, event.package)

                else:
                    raise RunnerException('Unhandled Hermes event type by database: %s' % event.__class__.__name__)

            db_projects = set(self.db.get_projects())

            for (project, package) in self._catchup:
                # do not handle packages that had an issue while mirroring
                if self._had_mirror_error(project, package):
                    continue

                if package:
                    if project not in db_projects:
                        print('Cannot handle %s/%s catchup: project is not part of our analysis' % (project, package), file=sys.stderr)
                        continue

                    changed = True
                    self.db.add_package(project, package)
                elif self.conf.allow_project_catchup:
                    self.db.update_project(project)

            return (False, changed)


    def _run_xml(self, changed_projects = None):
        """ Update XML files.

            changed_projects -- List of projects that we know will need an
                                update

        """
        if self.conf.skip_xml:
            return

        if self.conf.force_xml or self._status['xml'] == -1:
            changed_projects = None
        else:
            # adds projects that have changed, according to hermes

            if changed_projects is None:
                changed_projects = set()
            else:
                changed_projects = set(changed_projects)

            # Order of events does not matter here
            events = self.hermes.get_events(self._status['xml'])

            for event in events:
                # ignore events that belong to a project we do not monitor
                # (ie, there's no checkout)
                project_dir = os.path.join(self._mirror_dir, event.project)
                if not os.path.exists(project_dir):
                    continue

                # do not handle packages that had an issue while mirroring
                if self._had_mirror_error(event.project, event.package):
                    continue

                if isinstance(event, hermes.HermesEventCommit):
                    changed_projects.add(event.project)

                elif isinstance(event, hermes.HermesEventProjectDeleted):
                    # this will have been removed already, as stale data
                    pass

                elif isinstance(event, hermes.HermesEventPackageMeta):
                    changed_projects.add(event.project)

                elif isinstance(event, hermes.HermesEventPackageAdded):
                    changed_projects.add(event.project)

                elif isinstance(event, hermes.HermesEventPackageDeleted):
                    changed_projects.add(event.project)

                else:
                    raise RunnerException('Unhandled Hermes event type by XML generator: %s' % event.__class__.__name__)

            for (project, package) in self._catchup:
                # do not handle packages that had an issue while mirroring
                if self._had_mirror_error(project, package):
                    continue

                if package:
                    changed_projects.add(project)
                elif self.conf.allow_project_catchup:
                    changed_projects.add(project)

        self.xml.run(self.db.get_cursor(), changed_projects)


    def _remove_stale_data(self):
        if self.conf.skip_mirror and self.conf.skip_db and self.conf.skip_xml:
            return

        if self.conf.skip_db and not self.db.exists():
            # If there's no database, but we skip its creation, it's not a bug
            return

        # If one project exists in the database, but it's not an explicitly
        # requested project, nor a devel project that we should have, then we
        # can safely remove it from the mirror and from the database
        requested_projects = list(self.conf.projects.keys())

        needed = []
        for project in requested_projects:
            needed.append(project)
            if self.conf.projects[project].checkout_devel_projects:
                needed.extend(self.db.get_devel_projects(project))
        needed = set(needed)

        db_projects = set(self.db.get_projects())
        unneeded = db_projects.difference(needed)
        for project in unneeded:
            if not self.conf.skip_xml:
                self.xml.remove_project(project)
            if not self.conf.skip_db:
                self.db.remove_project(project)
            if not self.conf.skip_mirror:
                self.obs.remove_checkout_project(project)

        if self.conf.skip_mirror and self.conf.skip_xml:
            return

        # We now have "projects in the db" = needed
        db_projects = needed

        if not self.conf.skip_mirror and os.path.exists(self._mirror_dir):
            # If one project exists in the mirror but not in the db, then it's
            # stale data from the mirror that we can remove.
            mirror_projects = set([ subdir for subdir in os.listdir(self._mirror_dir) if os.path.isdir(os.path.join(self._mirror_dir, subdir)) ])
            unneeded = mirror_projects.difference(db_projects)
            for project in unneeded:
                self.obs.remove_checkout_project(project)

        if not self.conf.skip_xml and os.path.exists(self._xml_dir):
            # If one project exists in the xml but not in the db, then it's
            # stale data that we can remove.
            # strip the '.xml' suffix so the names can be compared to project names
            xml_projects = set([ file[:-len('.xml')] for file in os.listdir(self._xml_dir) if file.endswith('.xml') ])
            unneeded = xml_projects.difference(db_projects)
            for project in unneeded:
                self.xml.remove_project(project)


    def run(self):
        """ Run the various steps of the script."""
        # Get the previous status, and some info about what will be the new one
        self._read_status()

        if self.conf.filename:
            stats = os.stat(self.conf.filename)
            new_conf_mtime = stats.st_mtime
        else:
            new_conf_mtime = -1

        if self.conf.use_opensuse:
            new_opensuse_mtime = self.conf.get_opensuse_mtime()
        else:
            new_opensuse_mtime = -1

        conf_changed = (not self.conf.ignore_conf_mtime and
                        (self._status['conf-mtime'] != new_conf_mtime or
                        self._status['opensuse-mtime'] != new_opensuse_mtime))

        # keep this boolean expression in sync with the one used in _run_mirror
        if self.conf.no_full_check and (self._status['mirror'] == -1 or conf_changed):
            print('Full checkout check needed, but disabled by config.')
            return

        # Set up hermes; it will be called before the mirror update, depending
        # on what we need

        # We need at least what the mirror has, and we might need something a
        # bit older for the database or the xml (note that if we have no status
        # for them, we will just rebuild everything anyway)
        ids = [ self._status['mirror'] ]
        if self._status['db'] != -1:
            ids.append(self._status['db'])
        if self._status['xml'] != -1:
            ids.append(self._status['xml'])
        min_last_known_id = min(ids)

        self.hermes = hermes.HermesReader(min_last_known_id, self.conf.hermes_baseurl, self.conf.hermes_feeds, self.conf)

        self._setup_catchup()

        # Run the mirror update, and make sure to update the status afterwards
        # in case we crash later
        self.obs = buildservice.ObsCheckout(self.conf, self._mirror_dir)
        self._run_mirror(conf_changed)

        if not self.conf.mirror_only_new and not self.conf.skip_mirror:
            # we don't want to lose events if we went to fast mode once
            self._status['mirror'] = self.hermes.last_known_id
        self._write_mirror_error()
        self._write_status()

        # Update/create the upstream database
        self.upstream = upstream.UpstreamDb(self._upstream_dir, self._db_dir, self.conf.debug)
        if not self.conf.skip_upstream:
            self.upstream.update(self.conf.projects, self.conf.force_upstream)
        new_upstream_mtime = self.upstream.get_mtime()

        # Update/create the package database
        self.db = database.ObsDb(self.conf, self._db_dir, self._mirror_dir, self.upstream)
        (db_full_rebuild, db_changed) = self._run_db(conf_changed)

        if not self.conf.mirror_only_new and not self.conf.skip_db:
            # we don't want to lose events if we went to fast mode once
            self._status['db'] = self.hermes.last_known_id

        if not self.conf.skip_db and not self.conf.skip_upstream and not db_full_rebuild:
            # There's no point in looking at the upstream changes if we did a
            # full rebuild anyway
            projects_changed_upstream = self.db.upstream_changes(self._status['upstream-mtime'])
            self._status['upstream-mtime'] = new_upstream_mtime
        else:
            projects_changed_upstream = []

        # Prepare the creation of xml files
        self.xml = infoxml.InfoXml(self._xml_dir, self.conf.debug)

        # Post-analysis to remove stale data, or enhance the database
        self._remove_stale_data()

        if not self.conf.skip_db:
            if db_changed or projects_changed_upstream:
                self.db.post_analyze()
            else:
                self._debug_print('No need to run the post-analysis')

        # Create xml last, after we have all the right data
        if db_full_rebuild:
            # we want to generate all XML files for full rebuilds
            self._run_xml()
        else:
            self._run_xml(projects_changed_upstream)

        if not self.conf.skip_xml:
            # if we didn't skip the xml step, then we are at the same point as
            # the db
            self._status['xml'] = self._status['db']

        self._status['conf-mtime'] = new_conf_mtime
        self._status['opensuse-mtime'] = new_opensuse_mtime

        self._empty_catchup()
        self._move_mirror_error_to_catchup()
        self._write_status()


#######################################################################


def main(args):
    (args, options, conf) = shellutils.get_conf(args)
    if not conf:
        return 1

    if not shellutils.lock_run(conf):
        return 1

    runner = Runner(conf)

    retval = 1

    try:
        runner.run()
        retval = 0
    except (RunnerException, shellutils.ShellException, config.ConfigException, hermes.HermesException, database.ObsDbException, infoxml.InfoXmlException) as e:
        print(e, file=sys.stderr)
    except Exception:
        traceback.print_exc()

    shellutils.unlock_run(conf)

    return retval


if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
07070100000019000081A40000000000000000000000016548EB8C000015CA000000000000000000000000000000000000003700000000osc-plugin-collab-0.104+30/server/obs-db/shellutils.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import errno
import optparse
import socket

import config
import util


#######################################################################


class ShellException(Exception):
    pass


#######################################################################


def get_conf(args, parser = None):
    if not parser:
        parser = optparse.OptionParser()

    parser.add_option('--config', dest='config',
                      help='configuration file to use')
    parser.add_option('--opensuse', dest='opensuse',
                      action='store_true', default=False,
                      help='use the openSUSE config as a basis')
    parser.add_option('--log', dest='log',
                      help='log file to use (default: stderr)')

    (options, args) = parser.parse_args()

    if options.log:
        path = os.path.realpath(options.log)
        util.safe_mkdir_p(os.path.dirname(path))
        sys.stderr = open(options.log, 'a')

    try:
        conf = config.Config(options.config, use_opensuse = options.opensuse)
    except config.ConfigException as e:
        print(e, file=sys.stderr)
        return (args, options, None)

    if conf.sockettimeout > 0:
        # we have a setting for the default socket timeout to not hang forever
        socket.setdefaulttimeout(conf.sockettimeout)

    try:
        os.makedirs(conf.cache_dir)
    except OSError as e:
        if e.errno != errno.EEXIST:
            print('Cannot create cache directory.', file=sys.stderr)
            return (args, options, None)

    return (args, options, conf)


#######################################################################


def read_status(filename, template):
    """ Read the last known status of the script. """
    result = template.copy()

    if not os.path.exists(filename):
        return result

    file = open(filename)
    lines = file.readlines()
    file.close()

    for line in lines:
        line = line[:-1]
        handled = False

        for key in list(result.keys()):
            if line.startswith(key + '='):
                value = line[len(key + '='):]
                try:
                    result[key] = int(value)
                except ValueError:
                    raise ShellException('Cannot parse status value for %s: %s' % (key, value))

                # only flag the line as handled when it matches a known key;
                # otherwise the unknown-line check below can never trigger
                handled = True

        if not handled:
            raise ShellException('Unknown status line: %s' % (line,))

    return result


def write_status(filename, status_dict):
    """ Save the last known status of the script. """
    dirname = os.path.dirname(filename)
    if not os.path.exists(dirname):
        os.makedirs(dirname)

    tmpfilename = filename + '.new'

    # it's always better to have things sorted, since it'll be predictable
    # (so better for human eyes ;-))
    items = list(status_dict.items())
    items.sort()

    file = open(tmpfilename, 'w')
    for (key, value) in items:
        file.write('%s=%d\n' % (key, value))
    file.close()

    os.rename(tmpfilename, filename)
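
# For reference, the status file handled by read_status()/write_status() is a
# flat, sorted list of "key=value" integer pairs, one per line. Illustrative
# content only:
#
#   conf-mtime=1300000000
#   db=12345
#   mirror=12345
#   xml=12345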


#######################################################################


def lock_run(conf, name = None):
    # FIXME: this is racy, we need a real lock file. Or use an atomic operation
    # like mkdir instead
    if name:
        running_file = os.path.join(conf.cache_dir, 'running-' + name)
    else:
        running_file = os.path.join(conf.cache_dir, 'running')

    if os.path.exists(running_file):
        print('Another instance of the script is running.', file=sys.stderr)
        return False

    open(running_file, 'w').write('')

    return True
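

def lock_run_atomic(conf, name = None):
    """ Sketch of a non-racy alternative to lock_run() (see the FIXME above):
        os.open() with O_CREAT | O_EXCL either creates the lock file or fails
        atomically, so the existence check and the creation cannot be
        interleaved by another instance. Not wired into the scripts; kept as
        an illustration of the suggested fix. """
    if name:
        running_file = os.path.join(conf.cache_dir, 'running-' + name)
    else:
        running_file = os.path.join(conf.cache_dir, 'running')

    try:
        fd = os.open(running_file, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except OSError as e:
        if e.errno == errno.EEXIST:
            print('Another instance of the script is running.', file=sys.stderr)
            return False
        raise
    os.close(fd)
    return True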


def unlock_run(conf, name = None):
    if name:
        running_file = os.path.join(conf.cache_dir, 'running-' + name)
    else:
        running_file = os.path.join(conf.cache_dir, 'running')

    os.unlink(running_file)


#######################################################################
0707010000001A000081A40000000000000000000000016548EB8C000055D6000000000000000000000000000000000000003500000000osc-plugin-collab-0.104+30/server/obs-db/upstream.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import re
import sqlite3
import time

from posixpath import join as posixjoin

import util

MATCH_CHANGE_NAME = ''
# FIXME: we hardcode this list of branches since, well, there's no better way to do that :/
BRANCHES_WITHOUT_PKG_MATCH = [ 'fallback', 'cpan', 'pypi' ]

#######################################################################

class UpstreamDb:

    def __init__(self, dest_dir, db_dir, debug = False):
        self.dest_dir = dest_dir
        self._debug = debug

        # we'll store an int, so let's use an int right now
        self._now = int(time.time())

        # keep in memory what we removed
        self._removed_matches = []
        # this is by branch
        self._removed_upstream = {}

        self._dbfile = os.path.join(db_dir, 'upstream.db')

        self.db = None
        self.cursor = None

    def _debug_print(self, s):
        """ Print s if debug is enabled. """
        if self._debug:
            print('UpstreamDb: %s' % s)

    def __del__(self):
        # needed for the commit
        self._close_db()

    def _open_db(self, create_if_needed = False):
        """ Open a database file, and sets up everything. """
        if self.db:
            return True

        create = False
        if not os.path.exists(self._dbfile):
            if not create_if_needed:
                return False
            else:
                util.safe_mkdir_p(os.path.dirname(self._dbfile))
                create = True

        self.db = sqlite3.connect(self._dbfile)
        self.db.row_factory = sqlite3.Row
        self.cursor = self.db.cursor()

        if create:
            self._sql_setup()

        return True

    def _close_db(self):
        """ Closes the currently open database. """
        if self.cursor:
            self.cursor.close()
            self.cursor = None
        if self.db:
            self.db.commit()
            self.db.close()
            self.db = None

    def _sql_setup(self):
        self.cursor.execute('''CREATE TABLE upstream_pkg_name_match (
            id INTEGER PRIMARY KEY,
            srcpackage TEXT,
            upstream TEXT,
            updated INTEGER
            );''')
        self.cursor.execute('''CREATE TABLE upstream (
            id INTEGER PRIMARY KEY,
            branch INTEGER,
            name TEXT,
            version TEXT,
            url TEXT,
            updated INTEGER
            );''')
        self.cursor.execute('''CREATE TABLE branches (
            id INTEGER PRIMARY KEY,
            branch TEXT,
            mtime INTEGER
            );''')

    def _is_line_comment(self, line):
        return line.strip() == '' or line.startswith('#')

    def _update_upstream_pkg_name_match(self, matchfile):
        matchpath = os.path.join(self.dest_dir, matchfile)

        self.cursor.execute('''SELECT * FROM upstream_pkg_name_match;''')
        oldmatches = {}
        for row in self.cursor:
            oldmatches[row['srcpackage']] = (row['id'], row['upstream'])

        if not os.path.exists(matchpath):
            print('No upstream/package name match database available, keeping previous data.', file=sys.stderr)
            return

        handled = []

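        # The match file is a plain-text list of "upstream:srcpackage" lines
        # (one pair per line, see the regexp below); an empty srcpackage part
        # means the source package uses the upstream name unchanged.
        # Illustrative lines only:
        #
        #   gtk+:gtk2
        #   glib: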
        file = open(matchpath)
        re_names = re.compile('^(.+):(.*)$')
        while True:
            line = file.readline()

            if len(line) == 0:
                break
            if self._is_line_comment(line):
                continue

            match = re_names.match(line)
            if not match:
                continue

            upstream = match.group(1)
            if match.group(2) != '':
                srcpackage = match.group(2)
            else:
                srcpackage = upstream

            if srcpackage in handled:
                print('Source package %s defined more than once in %s.' % (srcpackage, matchfile), file=sys.stderr)
            elif srcpackage in oldmatches:
                # Update the entry if it has changed
                (id, oldupstream) = oldmatches[srcpackage]
                if oldupstream != upstream:
                    # Note: we don't put the mtime here, since we use the
                    # updated time in get_changed_packages
                    self.cursor.execute('''UPDATE upstream_pkg_name_match SET
                        upstream = ?, updated = ?
                        WHERE id = ?
                        ;''',
                        (upstream, self._now, id))
                del oldmatches[srcpackage]
                handled.append(srcpackage)
            else:
                # Add the entry
                self.cursor.execute('''INSERT INTO upstream_pkg_name_match VALUES (
                    NULL, ?, ?, ?
                    );''',
                    (srcpackage, upstream, self._now))
                handled.append(srcpackage)

        file.close()

        # Remove matches that were removed in the source file
        if len(oldmatches) > 0:
            ids = [ id for (id, oldupstream) in list(oldmatches.values()) ]
            where = ' OR '.join([ 'id = ?' for i in range(len(ids)) ])
            self.cursor.execute('''DELETE FROM upstream_pkg_name_match WHERE %s;''' % where, ids)
            #  will be used in get_changed_packages()
            self._removed_matches = list(oldmatches.keys())
        else:
            self._removed_matches = []

    def _get_upstream_name_branches(self):
        result = {}

        self.cursor.execute('''SELECT upstream FROM upstream_pkg_name_match WHERE upstream LIKE '%|%';''')
        for row in self.cursor:
            name_branch = row['upstream']
            index = name_branch.find('|')
            name = name_branch[:index]
            limit = name_branch[index + 1:]
            item = (name_branch, limit)

            if name in result:
                name_branches = result[name]
                name_branches.append(item)
                result[name] = name_branches
            else:
                result[name] = [ item ]

        return result

    def _get_branch_data(self, branch):
        self.cursor.execute('''SELECT id, mtime FROM branches WHERE
            branch = ?;''', (branch,))
        row = self.cursor.fetchone()
        if row:
            return (row['id'], row['mtime'])
        else:
            return ('', '')

    def _update_upstream_data(self, branch, upstream_name_branches):
        branch_path = os.path.join(self.dest_dir, branch)

        if not os.path.exists(branch_path):
            print('No file %s available for requested branch %s, keeping previous data if available.' % (branch_path, branch), file=sys.stderr)
            return

        (branch_id, branch_mtime) = self._get_branch_data(branch)
        stats = os.stat(branch_path)
        new_mtime = stats.st_mtime

        if not branch_id:
            # the branch does not exist, add it
            self.cursor.execute('''INSERT INTO branches VALUES (
                NULL, ?, ?
                );''',
                (branch, new_mtime))
            self.cursor.execute('''SELECT last_insert_rowid();''')
            branch_id = self.cursor.fetchone()[0]
        else:
            # do not update anything if the file has not changed
            if branch_mtime >= new_mtime:
                return
            # else update the mtime
            self.cursor.execute('''UPDATE branches SET
                mtime = ? WHERE id = ?;''',
                (new_mtime, branch_id))

        self.cursor.execute('''SELECT * FROM upstream WHERE branch = ?;''', (branch_id,))
        olddata = {}
        for row in self.cursor:
            olddata[row['name']] = (row['id'], row['version'], row['url'])

        # upstream data, after we've converted the names to branch names if
        # needed. For instance, glib:1.2.10 will translate to the "glib|1.3"
        # name but also to the "glib" name if it doesn't exist yet or if the
        # version there is lower than 1.2.10.
        real_upstream_data = {}

        re_upstream_data = re.compile('^([^:]*):([^:]+):([^:]+):(.*)$')
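        # Each data line is "group:name:version:extra", where the group picks
        # how the download URL is computed below. Illustrative lines only:
        #
        #   fgo:glib:2.28.8:
        #   cpan:perl-URI:1.60:G/GA/GAAS/URI-1.60.tar.gz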

        file = open(branch_path)
        while True:
            line = file.readline()

            if len(line) == 0:
                break
            if self._is_line_comment(line):
                continue
            line = line[:-1]

            match = re_upstream_data.match(line)
            if not match:
                continue

            name = match.group(2)
            version = match.group(3)

            if match.group(1) == 'fallback':
                url = ''
            elif match.group(1) == 'nonfgo':
                url = match.group(4)
            elif match.group(1) == 'upstream':
                url = ''
            elif match.group(1) == 'cpan':
                url = posixjoin('http://cpan.perl.org/CPAN/authors/id/', match.group(4))
            elif match.group(1) == 'pypi':
                url = match.group(4)
            elif match.group(1) == 'fgo':
                versions = version.split('.')
                if len(versions) == 1:
                    majmin = version
                elif int(versions[0]) >= 40:
                    majmin = versions[0]
                else:
                    majmin = versions[0] + '.' + versions[1]
                url = 'https://download.gnome.org/sources/%s/%s/%s-%s.tar.xz' % (name, majmin, name, version)
            else:
                print('Unknown upstream group for metadata: %s (full line: \'%s\').' % (match.group(1), line), file=sys.stderr)
                url = ''

            ignore = False
            if name in real_upstream_data:
                (current_version, current_url) = real_upstream_data[name]
                if util.version_ge(current_version, version):
                    ignore = True

            if not ignore:
                real_upstream_data[name] = (version, url)

            # Now also fill data for 'glib|1.2.10' if it fits
            if name in upstream_name_branches:
                # name = 'glib', upstream_name_branch = 'glib|1.2.10'
                # and limit = '1.2.10'
                for (upstream_name_branch, limit) in upstream_name_branches[name]:
                    if upstream_name_branch in real_upstream_data:
                        (current_version, current_url) = real_upstream_data[upstream_name_branch]
                        if util.version_ge(current_version, version):
                            continue

                    if util.version_ge(version, limit):
                        continue

                    real_upstream_data[upstream_name_branch] = (version, url)


        for (name, (version, url)) in list(real_upstream_data.items()):
            if name in olddata:
                # Update the entry if it has changed
                (id, oldversion, oldurl) = olddata[name]
                if oldversion != version or oldurl != url:
                    # Note: we don't put the mtime here, since we use the
                    # updated time in get_changed_packages
                    self.cursor.execute('''UPDATE upstream SET
                        version = ?, url = ?, updated = ?
                        WHERE id = ?
                        ;''',
                        (version, url, self._now, id))
                del olddata[name]
            else:
                # Add the entry
                self.cursor.execute('''INSERT INTO upstream VALUES (
                    NULL, ?, ?, ?, ?, ?
                    );''',
                    (branch_id, name, version, url, self._now))

        file.close()

        # Remove data that was removed in the source file
        if len(olddata) > 0:
            ids = [ id for (id, version, url) in list(olddata.values()) ]
            # Delete in chunks of 50: removing ~1800 items in a single query
            # once failed
            chunk_size = 50
            ids_len = len(ids)
            for index in range(ids_len // chunk_size):
                chunk_ids = ids[index * chunk_size : (index + 1) * chunk_size]
                where = ' OR '.join([ 'id = ?' for i in range(len(chunk_ids)) ])
                self.cursor.execute('''DELETE FROM upstream WHERE %s;''' % where, chunk_ids)
            remainder = ids_len % chunk_size
            if remainder > 0:
                chunk_ids = ids[- remainder:]
                where = ' OR '.join([ 'id = ?' for i in range(len(chunk_ids)) ])
                self.cursor.execute('''DELETE FROM upstream WHERE %s;''' % where, chunk_ids)

            self._removed_upstream[branch] = list(olddata.keys())
        else:
            self._removed_upstream[branch] = []

    def _remove_old_branches(self, branches):
        self.cursor.execute('''SELECT * FROM branches;''')
        # fetch all rows first: the DELETEs below would reset the cursor and
        # cut the iteration short
        rows = self.cursor.fetchall()
        for row in rows:
            branch = row['branch']
            if not branch in branches:
                id = row['id']
                self.cursor.execute('''DELETE FROM upstream WHERE branch = ?;''', (id,))
                self.cursor.execute('''DELETE FROM branches WHERE id = ?;''', (id,))

    def _is_without_upstream(self, name):
        index = name.rfind('branding')
        if index > 0:
            return name[index:] in ['branding-openSUSE', 'branding-SLED', 'branding-SLES']
        return False

    def _get_upstream_name(self, srcpackage):
        self.cursor.execute('''SELECT upstream FROM upstream_pkg_name_match WHERE
            srcpackage = ?;''', (srcpackage,))
        row = self.cursor.fetchone()
        if row:
            return row[0]
        else:
            return ''

    def _exist_in_branch_from_db(self, branch, name):
        (branch_id, branch_mtime) = self._get_branch_data(branch)
        if not branch_id:
            return False

        self.cursor.execute('''SELECT name FROM upstream WHERE
            name = ? AND branch = ?;''', (name, branch_id))
        row = self.cursor.fetchone()
        if row:
            return True
        else:
            return False

    def exists_in_branches(self, branches, srcpackage):
        if not self._open_db():
            return False

        name = self._get_upstream_name(srcpackage)

        for branch in branches:
            if branch in BRANCHES_WITHOUT_PKG_MATCH:
                query_name = srcpackage
            else:
                query_name = name

            if query_name and self._exist_in_branch_from_db(branch, query_name):
                return True

        return False

    def _get_data_from_db(self, branch, name):
        (branch_id, branch_mtime) = self._get_branch_data(branch)
        if not branch_id:
            return ('', '')

        self.cursor.execute('''SELECT version, url FROM upstream WHERE
            name = ? AND branch = ?;''', (name, branch_id))
        row = self.cursor.fetchone()
        if row:
            return (row[0], row[1])
        else:
            return ('', '')

    def get_upstream_data(self, branches, srcpackage):
        if not self._open_db():
            return ('', '', '')

        name = self._get_upstream_name(srcpackage)

        (version, url) = ('', '')

        for branch in branches:
            if branch in BRANCHES_WITHOUT_PKG_MATCH:
                query_name = srcpackage
            else:
                query_name = name

            if query_name:
                (version, url) = self._get_data_from_db(branch, query_name)
            if version:
                break

        if not version:
            if self._is_without_upstream(srcpackage):
                version = '--'
            else:
                version = ''

        return (name, version, url)

    def get_mtime(self):
        if not self._open_db():
            return -1

        self.cursor.execute('''SELECT MAX(updated) FROM upstream_pkg_name_match;''')
        max_match = self.cursor.fetchone()[0]
        self.cursor.execute('''SELECT MAX(updated) FROM upstream;''')
        max_data = self.cursor.fetchone()[0]
        # MAX() returns NULL for an empty table, and None would make max() fail
        if not isinstance(max_match, int):
            max_match = 0
        if not isinstance(max_data, int):
            max_data = 0
        return max(max_match, max_data)

    def get_changed_packages(self, old_mtime):
        if not self._open_db():
            return {}

        changed = {}

        self.cursor.execute('''SELECT srcpackage FROM upstream_pkg_name_match
                    WHERE updated > ?;''', (old_mtime,))
        changed[MATCH_CHANGE_NAME] = [ row['srcpackage'] for row in self.cursor ]
        changed[MATCH_CHANGE_NAME].extend(self._removed_matches)

        self.cursor.execute('''SELECT id, branch FROM branches;''')
        branches = []
        for (id, branch) in self.cursor:
            branches.append((id, branch))
            if branch in self._removed_upstream:
                changed[branch] = self._removed_upstream[branch]
            else:
                changed[branch] = []

        # Doing a joint query is slow, so we do a cache first
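        # match_cache maps an upstream name to the list of source packages
        # that use it, e.g. (illustrative) 'glib' -> [ 'glib2' ]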
        match_cache = {}
        self.cursor.execute('''SELECT srcpackage, upstream FROM upstream_pkg_name_match;''')
        for (srcpackage, upstream) in self.cursor:
            if upstream in match_cache:
                match_cache[upstream].append(srcpackage)
            else:
                match_cache[upstream] = [ srcpackage ]


        for (id, branch) in branches:
            # Joint query that is slow
            #self.cursor.execute('''SELECT A.srcpackage
            #            FROM upstream_pkg_name_match as A, upstream as B
            #            WHERE B.updated > ? AND B.name = A.upstream AND B.branch = ?;''', (old_mtime, id))
            #changed[branch].extend([ row['srcpackage'] for row in self.cursor ])
            self.cursor.execute('''SELECT name FROM upstream
                        WHERE updated > ? AND branch = ?;''', (old_mtime, id))
            if branch in BRANCHES_WITHOUT_PKG_MATCH:
                for (name,) in self.cursor:
                    changed[branch].append(name)
            else:
                for (name,) in self.cursor:
                    if name in match_cache:
                        changed[branch].extend(match_cache[name])

        self._debug_print('%d upstream(s) changed' % sum([ len(i) for i in list(changed.values()) ]))

        for branch in list(changed.keys()):
            if not changed[branch]:
                del changed[branch]

        return changed

    def update(self, project_configs, rebuild = False):
        if rebuild:
            self._close_db()
            if os.path.exists(self._dbfile):
                os.unlink(self._dbfile)

        self._open_db(create_if_needed = True)

        self._update_upstream_pkg_name_match('upstream-packages-match.txt')

        upstream_name_branches = self._get_upstream_name_branches()

        branches = []
        for project in list(project_configs.keys()):
            branches.extend(project_configs[project].branches)
        branches = set(branches)

        for branch in branches:
            if branch:
                self._update_upstream_data(branch, upstream_name_branches)

        self._remove_old_branches(branches)

        self.db.commit()

#######################################################################


def main(args):
    # small test harness; update() expects per-project objects with a
    # .branches list, and get_upstream_data() takes a list of branches
    class ProjectConfig:
        def __init__(self, branch):
            self.branches = [ branch ]

    configs = {}
    configs['gnome-2.32'] = ProjectConfig('gnome-2.32')
    configs['latest'] = ProjectConfig('latest')

    upstream = UpstreamDb('/tmp/obs-dissector/cache/upstream', '/tmp/obs-dissector/tmp')
    upstream.update(configs)

    print('glib (latest): %s' % (upstream.get_upstream_data(['latest'], 'glib'),))
    print('glib2 (latest): %s' % (upstream.get_upstream_data(['latest'], 'glib2'),))
    print('gtk2 (2.32): %s' % (upstream.get_upstream_data(['gnome-2.32'], 'gtk2'),))
    print('gtk2 (latest): %s' % (upstream.get_upstream_data(['latest'], 'gtk2'),))
    print('gtk3 (latest): %s' % (upstream.get_upstream_data(['latest'], 'gtk3'),))
    print('gobby04 (latest): %s' % (upstream.get_upstream_data(['latest'], 'gobby04'),))
    print('gobby (latest): %s' % (upstream.get_upstream_data(['latest'], 'gobby'),))
    print('OpenOffice_org (latest, fallback): %s' % (upstream.get_upstream_data(['latest', 'fallback'], 'OpenOffice_org'),))


if __name__ == '__main__':
    try:
        main(sys.argv)
    except KeyboardInterrupt:
        pass
0707010000001B000081A40000000000000000000000016548EB8C00000E8E000000000000000000000000000000000000003100000000osc-plugin-collab-0.104+30/server/obs-db/util.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os

import errno

def safe_mkdir(dir):
    if not dir:
        return

    try:
        os.mkdir(dir)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise

def safe_mkdir_p(dir):
    if not dir:
        return

    try:
        os.makedirs(dir)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise

def safe_unlink(filename):
    """ Unlink a file, but ignores the exception if the file doesn't exist. """
    try:
        os.unlink(filename)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise


########################################################


# comes from convert-to-tarball.py
def _strict_bigger_version(a, b):
    a_nums = a.split('.')
    b_nums = b.split('.')
    num_fields = min(len(a_nums), len(b_nums))
    for i in range(0,num_fields):
        if   int(a_nums[i]) > int(b_nums[i]):
            return a
        elif int(a_nums[i]) < int(b_nums[i]):
            return b
    if   len(a_nums) > len(b_nums):
        return a
    elif len(a_nums) < len(b_nums):
        return b
    else:
        return None


def bigger_version(a, b):
    # We compare versions this way (with examples):
    #   + 0.3 and 0.3.1:
    #     0.3.1 wins: 0.3 == 0.3 and 0.3.1 has another digit
    #   + 0.3-1 and 0.3-2:
    #     0.3-2 wins: 0.3 == 0.3 and 1 < 2
    #   + 0.3.1-1 and 0.3-2:
    #     0.3.1-1 wins: 0.3.1 > 0.3
    a_nums = a.split('-')
    b_nums = b.split('-')
    num_fields = min(len(a_nums), len(b_nums))
    for i in range(0,num_fields):
        bigger = _strict_bigger_version(a_nums[i], b_nums[i])
        if   bigger == a_nums[i]:
            return a
        elif bigger == b_nums[i]:
            return b
    if len(a_nums) > len(b_nums):
        return a
    else:
        return b


def version_gt(a, b):
    if a == b:
        return False

    bigger = bigger_version(a, b)
    return a == bigger

def version_ge(a, b):
    if a == b:
        return True

    bigger = bigger_version(a, b)
    return a == bigger
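

# A minimal self-check of the comparison rules documented above (the examples
# come from the bigger_version() comment; run this file directly to verify):
if __name__ == '__main__':
    assert bigger_version('0.3', '0.3.1') == '0.3.1'
    assert bigger_version('0.3-1', '0.3-2') == '0.3-2'
    assert bigger_version('0.3.1-1', '0.3-2') == '0.3.1-1'
    assert version_gt('0.3.1', '0.3')
    assert version_ge('0.3', '0.3')
    print('version comparison self-check passed')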
0707010000001C000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000003100000000osc-plugin-collab-0.104+30/server/openSUSE-setup0707010000001D000081A40000000000000000000000016548EB8C00000CE9000000000000000000000000000000000000003800000000osc-plugin-collab-0.104+30/server/openSUSE-setup/READMESetup of osc-collab server for the openSUSE project
===================================================

The server runs on the osc-collab virtual machine.

When updating the code or data in git, you will need to update the files on
the virtual machine too. The code could be pulled from git automatically, but
doing unattended updates without reviewing the changes would not be safe.

Just ask one of the osc-collab admins to do the following on the osc-collab
virtual machine:

  # check changes in git are fine, if yes then update the code
  su - opensuse-scripts
  cd ~/src/osc-plugin-collab; git pull


High-level overview
===================

The runme scripts are run by wrapper scripts that cron will execute every 20
minutes. The order in which these wrapper scripts are run matters, as the data
generated by one script can be used by another. The recommended order is:

 1. getting upstream data
 2. update the database
 3. modify OBS attributes

See the crontab file for a potential setup.

The wrapper scripts live in the cron-scripts/ subdirectory. There is one wrapper
script for each task, and a common file for shared functions/settings.

The wrapper scripts will make use of an obs.conf configuration file. See the
collab-data/ subdirectory for an example.

Log files from the wrapper scripts will be located in the cron-scripts/logs/
subdirectory.


Configuration that has to be done
=================================

Obviously, osc must be installed and configured for the specific user that
will run the scripts. It is also recommended to have the python-lxml module
installed, for performance reasons. Other required packages include: bc, curl,
and dotlockfile or withlock.

Then, several files need to be updated:

crontab:
  The MAILTO variable contains the mail address that will receive errors.

collab-data/obs.conf:
  The cache-dir variable should be updated to reflect where the cache directory
  will be.

cron-scripts/common:
  Several variables control the configuration of the wrapper scripts:
    WATCHDOG_CMD: command for a script that will kill processes that are stuck.
                  This can be left unset if not needed.
    COLLAB_DATA_DIR: directory that contains obs.conf and the cache
                     subdirectory. Must be set.
    OBS_CONF: can be used to override the location of obs.conf. Must be set, but
              suggested value should be correct.
    OSC_PLUGIN_COLLAB_DIR: directory that contains the osc-plugin-collab code.
                           Must be set.
    OBS_UPLOAD_URL: URL to obs-upload.py from the web API. If left empty, the
                    obs.db database won't get uploaded. This is generally fine
                    for a setup where the web server lives on the same machine
                    as the cron scripts.
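
  For illustration only (the values below are made up; adjust them for the
  actual machine):

    COLLAB_DATA_DIR="/var/lib/osc-collab"
    OBS_CONF="${COLLAB_DATA_DIR}/obs.conf"
    OSC_PLUGIN_COLLAB_DIR="${HOME}/src/osc-plugin-collab"
    OBS_UPLOAD_URL=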


Note for osc-collab admins
==========================

The major task for osc-collab admins is to make sure the server is correctly
configured, and the osc-collab server code runs fine.

The infrastructure team will handle maintenance of the virtual machine.
Non-interactive updates are usually installed automatically, and kernel
updates are dealt with manually.

A new osc-collab admin should contact the openSUSE infrastructure team so that
the team knows about the new osc-collab admin.
0707010000001E000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000003D00000000osc-plugin-collab-0.104+30/server/openSUSE-setup/collab-data0707010000001F000081A40000000000000000000000016548EB8C0000015A000000000000000000000000000000000000004D00000000osc-plugin-collab-0.104+30/server/openSUSE-setup/collab-data/osc-collab.conf[General]
cache-dir = /var/lib/osc-collab/.cache
ignore-conf-mtime = True
no-full-check = True
allow-project-catchup = False

[Debug]
debug = False
#mirror-only-new = True
#force-hermes = True
#force-upstream = True
#force-db = True
#force-xml = True
skip-hermes = True
#skip-mirror = True
#skip-upstream = True
#skip-db = True
#skip-xml = True

07070100000020000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000003E00000000osc-plugin-collab-0.104+30/server/openSUSE-setup/cron-scripts07070100000021000081A40000000000000000000000016548EB8C000008A6000000000000000000000000000000000000004500000000osc-plugin-collab-0.104+30/server/openSUSE-setup/cron-scripts/commonsetup() {
	MAXTIME=$1

	if test "x$MAXTIME" = "x"; then
		echo "Maximum execution time for $0 is not set."
		exit 1
	fi

	# the PATH is set to the strict minimum in cronjobs
	export PATH=${HOME}/bin:$PATH

	NAME=`basename $0`
	TOP_DIR=`dirname $0`
	LOGFILE="${HOME}/.local/osc-collab/logs/$NAME.log"
	LOCKFILE="${HOME}/.local/osc-collab/$NAME.lock"

	### Values to setup

	## A helper that will kill the process it launches after $MAXTIME, to
	## avoid stuck processes.
	#WATCHDOG_CMD="${TOP_DIR}/excubitor -d $MAXTIME --"

	## Common directories / files
	COLLAB_DATA_DIR="${HOME}/.cache"
	COLLAB_INSTALL_DIR="/usr/share/osc-collab-server"
	OBS_CONF="/etc/osc-collab.conf"
	OSC_PLUGIN_COLLAB_DIR="${COLLAB_INSTALL_DIR}"

	## OBS_UPLOAD_URL should stay empty if there is no need to upload the
	## resulting obs.db anywhere
	#OBS_UPLOAD_URL=

	if test -z "$COLLAB_DATA_DIR"; then
		echo "COLLAB_DATA_DIR is not set."
		exit 1
	fi

	if test -z "$OBS_CONF"; then
		echo "OBS_CONF is not set."
		exit 1
	fi

	if test -z "$OBS_CONF"; then
		echo "OSC_PLUGIN_COLLAB_DIR is not set."
		exit 1
	fi

	LOCK_CMD=
	HAVE_DOTLOCKFILE=0
	HAVE_WITHLOCK=0
	which dotlockfile &> /dev/null && HAVE_DOTLOCKFILE=1
	which withlock &> /dev/null && HAVE_WITHLOCK=1

	if test "$HAVE_DOTLOCKFILE" -eq 1; then
		# we need a lock to avoid concurrent instances
		# Note that with -p, we won't lock if the process that created the lock
		# file does not exist anymore
		dotlockfile -p -l -r 0 $LOCKFILE

		if test $? -ne 0; then
			exit
		fi
	elif test "$HAVE_WITHLOCK" -eq 1; then
		LOCK_CMD="withlock $LOCKFILE"
	else
		echo "No lock program available; dotlockfile or withlock must be installed."
		exit 1
	fi

	if test -f $LOGFILE; then
		SIZE=`du -s $LOGFILE | cut -f 1`
		if test $SIZE -gt 200000; then
			today=`date +%Y%m%d`
			mv $LOGFILE $LOGFILE.$today
			gzip $LOGFILE.$today
		fi
	else
		mkdir -p `dirname $LOGFILE`
	fi

	PRE_CMD="${LOCK_CMD} ${WATCHDOG_CMD}"

	echo "=== Start (`date`) ===" >> $LOGFILE
}

cleanup() {
	echo "=== End (`date`) ===" >> $LOGFILE

	if test "$HAVE_DOTLOCKFILE" -eq 1; then
		if test "x$LOCKFILE" = "x"; then
			echo "Internal error: LOCKFILE is not set."
			exit 1
		fi

		dotlockfile -u $LOCKFILE
	fi
}
07070100000022000081ED0000000000000000000000016548EB8C00000169000000000000000000000000000000000000004D00000000osc-plugin-collab-0.104+30/server/openSUSE-setup/cron-scripts/run-attributes#!/bin/sh

TOP_DIR=`dirname $0`

if test ! -f "${TOP_DIR}/common"; then
	echo "No common infrastructure available."
	exit 1
fi

. "${TOP_DIR}/common"

# 30 minutes max
setup 1800

${PRE_CMD} "${OSC_PLUGIN_COLLAB_DIR}/server/obs-db/runme-attributes" -o "${OBS_CONF}" -s -l $LOGFILE

if test $? -ne 0; then
	echo "Error during the attributes update."
fi

cleanup
07070100000023000081ED0000000000000000000000016548EB8C0000012F000000000000000000000000000000000000005100000000osc-plugin-collab-0.104+30/server/openSUSE-setup/cron-scripts/run-gnome-versions#!/bin/sh

TOP_DIR=`dirname $0`

if test ! -f "${TOP_DIR}/common"; then
	echo "No common infrastructure available."
	exit 1
fi

. "${TOP_DIR}/common"

# 20 minutes max
setup 1200

${PRE_CMD} "${OSC_PLUGIN_COLLAB_DIR}/server/upstream/gnome-versions/update-versions ${COLLAB_DATA_DIR}/upstream/"

cleanup
07070100000024000081ED0000000000000000000000016548EB8C000002AD000000000000000000000000000000000000004600000000osc-plugin-collab-0.104+30/server/openSUSE-setup/cron-scripts/run-obs#!/bin/sh

TOP_DIR=`dirname $0`

if test ! -f "${TOP_DIR}/common"; then
	echo "No common infrastructure available."
	exit 1
fi

. "${TOP_DIR}/common"

# 10 minutes max -- for an update with hermes, that should be more than enough
setup 600

${PRE_CMD} "${OSC_PLUGIN_COLLAB_DIR}/server/obs-db/runme" -o "${OBS_CONF}" -s -l $LOGFILE

if test $? -eq 0; then
	if test -n "${OBS_UPLOAD_URL}"; then
		curl --silent --show-error -F destfile=obs.db -F dbfile="@${COLLAB_DATA_DIR}/cache/db/obs.db" ${OBS_UPLOAD_URL}
	fi
else
	if test -n "${OBS_UPLOAD_URL}"; then
		echo "Error during the database update, database not uploaded."
	else
		echo "Error during the database update."
	fi
fi

cleanup
07070100000025000081ED0000000000000000000000016548EB8C00000115000000000000000000000000000000000000004B00000000osc-plugin-collab-0.104+30/server/openSUSE-setup/cron-scripts/run-upstream#!/bin/sh

TOP_DIR=`dirname $0`

if test ! -f "${TOP_DIR}/common"; then
	echo "No common infrastructure available."
	exit 1
fi

. "${TOP_DIR}/common"

# 10 minutes max
setup 600

${PRE_CMD} "${OSC_PLUGIN_COLLAB_DIR}/server/upstream/runme" -o "${OBS_CONF}" -l $LOGFILE

cleanup
07070100000026000081A40000000000000000000000016548EB8C000002B8000000000000000000000000000000000000003900000000osc-plugin-collab-0.104+30/server/openSUSE-setup/crontabMAILTO="[email protected]"

## Note: we stagger the start times instead of using */20 for everything,
## as the order in which the scripts run matters: run-upstream should run
## before run-obs, which should run before run-attributes.

# Update upstream data every twenty minutes
1,21,41 *       *       *       *       ${HOME}/src/cron-scripts/run-upstream
# Update obs db every twenty minutes
6,26,46 *       *       *       *       ${HOME}/src/cron-scripts/run-obs
# Update obs attributes every twenty minutes
9,29,49 *       *       *       *       ${HOME}/src/cron-scripts/run-attributes

## This can be used to automatically update from git
##@daily  ${HOME}/src/cron-scripts/run-updategit
07070100000027000081ED0000000000000000000000016548EB8C000001E2000000000000000000000000000000000000004300000000osc-plugin-collab-0.104+30/server/openSUSE-setup/osc-collab-runner#!/bin/sh

LIBEXEC=/usr/lib/osc-collab-server

# self-heal if runaway task detected: none can run longer than a day
find /var/lib/osc-collab/.cache -name running -mtime +1 -delete -print

# Find out what the latest gnome-versions are
${LIBEXEC}/run-gnome-versions

# Merge gnome-versions and find other upstream versions; update upstream.db
${LIBEXEC}/run-upstream

# Update obs.db: sync with changes in OBS
${LIBEXEC}/run-obs

# Update attributes in OBS
${LIBEXEC}/run-attributes

07070100000028000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000002B00000000osc-plugin-collab-0.104+30/server/upstream07070100000029000081ED0000000000000000000000016548EB8C00001515000000000000000000000000000000000000004200000000osc-plugin-collab-0.104+30/server/upstream/download-cpan-versions#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2012, Novell, Inc.
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301
# USA
#
# (Licensed under the LGPLv2.1 or later)
#
#
# Authors: Vincent Untz <[email protected]>
#

import os
import socket
import sys
import time

import io
import gzip
import optparse
import urllib.request, urllib.error, urllib.parse

from util import *


PACKAGE_DETAILS = 'http://www.cpan.org/modules/02packages.details.txt.gz'
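
# The details file parsed below starts with "Key: value" header lines
# (including Last-Updated:), followed by an empty line, then one
# "Perl::Class version tarball-path" triplet per line. Illustrative line only:
#
#   URI 1.60 G/GA/GAAS/URI-1.60.tar.gz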


#######################################################################


def parse_cpan_details():
    tarballs = {}
    last_updated = ''

    stream = urllib.request.urlopen(PACKAGE_DETAILS)

    gzipper = gzip.GzipFile(fileobj=stream)
    in_classes_data = False

    while True:
        line = gzipper.readline()
        if not line:
            break

        line = line.strip()
        if not line:
            # An empty line is what separates the global metadata from the
            # details about all classes
            if not in_classes_data:
                in_classes_data = True
            continue

        # Skip comments
        if line.startswith(b'#'):
            continue

        # Global metadata about the details
        if not in_classes_data:
            if line.startswith(b'Last-Updated:'):
                last_updated = line[len(b'Last-Updated:'):].strip().decode()
            continue

        ## Parse data about classes
        # We only keep the first class for a given tarball (it's the most generic one)
        # We ignore data when there's no version
        data = line.split()
        if len(data) != 3:
            print('Cannot parse line: %s' % line, file=sys.stderr)
            continue

        (perl_class, version, tarball) = data
        # the stream is read as bytes, so compare against bytes
        if version == b'undef':
            continue

        if tarball in tarballs:
            continue

        tarballs[tarball] = (perl_class, version)

    gzipper.close()

    return (last_updated, tarballs)


def perl_class_to_package(perl_class):
    return b'perl-' + perl_class.replace(b'::', b'-')


#######################################################################


def main(args):
    parser = optparse.OptionParser()

    parser.add_option('--debug', dest='debug',
                      help='only handle the argument as input and output the result')
    parser.add_option('--log', dest='log',
                      help='log file to use (default: stderr)')
    parser.add_option('--directory', dest='dir', default='.',
                      help='directory where to find data and save data')
    parser.add_option('--save-file', dest='save',
                      help='path to the file where the results will be written')
    parser.add_option('--only-if-old', action='store_true',
                      default=False, dest='only_if_old',
                      help='execute only if the pre-existing result file is older than 12 hours')

    (options, args) = parser.parse_args()

    directory = options.dir

    if options.log:
        path = os.path.realpath(options.log)
        safe_mkdir_p(os.path.dirname(path))
        sys.stderr = open(options.log, 'a')

    if options.debug:
        lines = [ options.debug + '\n' ]
        out = sys.stdout

    else:
        if options.save:
            save_file = options.save
        else:
            save_file = os.path.join(directory, 'versions-cpan')

        if os.path.exists(save_file):
            if not os.path.isfile(save_file):
                print('Save file %s is not a regular file.' % save_file, file=sys.stderr)
                return 1
            if options.only_if_old:
                stats = os.stat(save_file)
                # Quit if it's less than 12 hours old
                if time.time() - stats.st_mtime < 3600 * 12:
                    return 2

        else:
            safe_mkdir_p(os.path.dirname(save_file))

        out = open(save_file, 'w')

    # The default timeout is just too long. Use 10 seconds instead.
    socket.setdefaulttimeout(10)

    ret = 1

    try:
        (last_updated, tarballs) = parse_cpan_details()
    except urllib.error.HTTPError as e:
        # HTTPError is a subclass of URLError, so it must be caught first
        print('Error when downloading CPAN metadata: server sent %s' % e, file=sys.stderr)
    except urllib.error.URLError as e:
        print('Error when downloading CPAN metadata: %s' % e, file=sys.stderr)
    else:
        for (tarball, (perl_class, version)) in tarballs.items():
            # decode: the parsed fields are bytes, and '%s' on bytes would
            # write their repr (b'...') instead of the text
            out.write('cpan:%s:%s:%s\n' % (perl_class_to_package(perl_class).decode(), version.decode(), tarball.decode()))
        ret = 0

    if not options.debug:
        out.close()

    return ret


if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
0707010000002A000081ED0000000000000000000000016548EB8C000011A7000000000000000000000000000000000000004600000000osc-plugin-collab-0.104+30/server/upstream/download-fallback-versions#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2012, Novell, Inc.
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301
# USA
#
# (Licensed under the LGPLv2.1 or later)
#
#
# Authors: Vincent Untz <[email protected]>
#

import os
import socket
import sys
import time

import optparse
import urllib.request, urllib.error, urllib.parse

from util import *


FALLBACK_URL = 'http://users.suse.com/~meissner/factory.upstream.lst'
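
# The fallback list parsed below is comma-separated: package name in the
# first field, version in the second; extra fields are ignored.
# Illustrative line only:
#
#   zsh,5.0.2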


#######################################################################


def parse_fallback_data():
    metadatas = []

    fd = urllib.request.urlopen(FALLBACK_URL)

    while True:
        line = fd.readline()
        if not line:
            break

        line = line.strip()
        if not line or line.startswith(b'#'):
            continue

        data = line.split(b',')
        if len(data) < 2:
            print('Cannot parse fallback line: %s' % line, file=sys.stderr)
            continue

        name = data[0]
        version = data[1]

        if not name or not version:
            print('Fallback line with nothing useful: %s' % line, file=sys.stderr)
            continue

        metadatas.append((name, version))

    fd.close()

    metadatas.sort()

    return metadatas


#######################################################################


def main(args):
    parser = optparse.OptionParser()

    parser.add_option('--debug', dest='debug',
                      help='only handle the argument as input and output the result')
    parser.add_option('--log', dest='log',
                      help='log file to use (default: stderr)')
    parser.add_option('--directory', dest='dir', default='.',
                      help='directory where to find data and save data')
    parser.add_option('--save-file', dest='save',
                      help='path to the file where the results will be written')
    parser.add_option('--only-if-old', action='store_true',
                      default=False, dest='only_if_old',
                      help='execute only if the pre-existing result file is older than 2 hours')

    (options, args) = parser.parse_args()

    directory = options.dir

    if options.log:
        path = os.path.realpath(options.log)
        safe_mkdir_p(os.path.dirname(path))
        sys.stderr = open(options.log, 'a')

    if options.debug:
        lines = [ options.debug + '\n' ]
        out = sys.stdout

    else:
        if options.save:
            save_file = options.save
        else:
            save_file = os.path.join(directory, 'versions-fallback')

        if os.path.exists(save_file):
            if not os.path.isfile(save_file):
                print('Save file %s is not a regular file.' % save_file, file=sys.stderr)
                return 1
            if options.only_if_old:
                stats = os.stat(save_file)
                # Quit if it's less than 2 hours old
                if time.time() - stats.st_mtime < 3600 * 2:
                    return 2
        else:
            safe_mkdir_p(os.path.dirname(save_file))

        out = open(save_file, 'w')

    # The default timeout is just too long. Use 10 seconds instead.
    socket.setdefaulttimeout(10)

    ret = 1

    try:
        metadatas = parse_fallback_data()
    except urllib.error.HTTPError as e:
        # HTTPError is a subclass of URLError, so it must be caught first
        print('Error when downloading fallback metadata: server sent %s' % e, file=sys.stderr)
    except urllib.error.URLError as e:
        print('Error when downloading fallback metadata: %s' % e, file=sys.stderr)
    else:
        for (name, version) in metadatas:
            out.write('fallback:%s:%s:\n' % (name, version))
        ret = 0

    if not options.debug:
        out.close()

    return ret


if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
0707010000002B000081ED0000000000000000000000016548EB8C00002DAE000000000000000000000000000000000000004200000000osc-plugin-collab-0.104+30/server/upstream/download-pypi-versions#!/usr/bin/env python3

#
# Copyright (c) 2014, SUSE LINUX Products GmbH
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301
# USA
#
# (Licensed under the LGPLv2.1 or later)
#
#
# Authors: Thomas Bechtold <[email protected]>
#

import argparse
import os
import sys
import time
from multiprocessing import Pool

import xmlrpc.client

try:
    from functools import total_ordering
except ImportError:
    def total_ordering(cls):
        """Class decorator that fills in missing ordering methods"""
        convert = {
            '__lt__': [('__gt__', lambda self, other: not (self < other or self == other)),
                       ('__le__', lambda self, other: self < other or self == other),
                       ('__ge__', lambda self, other: not self < other)],
            '__le__': [('__ge__', lambda self, other: not self <= other or self == other),
                       ('__lt__', lambda self, other: self <= other and not self == other),
                       ('__gt__', lambda self, other: not self <= other)],
            '__gt__': [('__lt__', lambda self, other: not (self > other or self == other)),
                       ('__ge__', lambda self, other: self > other or self == other),
                       ('__le__', lambda self, other: not self > other)],
            '__ge__': [('__le__', lambda self, other: (not self >= other) or self == other),
                       ('__gt__', lambda self, other: self >= other and not self == other),
                       ('__lt__', lambda self, other: not self >= other)]
        }
        roots = set(dir(cls)) & set(convert)
        if not roots:
            raise ValueError('must define at least one ordering operation: < > <= >=')
        root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
        for opname, opfunc in convert[root]:
            if opname not in roots:
                opfunc.__name__ = opname
                opfunc.__doc__ = getattr(int, opname).__doc__
                setattr(cls, opname, opfunc)
        return cls


XMLRPC_SERVER_PROXY = 'https://pypi.python.org/pypi'


@total_ordering
class PackageInfo(object):
    """Represents a package info"""
    def __init__(self, type_, name, version, url):
        self.type = type_.strip()
        self.name = name.strip()  # the original name from pypi (without python3- or python- prefix)
        self.version = version.strip()
        self.url = url.strip()

    def __repr__(self):
        return "PackageInfo obj <%s:%s:%s:%s>" % (self.type, self.name, self.version, self.url)

    def __eq__(self, other):
        if hasattr(other, 'name'):
            return self.name == other.name
        return NotImplemented

    def __lt__(self, other):
        if hasattr(other, 'name'):
            return self.name < other.name
        return NotImplemented


def __get_save_file_serial_name(save_file):
    """filename to use to store the latest used changelog serial from pypi"""
    return save_file + "-changelog-serial"


def __get_cached_changelog_last_serial(save_file):
    """try to read the latest changelog serial and return
    the number or None if not available"""
    serial_file = __get_save_file_serial_name(save_file)
    if os.path.exists(serial_file):
        with open(serial_file, 'r') as f:
            return int(f.readline())
    # no changelog serial available
    return None


def __write_cached_changelog_last_serial(save_file, changelog_serial=None):
    """update the cached changelog serial with the current serial from pypi"""
    if not changelog_serial:
        client = xmlrpc.client.ServerProxy(XMLRPC_SERVER_PROXY)
        changelog_serial = str(client.changelog_last_serial())

    serial_file = __get_save_file_serial_name(save_file)
    with open(serial_file, 'w') as sf:
        sf.write("%s\n" % changelog_serial)


def __read_pypi_package_file(save_file):
    """read the given pypi file into a list of PackageInfo objs"""
    packages = list()
    with open(save_file, "r") as f:
        for line in f.readlines():
            if line.startswith('#'):
                continue
            pack_info = PackageInfo(*line.split(':', 3))
            # skip python3-* lines
            if pack_info.name.startswith('python3-'):
                continue
            # remove python- prefix
            if pack_info.name.startswith('python-'):
                pack_info.name = pack_info.name.replace('python-', '', 1)
            # now pack_info.name should have the original name from pypi
            packages.append(pack_info)
    return packages


def __write_pypi_package_file(save_file, packages):
    """write the pypi file. packages is a list of PackageInfo objs"""
    with open(save_file, "w") as o:
        for pi in sorted(packages):
            if pi:
                # FIXME(toabctl): check somehow if py2 and py3 versions are
                # available currently just write python- and
                # python3- names so both versions can be checked
                o.write("%s:python-%s:%s:%s\n" %
                        (pi.type, pi.name, pi.version, pi.url))
                o.write("%s:python3-%s:%s:%s\n" %
                        (pi.type, pi.name, pi.version, pi.url))


def __create_pypi_package_file(save_file):
    """create a new file with version information for pypi packages.
    This step is expensive because it fetches the package list
    from pypi (> 50000 packages) and then does a request for
    every package to get the version information."""

    # get current changelog serial and store in file before doing anything
    # else so we can't lose some changes while we create the file
    __write_cached_changelog_last_serial(save_file)

    try:
        pack_list = __get_package_list()
        sys.stderr.write(
            "Found %s packages. Getting packages details...\n" %
            (len(pack_list)))
        p = Pool(50)
        packages = p.map(__get_package_info, pack_list)
        __write_pypi_package_file(save_file, packages)
    except Exception as e:
        sys.stderr.write("Error while creating the initial pypi file: %s\n" % e)
        # something wrong - delete the serial file so in the next run the
        # the file will be recreated
        serial_file = __get_save_file_serial_name(save_file)
        os.remove(serial_file)
    else:
        sys.stderr.write("Initial pypi file '%s' created\n" % save_file)


def __find_package_name_index(packages, name):
    """find the given name in the packages list or None if not found"""
    for i, pack in enumerate(packages):
        if pack.name == name:
            return i
    return None


def __update_pypi_package_file(save_file, current_changelog):
    """update information of an exisiting file"""
    try:
        if os.path.exists(save_file):
            packages = __read_pypi_package_file(save_file)
            client = xmlrpc.client.ServerProxy(XMLRPC_SERVER_PROXY)
            changelog_serial = str(client.changelog_last_serial())
            changes = client.changelog_since_serial(current_changelog)
            handled_changes = 0
            sys.stderr.write("Started processing %s change requests...\n" % len(changes))
            for (name, version, timestamp, action, serial) in changes:
                # TODO(toabctl): do it in parallel to improve speed
                if action == 'remove':
                    handled_changes += 1
                    index = __find_package_name_index(packages, name)
                    if index is not None:
                        del packages[index]
                elif action == 'new release':
                    handled_changes += 1
                    updated_pack_info = __get_package_info(name)
                    if updated_pack_info:
                        index = __find_package_name_index(packages, name)
                        if index is not None:
                            packages[index] = updated_pack_info
                        else:
                            packages.append(updated_pack_info)
                else:
                    pass
            # write the new file with the updated package list
            __write_pypi_package_file(save_file, packages)
            __write_cached_changelog_last_serial(save_file, changelog_serial=changelog_serial)
        else:
            raise Exception("Can not update '%s'. File does not exist" % save_file)
    except Exception as e:
        sys.stderr.write("pypi file update for '%s' failed: %s.\n"
                         % (save_file, e))
    else:
        sys.stderr.write("pypi file update for '%s' successful. Handled %s changes\n" % (save_file, handled_changes))


def __get_package_list():
    """get a list with packages available on pypi"""
    client = xmlrpc.client.ServerProxy(XMLRPC_SERVER_PROXY)
    packages_list = client.list_packages()
    return packages_list


def __get_package_info(package):
    """get highest sdist package version for the given package name"""
    try:
        client = xmlrpc.client.ServerProxy(XMLRPC_SERVER_PROXY)
        releases = client.package_releases(package)
        if len(releases) > 0:
            for data in client.release_urls(package, releases[0]):
                if data['packagetype'] == 'sdist':
                    return PackageInfo('pypi', package, releases[0], data['url'])
    except Exception as e:
        sys.stderr.write("can not get information for package '%s': %s\n" % (package, e))
    # no sdist package version found.
    return None


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description='Get package version from pypi')
    parser.add_argument(
        '--save-file', default='versions-pypi',
        help='path to the file where the results will be written')
    parser.add_argument(
        '--log', default=sys.stderr,
        help='log file to use (default: stderr)')
    parser.add_argument(
        '--only-if-old', action='store_true', default=False,
        help='execute only if the pre-existing result file is older than 12 hours')


    args = vars(parser.parse_args())

    # check file age
    if os.path.exists(args['save_file']):
        if not os.path.isfile(args['save_file']):
            sys.stderr.write('Save file %s is not a regular file.\n' % args['save_file'])
            sys.exit(1)
        if args['only_if_old']:
            stats = os.stat(args['save_file'])
            # Quit if it's less than 12 hours old
            if time.time() - stats.st_mtime < 3600 * 12:
                sys.exit(2)

    try:
        if args['log'] != sys.stderr:
            sys_stderr = sys.stderr
            sys.stderr = open(args['log'], 'a')

        current_changelog = __get_cached_changelog_last_serial(args['save_file'])
        if current_changelog:
            __update_pypi_package_file(args['save_file'], current_changelog)
        else:
            __create_pypi_package_file(args['save_file'])
    finally:
        if args['log'] != sys.stderr:
            sys.stderr.close()
            sys.stderr = sys_stderr

    sys.exit(0)
0707010000002C000081ED0000000000000000000000016548EB8C000092C1000000000000000000000000000000000000004600000000osc-plugin-collab-0.104+30/server/upstream/download-upstream-versions#!/usr/bin/env python3
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2009, Novell, Inc.
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301
# USA
#
# (Licensed under the LGPLv2.1 or later)
#
# Parts of this code comes from convert-to-tarball.py (in the releng GNOME
# svn module), which has the same license.
#
#
# Authors: Vincent Untz <[email protected]>
#

import os
import socket
import sys
import time

import ftplib
import hashlib
import optparse
from posixpath import join as posixjoin # Handy for URLs
import re
try:
    from sgmllib import SGMLParser
except ImportError:
    # sgmllib was removed in Python 3; this is a minimal stand-in built on
    # html.parser that dispatches <tag> / </tag> to the start_<tag>() /
    # end_<tag>() methods, which is all the listers below rely on.
    from html.parser import HTMLParser

    class SGMLParser(HTMLParser):
        def handle_starttag(self, tag, attrs):
            method = getattr(self, 'start_' + tag, None)
            if method is not None:
                method(attrs)

        def handle_endtag(self, tag):
            method = getattr(self, 'end_' + tag, None)
            if method is not None:
                method()
import shutil
import tempfile
import urllib.request, urllib.error, urllib.parse

import queue
import threading

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

import feedparser

from util import *

USE_DEBUG = False
MAX_THREADS = 10
URL_HASH_ALGO = 'md5'


#######################################################################


def _line_is_comment(line):
    return line.strip() == '' or line[0] == '#'


#######################################################################


# Fix some locations to point to what are really downloads.
def _location_fix(location):
    return location


#######################################################################


class UpstreamRawDownloadError(Exception):
    def __init__(self, value):
        self.msg = value

    def __str__(self):
        return repr(self.msg)


class UpstreamDownloadError(Exception):
    def __init__(self, value):
        self.msg = value

    def __str__(self):
        return repr(self.msg)


#######################################################################


# comes from convert-to-tarball.py
class urllister(SGMLParser):
    def reset(self):
        SGMLParser.reset(self)
        self.urls = []

    def start_a(self, attrs):
        href = [v for k, v in attrs if k=='href']
        if href:
            self.urls.extend(href)


#######################################################################


class svnurllister(SGMLParser):
    def reset(self):
        SGMLParser.reset(self)
        self.urls = []

    def start_file(self, attrs):
        href = [v for k, v in attrs if k=='href']
        if href:
            self.urls.extend(href)


#######################################################################


def _get_cache_paths_from_url(url, cache_dir):
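    # Cache entries are keyed by the hash of the URL; a sibling
    # "<digest>.error" file stores the download error message, if any.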
    if cache_dir:
        hash = hashlib.new(URL_HASH_ALGO)
        hash.update(url.encode('utf-8'))
        digest = hash.hexdigest()
        cache = os.path.join(cache_dir, digest)
        cache_error = cache + '.error'
        return (cache, cache_error)
    else:
        return (None, None)


#######################################################################


# based on code from convert-to-tarball.py
def _get_files_from_http(url, cache_dir):
    (cache, cache_error) = _get_cache_paths_from_url (url, cache_dir)

    if cache_error and os.path.exists(cache_error):
        raise UpstreamRawDownloadError(open(cache_error).read())
    elif cache and os.path.exists(cache):
        fin = open(cache)
    else:
        obj = urllib.request.build_opener()
        fin = obj.open(url)

    # Get the files
    data = fin.read()
    if isinstance(data, bytes):
        # urllib returns bytes; the cache files are opened in text mode
        data = data.decode('utf-8', errors='replace')
    parser = urllister()
    parser.feed(data)
    fin.close()
    parser.close()
    files = parser.urls

    return (url, files)


#######################################################################


# based on code from convert-to-tarball.py
def _get_files_from_ftp(url):
    parsed_url = urllib.parse.urlparse(url)

    ftp = ftplib.FTP(parsed_url.hostname)
    ftp.login(parsed_url.username or 'anonymous', parsed_url.password or '')
    ftp.cwd(parsed_url.path)
    files = ftp.nlst()
    ftp.quit()

    return (url, files)


#######################################################################


# based on code from convert-to-tarball.py
def _get_files_from_subdir_http(url, limit):
    obj = urllib.request.build_opener()
    # Note that we accept directories called 1.x.X
    good_dir = re.compile(r'^(([0-9]+|[xX])\.)*([0-9]+|[xX])/?$')
    def hasdirs(x): return good_dir.search(x)
    def fixdirs(x): return re.sub(r'^((?:(?:[0-9]+|[xX])\.)*)([0-9]+|[xX])/?$', r'\1\2', x)
    location = url
    # Follow 302 codes when retrieving URLs, speeds up conversion by 60sec
    redirect_location = location
    while True:
        # Get the files
        usock = obj.open(redirect_location)
        parser = urllister()
        parser.feed(usock.read().decode('utf-8', errors='replace'))
        usock.close()
        parser.close()
        files = parser.urls

        # Check to see if we need to descend to a subdirectory
        newdirs = list(filter(hasdirs, files))
        newdirs = list(map(fixdirs, newdirs))
        if newdirs:
            newdir = _get_latest_version(newdirs, limit)
            # This is a weird case, that we handled for compiz-fusion:
            # if the dir contains a subdir with the same name, then we stop
            # FIXME: make this an option in the metadata?
            if newdir == os.path.basename(redirect_location):
                break
            if not newdir:
                break

            # Add trailing slash since we work on directories, and some servers
            # don't work if we don't add the trailing slash (like lighttpd on
            # http://download.banshee.fm/banshee-community-extensions/)
            if newdir[-1] != '/':
                newdir += '/'

            redirect_location = posixjoin(usock.url, newdir)
            location = posixjoin(location, newdir)
        else:
            break

    return (location, files)


#######################################################################


def _get_files_from_svn(url):
    obj = urllib.request.build_opener()

    # Get the files
    usock = obj.open(url)
    parser = svnurllister()
    parser.feed(usock.read().decode('utf-8', errors='replace'))
    usock.close()
    parser.close()
    files = parser.urls

    return (url, files)


#######################################################################


def _setup_limit(limit):
    limit_data = None

    if not limit:
        pass
    elif limit == 'no-odd-unstable':
        pass
    elif limit[:4] == 'max|':
        limit_data = limit[4:]
        limit = 'max'
    elif limit[:5] == 'skip|':
        limit_data = limit[5:]
        limit = 'skip'
    else:
        print('Unsupported limit: %s' % limit, file=sys.stderr)
        limit = None

    return (limit, limit_data)

def _respect_limit(version, limit, limit_data):
    if not limit:
        return True
    elif limit == 'no-odd-unstable':
        # remove the part after dashes. Eg, in 3.3-1, we're only interested in
        # the 3.3 part.
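        # GNOME-style versioning: odd minor versions (e.g. 3.3.x) are
        # development releases, so only even minors are considered stable.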
        version = version.split('-')[0]
        split_version = version.split('.')
        if len(split_version) <= 1:
            # not enough version data, so let's just say yes
            return True

        try:
            return int(split_version[1]) % 2 == 0
        except:
            # second element is not an int. Let's just say yes
            return True

    elif limit == 'max':
        return version_gt(limit_data, version)

    elif limit == 'skip':
        skip_versions = [ x for x in limit_data.split(';') if x ]
        return not (version in skip_versions)

    else:
        return False


def _get_latest_version(versions, limit):
    (limit, limit_data) = _setup_limit(limit)

    biggest = None
    for version in versions:
        if _respect_limit(version, limit, limit_data):
            biggest = version
            break

    if not biggest:
        return None

    for version in versions[versions.index(biggest) + 1:]:
        if _respect_limit(version, limit, limit_data) and version_gt(version, biggest):
            biggest = version

    return biggest


#######################################################################


def _all_indexes(list, item, shift = 0):
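    # Returns every index of 'item' in 'list', in reverse order, e.g.
    # _all_indexes(['a', 'b', 'a'], 'a') == [2, 0]; callers reverse it.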
    try:
        i = list.index(item)
        real_index = i + shift
    except ValueError:
        return []

    subresult = _all_indexes(list[i+1:], item, real_index + 1)
    subresult.append(real_index)
    return subresult


# based on code from convert-to-tarball.py
def _get_version_from_files(modulename, location, files, limit):
    def is_of_interest(modulename, file):
        ''' The file is of interest if it contains the module name, and if
            either the basename of the URI matches a tarball for this module,
            or there's a query (like /download.cgi?filename=module-x.y.tar.gz)
        '''
        if not modulename in file:
            return False

        parsed = urllib.parse.urlparse(file)
        if os.path.basename(parsed.path).startswith(modulename):
            return True

        if parsed.query:
            return True

        return False

    # Only include tarballs for the given module
    tarballs = [file for file in files if is_of_interest(modulename, file)]

    # Remove fragment identifiers (anchors)
    tarballs_new = []
    for tarball in tarballs:
        index = tarball.rfind('#')
        if index != -1:
            tarball = tarball[:index]
        tarballs_new.append(tarball)
    tarballs = tarballs_new

    re_tarball = r'^.*' + re.escape(modulename) + r'[_-]v?(([0-9]+[.\-])*[0-9]+)\.(?:tar.*|t[bg]z2?)$'
    # Don't include -beta -installer -stub-installer and all kinds of
    # other weird-named tarballs
    tarballs = [t for t in tarballs if re.search(re_tarball, t)]

    versions = [re.sub(re_tarball, r'\1', t) for t in tarballs]

    if not len(versions):
        raise UpstreamDownloadError('No versions found')

    version = _get_latest_version(versions, limit)

    if not version:
        raise UpstreamDownloadError('No version found respecting the limits')

    indexes = _all_indexes(versions, version)
    # the list is not in the right order, because of the way we build the list
    indexes.reverse()

    latest = [tarballs[index] for index in indexes]

    tarballs = None
    if not tarballs:
        tarballs = [file for file in latest if file.endswith('.tar.xz')]
    if not tarballs:
        tarballs = [file for file in latest if file.endswith('.tar.bz2')]
    if not tarballs:
        tarballs = [file for file in latest if file.endswith('.tar.gz')]
    if not tarballs:
        tarballs = [file for file in latest if file.endswith('.tbz2')]
    if not tarballs:
        tarballs = [file for file in latest if file.endswith('.tgz')]

    if not tarballs:
        raise UpstreamDownloadError('No tarball found for version %s' % version)

    # at this point, all the tarballs we have are relevant, so just take the
    # first one
    tarball = tarballs[0]

    if urllib.parse.urlparse(tarball).scheme != '':
        # full URI
        location = tarball
    else:
        # remove files from location when we know it's not a directory
        if len(location) > 5 and location[-5:] in [ '.html' ]:
            last_slash = location.rfind('/')
            if last_slash != -1:
                location = location[:last_slash + 1]
        # add potentially missing slash to the directory
        if location[-1:] != '/':
            location = location + '/'
        location = urllib.parse.urljoin(location, tarball)

    return (location, version)


#######################################################################


def _fix_sf_location(url):
    if url and url.startswith('http://sourceforge.net/projects/') and url.endswith('/download'):
        # We want to move from:
        #    http://sourceforge.net/projects/gtkpod/files%2Fgtkpod%2Fgtkpod-2.0.2%2Fgtkpod-2.0.2.tar.gz/download
        # to:
        #    http://downloads.sourceforge.net/project/gtkpod/gtkpod/gtkpod-2.0.2/gtkpod-2.0.2.tar.gz

        # strip leading 'http://sourceforge.net/projects/' and trailing '/download'
        stripped = url[len('http://sourceforge.net/projects/'):-len('/download')]
        # find project name
        prjname = stripped[:stripped.find('/')]
        # find path to file
        files_index = stripped.find('/files%2F')
        if files_index != -1:
            path = stripped[files_index + len('/files%2F'):]
            path = path.replace('%2F', '/')
        else:
            files_index = stripped.find('/files/')
            path = stripped[files_index + len('/files/'):]

        return 'http://downloads.sourceforge.net/project/%s/%s' % (prjname, path)

    return url

def _get_version_from_sf_rss(modulename, id, limit):
    (limit, limit_data) = _setup_limit(limit)

    ids = id.split('|')
    url = 'http://sourceforge.net/api/file/index/project-id/%s/rss' % ids[0]
    if len(ids) > 1:
        # we do not want urlencode since spaces are %20 and not +
        url += '?path=/%s' % urllib.parse.quote(ids[1])

    feed = feedparser.parse(url)

    re_tarball = re.compile(r'^.*(?:/|%2F)' + re.escape(modulename) + r'[_-]((?:[0-9]+[.\-])*[0-9]+)\.(tar.*|t[bg]z2?)/')

    biggest = '0'
    location = None
    best_ext = None

    for entry in feed['entries']:
        unquoted_link = urllib.parse.unquote(entry.link)
        match = re_tarball.match(unquoted_link)
        if not match:
            continue

        version = match.group(1)
        ext = '.' + match.group(2)
        # skip anything strictly older than the best version so far; equal
        # versions go through the format preference check below
        if version_gt(biggest, version):
            continue
        if not _respect_limit(version, limit, limit_data):
            continue

        if biggest == version:
            # prefer .tar.xz over .tar.bz2 over .tar.gz
            if best_ext in [ '.tar.xz' ]:
                continue
            elif ext in [ '.tar.xz' ]:
                pass
            elif best_ext in [ '.tar.bz2', '.tbz2' ]:
                continue
            elif ext in [ '.tar.bz2', '.tbz2' ]:
                pass
            elif best_ext in [ '.tar.gz', '.tgz' ]:
                continue
            elif ext in [ '.tar.gz', '.tgz' ]:
                pass
            else:
                continue

        biggest = version
        location = entry.link
        best_ext = ext

    if biggest == '0' and location is None:
        biggest = None

    location = _fix_sf_location(location)

    return (location, biggest)


#######################################################################


def _fix_sf_jp_location(location, project):
    sf_jp = re.compile('^http://sourceforge.jp/projects/%s/downloads/([^/]+)/([^/]+)$' % project)
    match = sf_jp.match(location)
    if match:
        return 'http://sourceforge.jp/frs/redir.php?m=jaist&f=%%2F%s%%2F%s%%2F%s' % (project, match.group(1), match.group(2))

    return location


def _get_version_from_sf_jp(cache_dir, modulename, project, limit):
    url = 'http://sourceforge.jp/projects/%s/releases/' % project
    (location, files) = _get_files_from_http(url, cache_dir)

    # there's a trailing slash that will break _get_version_from_files(). For instance:
    # http://sourceforge.jp/projects/scim-imengine/downloads/29155/scim-canna-1.0.1.tar.gz/
    files = [ f[:-1] for f in files ]
    (location, version) = _get_version_from_files(modulename, location, files, limit)

    location = _fix_sf_jp_location(location, project)
    return (location, version)


#######################################################################


def _get_version_from_google_atom(name, limit):
    (limit, limit_data) = _setup_limit(limit)

    names = name.split('|')
    project = names[0]
    if len(names) > 1:
        tarball = names[1]
    else:
        tarball = project

    # See http://code.google.com/p/support/issues/detail?id=2926
    #url = 'http://code.google.com/feeds/p/%s/downloads/basic?%s' % (project, urllib.urlencode({'q': tarball}))
    url = 'http://code.google.com/feeds/p/%s/downloads/basic' % (project, )

    feed = feedparser.parse(url)

    version_re = re.compile(r'^\s*' + re.escape(tarball) + r'[_-]((?:[0-9]+\.)*[0-9]+)\.(tar.*|t[bg]z2?)')
    download_re = re.compile('<a href="([^"]*)">Download</a>')

    biggest = '0'
    location = None

    for entry in feed['entries']:
        match = version_re.match(entry.title)
        if not match:
            continue

        version = match.group(1)
        if not version_gt(version, biggest):
            continue
        if not _respect_limit(version, limit, limit_data):
            continue

        match = download_re.search(entry.content[0]['value'])
        if match:
            download_url = match.group(1)
        else:
            download_url = 'http://code.google.com/p/%s/downloads/list' % project
        biggest = version
        location = download_url

    if biggest == '0' and location is None:
        raise UpstreamDownloadError('No versions found')

    return (location, biggest)


#######################################################################


LP_NS = '{https://launchpad.net/rdf/launchpad#}'
RDF_NS = '{http://www.w3.org/1999/02/22-rdf-syntax-ns#}'

def _get_version_from_launchpad_series(project, limit, limit_data, series):
    url = 'https://launchpad.net/%s/%s/+rdf' % (project, series)
    release_re = re.compile(r'^/%s/%s/((?:[0-9]+\.)*[0-9]+)/\+rdf$' % (re.escape(project), re.escape(series)))
    biggest = '0'

    fd = urllib.request.urlopen(url)
    root = ET.parse(fd).getroot().find(LP_NS + 'ProductSeries')
    fd.close()

    for node in root.findall(LP_NS + 'release'):
        productrelease = node.find(LP_NS + 'ProductRelease')
        if productrelease is None:
            continue
        specified = productrelease.find(LP_NS + 'specifiedAt')
        release = specified.get(RDF_NS + 'resource')
        match = release_re.match(release)
        if not match:
            continue
        version = match.group(1)

        if not _respect_limit(version, limit, limit_data):
            continue

        if version_gt(version, biggest):
            biggest = version

    # TODO: this is blocked by https://bugs.launchpad.net/bugs/268359
    location = None

    return (location, biggest)


def _get_version_from_launchpad(project, limit):
    (limit, limit_data) = _setup_limit(limit)

    url = 'https://launchpad.net/%s/+rdf' % project
    series_re = re.compile(r'^/%s/((?:[0-9]+\.)*[0-9]+)/\+rdf$' % re.escape(project))

    fd = urllib.request.urlopen(url)
    root = ET.parse(fd).getroot().find(LP_NS + 'Product')
    fd.close()

    # always finish with trunk
    (location, biggest) = (None, '0')

    for node in root.findall(LP_NS + 'series'):
        product = node.find(LP_NS + 'ProductSeries')
        if product is None:
            continue
        specified = product.find(LP_NS + 'specifiedAt')
        series = specified.get(RDF_NS + 'resource')
        match = series_re.match(series)
        if not match:
            continue
        series_version = match.group(1)

        if not _respect_limit(series_version, limit, limit_data):
            continue

        if version_ge(biggest, series_version):
            continue

        (series_location, series_biggest) = _get_version_from_launchpad_series (project, limit, limit_data, series_version)
        if version_gt(series_biggest, biggest):
            (location, biggest) = (series_location, series_biggest)

    # try trunk, in case it exists
    try:
        (trunk_location, trunk_biggest) = _get_version_from_launchpad_series (project, limit, limit_data, 'trunk')
        if version_gt(trunk_biggest, biggest):
            (location, biggest) = (trunk_location, trunk_biggest)
    except UpstreamDownloadError:
        pass
    except urllib.error.HTTPError as e:
        if e.code != 404:
            raise e

    if location is None and biggest == '0':
        raise UpstreamDownloadError('No versions found')

    return (location, biggest)



#######################################################################


class trac_urllister(SGMLParser):
    def __init__(self, modulename):
        SGMLParser.__init__(self)
        self.modulename = modulename

    def reset(self):
        SGMLParser.reset(self)
        self.in_a = False
        self.current_url = None
        self.files = []

    def start_a(self, attrs):
        self.in_a = True
        href = [v for k, v in attrs if k=='href']
        if href:
            self.current_url = href[0]

    def handle_data(self, data):
        data = data.strip()
        if self.in_a and self.modulename in data:
            self.files.append([self.current_url, data])

    def end_a(self):
        self.in_a = False


def _get_version_from_trac(modulename, url, limit):
    # this is clearly based on _get_version_from_files, so read comments there

    obj = urllib.request.build_opener()

    # Get the files
    usock = obj.open(url)
    parser = trac_urllister(modulename)
    parser.feed(usock.read().decode('utf-8', errors='replace'))
    usock.close()
    parser.close()
    files = parser.files

    (limit, limit_data) = _setup_limit(limit)
    re_tarball = r'^.*' + re.escape(modulename) + r'[_-](([0-9]+[.\-])*[0-9]+)\.(?:tar.*|t[bg]z2?)$'
    tarballs = [t for t in files if re.search(re_tarball, t[1])]
    versions = [re.sub(re_tarball, r'\1', t[1]) for t in tarballs]
    version = _get_latest_version(versions, limit)

    indexes = _all_indexes(versions, version)
    # the list is not in the right order, because of the way we build the list
    indexes.reverse()

    latest = [tarballs[index] for index in indexes]

    tarballs = None
    if not tarballs:
        tarballs = [file for file in latest if file[1].endswith('.tar.xz')]
    if not tarballs:
        tarballs = [file for file in latest if file[1].endswith('.tar.bz2')]
    if not tarballs:
        tarballs = [file for file in latest if file[1].endswith('.tar.gz')]
    if not tarballs:
        tarballs = [file for file in latest if file[1].endswith('.tbz2')]
    if not tarballs:
        tarballs = [file for file in latest if file[1].endswith('.tgz')]

    if not tarballs:
        raise UpstreamDownloadError('No tarball found for version %s' % version)

    # first tarball is fine
    tarball = tarballs[0]
    semi_url = tarball[0]

    if urllib.parse.urlparse(semi_url).scheme != '':
        # full URI
        location = semi_url
    else:
        location = urllib.parse.urljoin(url, semi_url)

    return (location, version)


#######################################################################


def get_upstream_version(cache_dir, modulename, method, additional_info, limit):
    # for branches, get the real modulename
    modulename = modulename.split('|')[0]

    if method not in [ 'upstream', 'ftpls', 'httpls', 'dualhttpls', 'subdirhttpls', 'svnls', 'sf', 'sf_jp', 'google', 'lp', 'trac' ]:
        print('Unsupported method: %s' % method, file=sys.stderr)
        return (None, None)

    if method == 'upstream':
        return (None, '--')

    elif method == 'ftpls':
        (location, files) = _get_files_from_ftp(additional_info)
        return _get_version_from_files(modulename, location, files, limit)

    elif method == 'httpls':
        (location, files) = _get_files_from_http(additional_info, cache_dir)
        return _get_version_from_files(modulename, location, files, limit)

    elif method == 'dualhttpls':
        (url1, url2) = additional_info.split('|')
        (location1, files1) = _get_files_from_http(url1, cache_dir)
        (location2, files2) = _get_files_from_http(url2, cache_dir)
        try:
            (location1, version1) = _get_version_from_files(modulename, location1, files1, limit)
        except UpstreamDownloadError:
            (location1, version1) = (None, None)

        try:
            (location2, version2) = _get_version_from_files(modulename, location2, files2, limit)
        except UpstreamDownloadError:
            (location2, version2) = (None, None)

        if version1 and version2 and version_ge(version1, version2):
            return (location1, version1)
        elif version1 and version2:
            return (location2, version2)
        elif version1:
            return (location1, version1)
        elif version2:
            return (location2, version2)
        else:
            raise UpstreamDownloadError('No versions found')

    elif method == 'subdirhttpls':
        (location, files) = _get_files_from_subdir_http(additional_info, limit)
        return _get_version_from_files(modulename, location, files, limit)

    elif method == 'svnls':
        (location, files) = _get_files_from_svn(additional_info)
        return _get_version_from_files(modulename, location, files, limit)

    elif method == 'sf':
        return _get_version_from_sf_rss(modulename, additional_info, limit)

    elif method == 'sf_jp':
        return _get_version_from_sf_jp(cache_dir, modulename, additional_info, limit)

    elif method == 'google':
        return _get_version_from_google_atom(additional_info, limit)

    elif method == 'lp':
        return _get_version_from_launchpad(additional_info, limit)

    elif method == 'trac':
        return _get_version_from_trac(modulename, additional_info, limit)


#######################################################################


def parse_limits(limits_file):
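    # upstream-limits.txt lines are "module:limit", where limit is one of the
    # forms understood by _setup_limit(): "no-odd-unstable", "max|VERSION" or
    # "skip|VER1;VER2;..."; e.g. (hypothetical) "glib:max|2.90".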
    retval = {}

    if not os.path.exists(limits_file) or not os.path.isfile(limits_file):
        return retval

    file = open(limits_file)
    lines = file.readlines()
    file.close()

    for line in lines:
        if _line_is_comment(line):
            continue

        data = line[:-1].split(':', 2)
        retval[data[0]] = data[1]

    return retval


#######################################################################


def parse_data(data_file):
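    # Reads a previously saved results file; its lines are
    # "cat:module:version:location" and only the "nonfgo" entries are kept
    # as fallback data (empty fields become None).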
    retval = {}

    if not os.path.exists(data_file) or not os.path.isfile(data_file):
        return retval

    file = open(data_file)
    lines = file.readlines()
    file.close()

    for line in lines:
        if _line_is_comment(line):
            continue

        data = line[:-1].split(':', 3)
        if data[0] != 'nonfgo':
            continue

        if data[2] != '':
            version = data[2]
        else:
            version = None

        if data[3] != '':
            location = data[3]
        else:
            location = None

        retval[data[1]] = (version, location)

    return retval


#######################################################################


def do_task(cache_dir, modulename, method, additional_info, fast_update, limits, fallback_data):
    (location, version) = (None, None)

    if fast_update and modulename in fallback_data and fallback_data[modulename][0]:
        # fast update: we don't download data if we have something in cache
        pass
    else:
        if modulename in limits:
            limit = limits[modulename]
        else:
            limit = None

        try:
            (location, version) = get_upstream_version(cache_dir, modulename, method, additional_info, limit)
        except urllib.error.HTTPError as e:
            # HTTPError is a subclass of URLError, so it must be caught first
            print('Error when downloading information about %s: server sent %s' % (modulename, e.code), file=sys.stderr)
        except urllib.error.URLError as e:
            print('Error when downloading information about %s: %s' % (modulename, e), file=sys.stderr)
        except ftplib.all_errors as e:
            print('Error when downloading information about %s: %s' % (modulename, e), file=sys.stderr)
        except socket.timeout as e:
            print('Error when downloading information about %s: %s' % (modulename, e), file=sys.stderr)
        except UpstreamRawDownloadError as e:
            print('Error when downloading information about %s: %s' % (modulename, e.msg), file=sys.stderr)
        except UpstreamDownloadError as e:
            print('No matching tarball found for %s: %s' % (modulename, e.msg), file=sys.stderr)

    if modulename in fallback_data:
        fallback_version = fallback_data[modulename][0]
        fallback_location = fallback_data[modulename][1]

        if not version and not location:
            version = fallback_version
            location = fallback_location
        elif not version and location == fallback_location:
            version = fallback_version
        elif not location and version == fallback_version:
            location = fallback_location

    if version == '--':
        cat = 'upstream'
    else:
        cat = 'nonfgo'

    if location:
        location = _location_fix(location)

    return (cat, version, location)


#######################################################################


def do_cache(cache_dir, url):
    (cache, cache_error) = _get_cache_paths_from_url (url, cache_dir)

    fin = None
    fout = None
    error = None

    try:
        obj = urllib.request.build_opener()
        fin = obj.open(url)
        fout = open(cache, 'wb')

        fout.write(fin.read())
    except urllib.error.HTTPError as e:
        # HTTPError is a subclass of URLError, so it must be caught first
        error = 'server sent %s' % e.code
    except urllib.error.URLError as e:
        error = str(e)
    except socket.timeout as e:
        error = str(e)

    if fin:
        fin.close()
    if fout:
        fout.close()

    if error:
        fout = open(cache_error, 'w')
        fout.write(error)
        fout.close()


#######################################################################


def debug_thread(s):
    global USE_DEBUG

    if not USE_DEBUG:
        return

    name = threading.current_thread().name

    print('%s: %s' % (name, s))


#######################################################################


def thread_cache(task_cache, cache_dir):
    try:
        while True:
            if task_cache.empty():
                break

            # use get_nowait() so the thread cannot block forever if another
            # worker grabbed the last item after our empty() check
            url = task_cache.get_nowait()
            debug_thread('starting caching %s' % url)

            try:
                do_cache(cache_dir, url)
            except Exception as e:
                print('Exception in worker thread for caching %s: %s' % (url, e), file=sys.stderr)

            task_cache.task_done()

    except queue.Empty:
        pass


#######################################################################


def thread_upstream(cache_dir, task, result, fast_update, limits, fallback_data):
    try:
        while True:
            if task.empty():
                break

            # get_nowait() for the same reason as in thread_cache()
            (modulename, method, additional_info) = task.get_nowait()
            debug_thread('starting %s' % modulename)

            try:
                (cat, version, location) = do_task(cache_dir, modulename, method, additional_info, fast_update, limits, fallback_data)
                result.put((cat, modulename, version, location))
            except Exception as e:
                print('Exception in worker thread for %s: %s' % (modulename, e), file=sys.stderr)

            task.task_done()

    except queue.Empty:
        pass


#######################################################################


def start_threads(task_cache, cache_dir, task, result, fast_update, limits, fallback_data):
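    # Two phases: first warm the shared HTTP cache for all collected URLs
    # (so each URL is only downloaded once), then run the per-module
    # version lookups against that cache.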
    if cache_dir:
        # First do all the cache tasks
        for i in range(min(MAX_THREADS, task_cache.qsize())):
            t = threading.Thread(target=thread_cache, args=(task_cache, cache_dir))
            t.start()

        task_cache.join()

    # Then do all the remaining tasks
    for i in range(min(MAX_THREADS, task.qsize())):
        t = threading.Thread(target=thread_upstream, args=(cache_dir, task, result, fast_update, limits, fallback_data))
        t.start()

    task.join()


#######################################################################


def main(args):
    parser = optparse.OptionParser()

    parser.add_option('--debug', dest='debug',
                      help='process only this one "module:method:additional_info" line and write the result to stdout')
    parser.add_option('--log', dest='log',
                      help='log file to use (default: stderr)')
    parser.add_option('--directory', dest='dir', default='.',
                      help='directory where to find data and save data')
    parser.add_option('--save-file', dest='save',
                      help='path to the file where the results will be written')
    parser.add_option('--upstream-limits', dest='upstream_limits',
                      help='path to the upstream limits data file')
    parser.add_option('--upstream-tarballs', dest='upstream_tarballs',
                      help='path to the upstream tarballs data file')
    parser.add_option('--fast-update', action='store_true',
                      default=False, dest='fast_update',
                      help='when available, use old saved data instead of looking for new data (limits will be ignored)')
    parser.add_option('--use-old-as-fallback', action='store_true',
                      default=False, dest='fallback',
                      help='if available, use old saved data as a fallback for when we cannot find new data (limits will be ignored for the fallback case)')
    parser.add_option('--only-if-old', action='store_true',
                      default=False, dest='only_if_old',
                      help='execute only if the pre-existing result file is older than 10 hours')

    (options, args) = parser.parse_args()

    fallback_data = {}

    directory = options.dir

    if options.log:
        path = os.path.realpath(options.log)
        safe_mkdir_p(os.path.dirname(path))
        sys.stderr = open(options.log, 'a')

    if options.upstream_limits:
        limit_file = options.upstream_limits
    else:
        limit_file = os.path.join(directory, 'upstream-limits.txt')

    limits = parse_limits(limit_file)

    if options.debug:
        lines = [ options.debug + '\n' ]
        out = sys.stdout

    else:
        if options.upstream_tarballs:
            upstream_file = options.upstream_tarballs
        else:
            upstream_file = os.path.join(directory, 'upstream-tarballs.txt')

        if options.save:
            save_file = options.save
        else:
            save_file = os.path.join(directory, 'versions-upstream')

        if not os.path.exists(upstream_file):
            print('Upstream data file %s does not exist.' % upstream_file, file=sys.stderr)
            return 1
        elif not os.path.isfile(upstream_file):
            print('Upstream data file %s is not a regular file.' % upstream_file, file=sys.stderr)
            return 1

        if os.path.exists(save_file):
            if not os.path.isfile(save_file):
                print('Save file %s is not a regular file.' % save_file, file=sys.stderr)
                return 1
            if options.only_if_old:
                stats = os.stat(save_file)
                # Quit if it's less than 10 hours old
                if time.time() - stats.st_mtime < 3600 * 10:
                    return 2

            if options.fallback or options.fast_update:
                fallback_data = parse_data(save_file)
        else:
            safe_mkdir_p(os.path.dirname(save_file))

        file = open(upstream_file)
        lines = file.readlines()
        file.close()

        out = open(save_file, 'w')

    # The default timeout is just too long. Use 10 seconds instead.
    socket.setdefaulttimeout(10)

    to_cache = set()
    task_cache = queue.Queue()
    task = queue.Queue()
    result = queue.Queue()

    for line in lines:
        if _line_is_comment(line):
            continue

        (modulename, method, additional_info) = line[:-1].split(':', 2)
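        # Each data line of the upstream tarballs file is
        # "module:method:additional_info", where method is one of the schemes
        # handled by get_upstream_version(), e.g. (hypothetical)
        # "gnome-panel:httpls:https://download.gnome.org/sources/gnome-panel/".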

        # We will do a cache prefetch of all http url, so that we don't have to
        # open the same url several times
        if method in [ 'httpls', 'dualhttpls' ]:
            if method == 'httpls':
                to_cache.add(additional_info)
            elif method == 'dualhttpls':
                (url1, url2) = additional_info.split('|')
                to_cache.add(url1)
                to_cache.add(url2)

        task.put((modulename, method, additional_info))

    # We'll have to remove this temporary cache
    if not options.debug:
        cache_dir = tempfile.mkdtemp(prefix='upstream-cache-', dir=os.path.dirname(save_file))
    else:
        cache_dir = None

    for url in to_cache:
        task_cache.put(url)

    start_threads(task_cache, cache_dir, task, result, options.fast_update, limits, fallback_data)

    while not result.empty():
        (cat, modulename, version, location) = result.get()

        if version and location:
            out.write('%s:%s:%s:%s\n' % (cat, modulename, version, location))
        elif version:
            out.write('%s:%s:%s:\n' % (cat, modulename, version))
        elif location:
            out.write('%s:%s::%s\n' % (cat, modulename, location))
        else:
            out.write('%s:%s::\n' % (cat, modulename))

        result.task_done()

    # Remove temporary cache
    if cache_dir:
        shutil.rmtree(cache_dir)

    if not options.debug:
        out.close()

    return 0


if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
0707010000002D000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000003A00000000osc-plugin-collab-0.104+30/server/upstream/gnome-versions0707010000002E000081ED0000000000000000000000016548EB8C000043E9000000000000000000000000000000000000004C00000000osc-plugin-collab-0.104+30/server/upstream/gnome-versions/generate-versions#!/usr/bin/python3
# vim: set sw=4 ts=4 et:

import errno
import os
import sys

import json
import optparse
from operator import attrgetter
import urllib.parse
import rpm

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

JSON_FORMAT = 4

BRANCH_LIMITS = {
    'glib': '1.3',
    'gnome-desktop': '2.90',
    'gnome-menus': '3.1',
    'goffice': '0.9',
    'goocanvas': '1.90',
    'gtk+': '1.3',
    'gtk-engines': '2.90',
    'gtkmm': '2.90',
    'gtkmm-documentation': '2.90',
    'gtksourceview': ('1.9', '2.11'),
    'gtksourceviewmm': '2.90',
    'libgda': ('3.99', '4.99'),
    'libgnomedb': '3.99',
    'libsigc++': ('1.3', '2.99'),
    'libunique': '2',
    'libwnck': '2.90',
    'pygobject': '2.29',
    'vala': '0.15',
    'vte': '0.29',
# modules with an unstable branch as current branch
    'gmime': '2.5'
}

STABLE_BRANCH_SAME_LIMITS = {
# Modules with the same branch as something in the modulesets
    'anjuta-extras': 'anjuta',
    'eog-plugins': 'eog',
    'epiphany-extensions': 'epiphany',
    'evolution': 'evolution-data-server',
    'evolution-ews': 'evolution-data-server',
    'evolution-exchange': 'evolution-data-server',
    'evolution-groupwise': 'evolution-data-server',
    'evolution-kolab': 'evolution-data-server',
    'evolution-mapi': 'evolution-data-server',
    'gdl': 'anjuta',
    # Gone in 3.10:
    #'gnome-applets': 'gnome-panel',
    'gnome-shell-extensions': 'gnome-shell'
}

STABLE_BRANCHES_LIMITS = {
    '3.4': {
        'NetworkManager-openconnect': '0.9.5.0',
        'NetworkManager-openswan': '0.9.5.0',
        'NetworkManager-openvpn': '0.9.5.0',
        'NetworkManager-pptp': '0.9.5.0',
        'NetworkManager-vpnc': '0.9.5.0',
        'ghex': '3.5',
        'gtkhtml': '4.5',
        'libgda': '5.1',
        'libgdata': '0.13',
        'pyatspi': '2.5',
        'tomboy': '1.11'
     },
    '3.6': {
        'NetworkManager-openconnect': '0.9.7.0',
        'NetworkManager-openswan': '0.9.7.0',
        'NetworkManager-openvpn': '0.9.7.0',
        'NetworkManager-pptp': '0.9.7.0',
        'NetworkManager-vpnc': '0.9.7.0',
        'alacarte': '3.7',
        'ghex': '3.7',
        'glom': '1.23',
        'gnote': '3.7',
        'gtkhtml': '4.7',
        'libgda': '5.3',
        'libgdata': '0.15',
        'pyatspi': '2.7',
        'tomboy': '1.13'
     },
    '3.8': {
        'NetworkManager-openconnect': '0.9.9.0',
        'NetworkManager-openswan': '0.9.9.0',
        'NetworkManager-openvpn': '0.9.9.0',
        'NetworkManager-pptp': '0.9.9.0',
        'NetworkManager-vpnc': '0.9.9.0',
        'alacarte': '3.9',
        'ghex': '3.9',
        'glom': '1.25',
        'gnome-applets': '3.9',
        'gnome-panel': '3.9',
        'gnote': '3.9',
        'gtkhtml': '4.7',
        'libgda': '5.3',
        'pyatspi': '2.9',
        'tomboy': '1.15'
     },
    '3.10': {
        'gnome-applets': '3.11',
        'gnome-panel': '3.11'
     },
    '3.12': {
        'gnome-applets': '3.13',
        'gnome-panel': '3.13'
     },
    '3.14': {
        'gnome-applets': '3.15',
        'gnome-panel': '3.15'
     }
}

BLACKLISTED_SOURCES = [
    # Sources not using ftpadmin
    #Seems to use it now: 'banshee',
    # Sources that are now hosted elsewhere (and listing them from
    # ftp.gnome.org can be an issue).
    'abiword',
    'balsa',
    'clutter-gst',
    'gimp',
    'gnucash',
    'gst-python',
    'g-wrap',
    'intltool',
    'libgnomesu',
    'librep',
    'pkg-config',
    'rep-gtk',
    'sawfish',
    'startup-notification',
    'xchat',
    # Sources that we know have no cache.json
    'librep-2002-03',
    'rep-gtk-2002-03',
    'sawfish-2002-03',
    'xpenguins_applet',
    'labyrinth_0.4.0',
    'labyrinth_0.4.0rc3',
    'delme'
]

##################################################################
# All this code is taken from osc-plugin-collab
##################################################################

def safe_mkdir_p(dir):
    if not dir:
        return

    try:
        os.makedirs(dir)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise e

##################################################################
# End of code taken from osc-plugin-collab
##################################################################

##################################################################
# All this code is taken from convert-to-tarballs.py
##################################################################

def _bigger_version(a, b):
    rc = rpm.labelCompare(("0", a, "0"), ("0", b, "0"))
    if rc > 0:
        return a
    else:
        return b

# This is nearly the same as _bigger_version, except that
#   - It returns a boolean value
#   - If max_version is None, it just returns False
#   - It treats 2.13 as == 2.13.0 instead of 2.13 as < 2.13.0
# The second property is particularly important with directory hierarchies
def _version_greater_or_equal_to_max(a, max_version):
    if not max_version:
        return False
    rc = rpm.labelCompare(("0", a, "0"), ("0", max_version , "0"))
    if rc >= 0:
        return True
    else:
        return False

def _get_latest_version(versions, max_version):
    biggest = versions[0]
    for version in versions[1:]:
        # Just ignore '-' in versions
        if version.find('-') >= 0:
            version = version[:version.find('-')]
        if (version == _bigger_version(biggest, version) and \
            not _version_greater_or_equal_to_max(version, max_version)):
            biggest = version
    return biggest

##################################################################
# End of code taken from convert-to-tarballs.py
##################################################################


class Module:
    ''' Object representing a module '''

    def __init__(self, name, limit):
        self.name = name
        self.limit = limit
        self.version = ''

    def fetch_version(self, all_versions):
        if self.name not in all_versions:
            return
        versions = all_versions[self.name]
        latest = _get_latest_version(versions, self.limit)
        if latest:
            self.version = latest

    def get_str(self, release_set, subdir = None):
        if not self.version:
            prefix = '#'
        else:
            prefix = ''

        release_set = 'fgo'
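        # the release_set argument is deliberately overridden: every entry is
        # written with the "fgo" (ftp.gnome.org) category, and modules without
        # a known version are emitted commented out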
        if subdir:
            return '%s%s:%s:%s:%s\n' % (prefix, release_set, self.name, self.version, subdir)
        else:
            return '%s%s:%s:%s:\n' % (prefix, release_set, self.name, self.version)


class SubReleaseSet:
    ''' Object representing a sub-release set (like the bindings) (made of
        modules)
    '''

    def __init__(self, name):
        self.name = name
        self.modules = []

    def add(self, module):
        self.modules.append(module)

    def fetch_versions(self, all_versions):
        for module in self.modules:
            module.fetch_version(all_versions)

    def get_str(self, release_set):
        # Sort by name, then version (sorts are stable)
        self.modules.sort(key=attrgetter('version'))
        self.modules.sort(key=attrgetter('name'))

        res = '# %s\n' % self.name.title()
        for module in self.modules:
            res += module.get_str(release_set, self.name)
        res += '\n'

        return res


class ReleaseSet:
    ''' Object representing a release set (made of modules, and sub-release
        sets, like the bindings ones)
    '''
    
    def __init__(self, name):
        self.name = name
        self.subrelease_sets = []
        self.modules = []

    def add(self, module, subdir):
        if subdir:
            sub = self.find_subrelease_set(subdir)
            if sub is None:
                sub = SubReleaseSet(subdir)
                self.subrelease_sets.append(sub)
            sub.add(module)
        else:
            self.modules.append(module)

    def find_subrelease_set(self, subrelease_set):
        for sub in self.subrelease_sets:
            if sub.name == subrelease_set:
                return sub
        return None

    def fetch_versions(self, all_versions):
        for module in self.modules:
            module.fetch_version(all_versions)
        for sub in self.subrelease_sets:
            sub.fetch_versions(all_versions)

    def get_all_modules(self):
        res = []
        res.extend(self.modules)
        for sub in self.subrelease_sets:
            res.extend(sub.modules)
        return res

    def get_str(self):
        # Sort by name, then version (sorts are stable)
        self.modules.sort(key=attrgetter('version'))
        self.modules.sort(key=attrgetter('name'))
        self.subrelease_sets.sort(key=attrgetter('name'))

        res = '## %s\n' % self.name.upper()
        for module in self.modules:
            res += module.get_str(self.name)
        res += '\n'
        for sub in self.subrelease_sets:
            res += sub.get_str(self.name)

        return res


class Release:
    ''' Object representing a release (made of release sets) '''

    def __init__(self):
        self.release_sets = []

    def add(self, release_set, module, subdir):
        rel_set = self.find_release_set(release_set)
        if rel_set is None:
            rel_set = ReleaseSet(release_set)
            self.release_sets.append(rel_set)
        rel_set.add(module, subdir)

    def find_release_set(self, release_set):
        for rel_set in self.release_sets:
            if rel_set.name == release_set:
                return rel_set
        return None

    def fetch_versions(self, all_versions):
        for rel_set in self.release_sets:
            rel_set.fetch_versions(all_versions)

    def get_all_modules(self):
        res = []
        for rel_set in self.release_sets:
            res.extend(rel_set.get_all_modules())
        return res

    def get_str(self):
        res = ''
        for rel_set in self.release_sets:
            res += rel_set.get_str()
        return res


def get_release(tarball_conversion):
    ''' We take all packages under <whitelist> that have a non-empty 'set'
        attribute.
        Interesting examples:
        <package name="libwnck-2" module="libwnck" limit="2.90" set="core"/>
        <package name="seed"               subdir="js"     set="bindings" limit="2.33"/>
    '''
    rel = Release()

    root = ET.parse(tarball_conversion).getroot()
    for whitelist in root.findall('whitelist'):
        for package in whitelist.findall('package'):
            release_set = package.get('set')
            if not release_set:
                continue
            module = package.get('module') or package.get('name')
            limit = package.get('limit') or None
            subdir = package.get('subdir')

            mod = Module(module, limit)
            rel.add(release_set, mod, subdir)

    return rel


def fetch_all_versions(json_dir):
    ''' Get all versions for all modules installed on ftp.gnome.org, based on
        the json file.
    '''
    all_versions = {}

    for child in os.listdir(json_dir):
        if not os.path.isfile(os.path.join(json_dir, child)):
            continue

        if not child.endswith('.json'):
            continue

        module = child[:-5]
        json_file = os.path.join(json_dir, child)

        if module in BLACKLISTED_SOURCES:
            continue

        with open(json_file, 'rb') as f:
            j = json.load(f)
        json_format = j[0]
        if json_format != JSON_FORMAT:
            print('Format of cache.json for \'%s\' is %s while we support \'%s\'.' % (module, json_format, JSON_FORMAT), file=sys.stderr)
            continue

        json_format, json_info, json_versions, json_ignored = j

        versions = json_versions[urllib.parse.unquote(module)]
        versions.sort()

        if not versions:
            continue

        all_versions[urllib.parse.unquote(module)] = versions

    return all_versions


def get_extras_limit(module, release, stable_version):
    # Workaround https://bugzilla.gnome.org/show_bug.cgi?id=649331
    if module == 'dia':
        return '1'
    if not stable_version:
        return None

    if stable_version in STABLE_BRANCHES_LIMITS:
        limits = STABLE_BRANCHES_LIMITS[stable_version]
        if module in limits:
            return limits[module]

    if not release:
        return None
    if module not in STABLE_BRANCH_SAME_LIMITS:
        return None

    stable_module = STABLE_BRANCH_SAME_LIMITS[module]
    modules = release.get_all_modules()
    for m in modules:
        if m.name == stable_module:
            return m.limit

    print('Cannot find limit for \'%s\': no module \'%s\' in moduleset.' % (module, stable_module), file=sys.stderr)

    return None


def get_extras_versions(all_versions, release, stable_version):
    ''' Get the latest version of all modules (except the ones already in
        release), as well as the latest versions for all limits configured in
        the BRANCH_LIMITS variable for those modules. '''
    if release:
        modules_in_release = [ x.name for x in release.get_all_modules() ]
    else:
        modules_in_release = []

    res = []

    for (module, versions) in list(all_versions.items()):
        if module not in modules_in_release:
            limit = get_extras_limit(module, release, stable_version)
            latest = _get_latest_version(versions, limit)
            if latest:
                res.append((module, latest))

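        # Branch limits apply whether or not the module is part of the
        # release: each configured limit yields its own capped latest version.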
        if module in BRANCH_LIMITS:
            limits_module = BRANCH_LIMITS[module]
            if isinstance(limits_module, str):
                latest = _get_latest_version(versions, limits_module)
                if latest:
                    res.append((module, latest))
            elif isinstance(limits_module, tuple):
                for limit in limits_module:
                    latest = _get_latest_version(versions, limit)
                    if latest:
                        res.append((module, latest))
            else:
                print('Unknown limit format \'%s\' for \'%s\'.' % (limits_module, module), file=sys.stderr)

    return res


def main(args):
    parser = optparse.OptionParser()
    parser.add_option("-s", "--stable-version", dest="stable_version",
                      help="stable branch to consider", metavar="VERSION")
    parser.add_option("-c", "--conversion-config", dest="conversion_config",
                      help="tarball-conversion config file", metavar="FILE")
    parser.add_option("-d", "--output-dir", dest="output_dir",
                      help="output dir", metavar="DIR")
    parser.add_option("-j", "--json-dir", dest="json_dir",
                      help="JSON cache dir", metavar="DIR")

    (options, args) = parser.parse_args()

    release = None
    all_versions = None

    if options.conversion_config is not None:
        if not os.path.exists(options.conversion_config):
            print('tarball-conversion config file \'%s\' does not exist.' % options.conversion_config, file=sys.stderr)
            return 1
        try:
            release = get_release(options.conversion_config)
        except SyntaxError as e:
            print('Cannot parse tarball-conversion config file \'%s\': %s' % (options.conversion_config, e), file=sys.stderr)
            return 1
        if len(release.get_all_modules()) == 0:
            print('Parsing tarball-conversion config file \'%s\' resulted in no module in release sets.' % options.conversion_config, file=sys.stderr)
            return 1

    if options.stable_version and options.stable_version not in STABLE_BRANCHES_LIMITS:
        print('No defined limits for stable version \'%s\'.' % options.stable_version, file=sys.stderr)

    if options.json_dir is None:
        print('JSON cache directory must be specified.', file=sys.stderr)
        return 1
    if not os.path.exists(options.json_dir) or not os.path.isdir(options.json_dir):
        print('JSON cache directory \'%s\' is not a directory.' % options.json_dir, file=sys.stderr)
        return 1

    all_versions = fetch_all_versions(options.json_dir)
    if release is not None:
        release.fetch_versions(all_versions)
    extras_versions = get_extras_versions(all_versions, release, options.stable_version)
    extras_versions.sort()

    if options.output_dir is None:
        output_dir = '.'
    else:
        output_dir = options.output_dir
        if os.path.exists(output_dir):
            if not os.path.isdir(output_dir):
                print('Output directory \'%s\' is not a directory.' % output_dir, file=sys.stderr)
                return 1
        else:
            safe_mkdir_p(output_dir)

    if release is not None:
        out = open(os.path.join(output_dir, 'versions'), 'w')
        out.write(release.get_str())
        out.close()

    out = open(os.path.join(output_dir, 'versions-extras'), 'w')
    out.write('## EXTRAS\n')
    for (module, version) in extras_versions:
        #out.write('%s:%s:%s:\n' % ('extras', module, version))
        out.write('%s:%s:%s:\n' % ('fgo', module, version))
    out.close()

    return 0

if __name__ == '__main__':
    try:
        ret = main(sys.argv)
        sys.exit(ret)
    except KeyboardInterrupt:
        pass
0707010000002F000081ED0000000000000000000000016548EB8C0000131C000000000000000000000000000000000000004A00000000osc-plugin-collab-0.104+30/server/upstream/gnome-versions/update-versions#!/bin/sh

VERBOSE=0

TMPDIR=$(mktemp -d)

if test $# -ne 1; then
  echo "Usage: $(basename $0) DEST-DIR"
  exit 1
fi

DESTDIR=$1

GENERATEVERSIONS=$(readlink -f $(dirname $0))/generate-versions

die_if_error () {
	if test $? -ne 0; then
		if test "x$1" != "x"; then
			echo $1
		else
			echo "Unknown error"
		fi
		rm -rf $TMPDIR
		exit 1
	fi
}

echo_verbose () {
        if test $VERBOSE -ne 0; then
                echo "$*"
        fi
}


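# Find the newest stable release series (3.x, or 4x and later) published
# under https://download.gnome.org/core/.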
DOWNLOAD_STABLE="`curl --tlsv1 --silent --fail https://download.gnome.org/core/ | grep 'a href=".*/"' | sed 's/.*href="//g;s/\/".*//g' | grep -P "^(3\.|4)" | sort -g | tail -n 1`"
#TEMPORARY_STABLE="41"

if test -z "$DOWNLOAD_STABLE"; then
  echo "Cannot find stable release from download.gnome.org."
  exit 1
fi

if test -n "$TEMPORARY_STABLE" -a "x$DOWNLOAD_STABLE" = "x$TEMPORARY_STABLE"; then
	echo "TEMPORARY_STABLE hack can be removed"
fi

if test -n "$TEMPORARY_STABLE"; then
	STABLE="$TEMPORARY_STABLE"
else
	STABLE="$DOWNLOAD_STABLE"
fi

STABLE_MAJOR=`echo $STABLE | sed "s/\(^[0-9]\+\.\).*/\1/g"`
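# For 40-style version numbers (no dot) the sed above leaves the value
# untouched, so the bc call below simply computes the next series (e.g. 44 -> 45).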

UNSTABLE="$(echo $STABLE_MAJOR +1 | bc)"

echo_verbose "Stable: $STABLE - Unstable: $UNSTABLE"

mkdir -p $DESTDIR
die_if_error "Error while creating destination directory"

cd $TMPDIR
die_if_error "Cannot change directory to $TMPDIR"

if test -z "$GNOME_OFFLINE"; then
  curl --tlsv1 --silent --show-error --output $TMPDIR/sources.html                     https://download.gnome.org/sources/
  die_if_error "Error while downloading list of sources"

  if test -d $TMPDIR/json-cache; then
    rm -f $TMPDIR/json-cache/*
    rmdir $TMPDIR/json-cache
  fi

  if test -e $TMPDIR/json-cache; then
    echo "JSON cache directory still exists."
    exit 1
  fi

  mkdir $TMPDIR/json-cache
  die_if_error "Error while creating JSON cache directory"

  for dir in $(cat $TMPDIR/sources.html | grep 'a href=".*/"' | sed 's/.*href="//g;s/".*//g'); do
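    # Only entries ending in '/' are module directories; anything else, and
    # the parent-directory link, is skipped.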
    module=${dir%%/}
    if test "$dir" == "$module" -o "$dir" == "../"; then
      continue
    fi
    for try in 1 2 3; do
      # --fail/-f: do not output HTTP 40x error pages
      # --location/-L: follow redirects
      curl --tlsv1 --silent --fail --location https://download.gnome.org/sources/$module/cache.json > $TMPDIR/json-cache/$module.json
      test $? -eq 0 -o $? -eq 22 && break

      if test $try -eq 3; then
	echo "Cannot download cache.json for $module"
	exit 1
      fi

      sleep 3
    done
  done

  curl --tlsv1 --silent --show-error --output $TMPDIR/tarball-conversion.config        https://gitlab.gnome.org/GNOME/releng/raw/master/tools/smoketesting/tarball-conversion.config
  die_if_error "Error while downloading tarball-conversion.config"
  curl --tlsv1 --silent --show-error --output $TMPDIR/tarball-conversion-stable.config https://gitlab.gnome.org/GNOME/releng/raw/master/tools/smoketesting/tarball-conversion-${STABLE/./-}.config
  die_if_error "Error while downloading tarball-conversion-stable.config"
fi

echo_verbose "Generating stable versions..."
$GENERATEVERSIONS --json-dir=$TMPDIR/json-cache --output-dir=$TMPDIR --conversion-config=$TMPDIR/tarball-conversion-stable.config --stable-version=$STABLE
die_if_error "Error while creating stable versions"
mv $TMPDIR/versions $DESTDIR/gnome-$STABLE
die_if_error "Error while moving stable versions"
cp -f $DESTDIR/gnome-$STABLE $DESTDIR/gnome-stable
die_if_error "Error while copying the stable versions"
mv $TMPDIR/versions-extras $DESTDIR/gnome-$STABLE-extras
die_if_error "Error while moving stable extras versions"
cp -f $DESTDIR/gnome-$STABLE-extras $DESTDIR/gnome-stable-extras
die_if_error "Error while copying the stable extras versions"

echo_verbose "Generating unstable versions..."
$GENERATEVERSIONS --json-dir=$TMPDIR/json-cache --output-dir=$TMPDIR --conversion-config=$TMPDIR/tarball-conversion.config
die_if_error "Error while creating unstable versions"
mv $TMPDIR/versions $DESTDIR/gnome-$UNSTABLE
die_if_error "Error while moving unstable versions"
cp -f $DESTDIR/gnome-$UNSTABLE $DESTDIR/gnome-unstable
die_if_error "Error while copying the unstable versions"
mv $TMPDIR/versions-extras $DESTDIR/gnome-$UNSTABLE-extras
die_if_error "Error while moving unstable extras versions"
cp -f $DESTDIR/gnome-$UNSTABLE-extras $DESTDIR/gnome-unstable-extras
die_if_error "Error while copying the unstable extras versions"

rm -rf $TMPDIR

# To update a versions file for an old stable version:
# - Get the tarball-conversion-stable.config from git, when the stable version
#   was still the old stable version and put it in ~/local/share/
# - Then:
#   cd ~/local/tmp
#   export VERSION=2.28
#   ~/local/bin/generate-versions --output-dir=/home/users/vuntz/local/tmp/ --conversion-config=/home/users/vuntz/local/share/tarball-conversion-$VERSION.config
#   mv versions ~/public_html/tmp/versions/versions-$VERSION
#   mv versions-extras ~/public_html/tmp/versions/versions-$VERSION-extras
07070100000030000081ED0000000000000000000000016548EB8C00002789000000000000000000000000000000000000003100000000osc-plugin-collab-0.104+30/server/upstream/runme#!/bin/sh
# vim: set ts=4 sw=4 et:

#
# Copyright (c) 2008-2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

basedir=`dirname $0`

## Options
# What's the current GNOME version in Factory
# Note: when moving to unstable, also remove the unneeded limits in upstream-limits.txt
GNOME_FACTORY_VERSION=stable


## Basic setup

CACHE_DIR=./cache
CONFIG_FILE=
LOG_FILE=

usage() {
    echo "Usage: $0 [-o CONF-FILE] [-l LOG-FILE]"
    echo ""
    echo "Options:"
    echo "   -o CONF-FILE     Use CONF-FILE as configuration file"
    echo "   -l LOG-FILE      Use LOG-FILE to log errors"
}

while getopts o:l:h option; do
    case $option in
    o) CONFIG_FILE=$OPTARG;;
    l) LOG_FILE=$OPTARG;;
    h|help) usage; exit 0;;
    *) usage; exit 1;;
    esac
done

if test "x$CONFIG_FILE" != "x"; then
    if test ! -f $CONFIG_FILE; then
        echo >&2 "Configuration file $CONFIG_FILE does not exist."
        exit 1
    else
        OBS_OPTIONS_CACHE_DIR=`grep "^ *cache-dir =" $CONFIG_FILE | sed "s/.*= *\(.*\) *$/\1/g" | tail -n 1`
        test "x$OBS_OPTIONS_CACHE_DIR" != "x" && CACHE_DIR=$OBS_OPTIONS_CACHE_DIR
    fi
fi

mkdir -p $CACHE_DIR

##############################################################
# Download latest upstream versions
# For non-GNOME:Factory, we only care about the official GNOME modules.

concatenate_all_versions () {
    DESTFILE=$CACHE_DIR/upstream/latest

    rm -f $DESTFILE.new

    for file in $CACHE_DIR/upstream/gnome-$GNOME_FACTORY_VERSION \
                $CACHE_DIR/upstream/gnome-$GNOME_FACTORY_VERSION-extras \
                $CACHE_DIR/upstream/upstream; do
        if test -f $file; then
            cat $file >> $DESTFILE.new
        fi
    done

    if test $? -ne 0; then
        echo "Error while creating the merged latest upstream versions file"
        return 1
    fi

    # we do everything above in a temporary file so that errors are safely
    # ignored, and so that we can compare the result (and keep the old file
    # with the old mtime if there's no change)
    cmp --quiet $DESTFILE.new $DESTFILE
    if test $? -ne 0; then
        mv $DESTFILE.new $DESTFILE
    else
        rm -f $DESTFILE.new
    fi
}

download_gnome_version () {
    VERSION=$1
    if test "x$1" = "x"; then
        return 1
    fi

    DESTFILE=$CACHE_DIR/upstream/gnome-$VERSION
    rm -f $DESTFILE.new

    wget -q -nc -O $DESTFILE.new http://www.gnome.org/~vuntz/tmp/versions/versions-$VERSION

    if test $? -ne 0; then
        echo "Error while checking for new GNOME upstream versions ($VERSION)"
        return 1
    fi

    # Don't use gstreamer from ftp.gnome.org -- it can be outdated
    sed -i "s/^\(desktop:gst-plugins.*\)$/# \1/g;s/^\(desktop:gstreamer:.*\)$/# \1/g" $DESTFILE.new
    # We don't care about mobile stuff
    sed -i "s/^\(mobile:.*\)$/# \1/g" $DESTFILE.new
    # Let's name the group fgo, instead of core, apps, extras, etc.
    sed -i "s/^[^#:][^:]*:/fgo:/g" $DESTFILE.new

    cmp --quiet $DESTFILE.new $DESTFILE
    if test $? -ne 0; then
        mv $DESTFILE.new $DESTFILE
    else
        rm -f $DESTFILE.new
    fi
}

download_cpan_version () {
    DESTFILE=$CACHE_DIR/upstream/cpan
    rm -f $DESTFILE.new

    # -a will keep the mtime
    test -f $DESTFILE && cp -a $DESTFILE $DESTFILE.new

    LOG_OPTION=
    if test "x$LOG_FILE" != "x"; then
        LOG_OPTION="--log $LOG_FILE"
    fi

    $basedir/download-cpan-versions $LOG_OPTION \
        --save-file=$DESTFILE.new \
        --only-if-old
    RETVAL=$?

    if test $RETVAL -eq 2; then
        # No update was done (old file was not old enough)
        rm -f $DESTFILE.new
        return 2
    fi

    if test $RETVAL -ne 0; then
        echo "Error while checking for new upstream versions on CPAN"
        rm -f $DESTFILE.new
        return 1
    fi

    sort -u $DESTFILE.new > $DESTFILE.new.sorted
    mv $DESTFILE.new.sorted $DESTFILE.new

    cmp --quiet $DESTFILE.new $DESTFILE
    if test $? -ne 0; then
        mv $DESTFILE.new $DESTFILE
    else
        rm -f $DESTFILE.new
    fi
}

download_pypi_version () {
    DESTFILE=$CACHE_DIR/upstream/pypi
    rm -f $DESTFILE.new

    # -a will keep the mtime
    test -f $DESTFILE && cp -a $DESTFILE $DESTFILE.new

    LOG_OPTION=
    if test "x$LOG_FILE" != "x"; then
        LOG_OPTION="--log $LOG_FILE"
    fi

    $basedir/download-pypi-versions $LOG_OPTION \
        --save-file=$DESTFILE.new \
        --only-if-old
    RETVAL=$?

    if test $RETVAL -eq 2; then
        # No update was done (old file was not old enough)
        rm -f $DESTFILE.new
        return 2
    fi

    if test $RETVAL -ne 0; then
        echo "Error while checking for new upstream versions on pypi"
        rm -f $DESTFILE.new
        return 1
    fi

    sort -u $DESTFILE.new > $DESTFILE.new.sorted
    mv $DESTFILE.new.sorted $DESTFILE.new

    cmp --quiet $DESTFILE.new $DESTFILE
    if test $? -ne 0; then
        mv $DESTFILE.new $DESTFILE
    else
        rm -f $DESTFILE.new
    fi
}

download_fallback_version () {
    DESTFILE=$CACHE_DIR/upstream/fallback
    rm -f $DESTFILE.new

    # -a will keep the mtime
    test -f $DESTFILE && cp -a $DESTFILE $DESTFILE.new

    LOG_OPTION=
    if test "x$LOG_FILE" != "x"; then
        LOG_OPTION="--log $LOG_FILE"
    fi

    $basedir/download-fallback-versions $LOG_OPTION \
        --save-file=$DESTFILE.new
    RETVAL=$?

    if test $RETVAL -eq 2; then
        # No update was done (old file was not old enough)
        rm -f $DESTFILE.new
        return 2
    fi

    if test $RETVAL -ne 0; then
        echo "Error while checking for fallback of new upstream versions"
        rm -f $DESTFILE.new
        return 1
    fi

    cmp --quiet $DESTFILE.new $DESTFILE
    if test $? -ne 0; then
        mv $DESTFILE.new $DESTFILE
    else
        rm -f $DESTFILE.new
    fi
}

download_upstream_version () {
    DESTFILE=$CACHE_DIR/upstream/upstream

    # -a will keep the mtime
    test -f $DESTFILE && cp -a $DESTFILE $DESTFILE.new

    LOG_OPTION=
    if test "x$LOG_FILE" != "x"; then
        LOG_OPTION="--log $LOG_FILE"
    fi

    $basedir/download-upstream-versions $LOG_OPTION \
        --upstream-tarballs=$basedir/upstream-tarballs.txt \
        --upstream-limits=$basedir/upstream-limits.txt \
        --save-file=$DESTFILE.new \
        --only-if-old --use-old-as-fallback
    RETVAL=$?

    if test $RETVAL -eq 2; then
        # No update was done (old file was not old enough)
        rm -f $DESTFILE.new
        return 2
    fi

    if test $RETVAL -ne 0; then
        echo "Error while checking for new upstream versions"
        rm -f $DESTFILE.new
        return 1
    fi

    cmp --quiet $DESTFILE.new $DESTFILE
    if test $? -ne 0; then
        mv $DESTFILE.new $DESTFILE
    else
        rm -f $DESTFILE.new
    fi
}

mkdir -p $CACHE_DIR/status
mkdir -p $CACHE_DIR/upstream

## Discontinued
# download_gnome_version 2.26
# download_gnome_version 2.28
# download_gnome_version 2.30
# download_gnome_version 2.32
# download_gnome_version 3.0
# download_gnome_version 3.2
# download_gnome_version 3.4
# download_gnome_version 3.6
# download_gnome_version 3.6-extras
# download_gnome_version 3.8
# download_gnome_version 3.8-extras

#download_gnome_version 3.12
#download_gnome_version 3.12-extras
# Disabled because of infrastructure change on GNOME servers
#download_gnome_version stable
#download_gnome_version unstable
#download_gnome_version stable-extras
#download_gnome_version unstable-extras

# Do this once, before the slow step
concatenate_all_versions

download_cpan_version
download_pypi_version
download_fallback_version

download_upstream_version
if test $? -eq 0; then
    concatenate_all_versions
fi

# Check that we have everything in the match database; we only do this once per
# day to avoid sending mails every X minutes.
MATCH_CHECK_TIMESTAMP=0
MATCH_CHECK_FILE="$CACHE_DIR/status/upstream-match-check"
if test -f "$MATCH_CHECK_FILE"; then
    MATCH_CHECK_TIMESTAMP=`stat --format="%Y" "$MATCH_CHECK_FILE"`
    MATCH_CHECK_TIMESTAMP=`echo "$MATCH_CHECK_TIMESTAMP + 24 * 3600" | bc`
fi
if test "$MATCH_CHECK_TIMESTAMP" -lt "`date +%s`"; then
    for i in `grep -v '^#' $CACHE_DIR/upstream/latest | grep ':' | cut -d ':' -f 2`; do
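        # Escape '+' in module names (e.g. gtk+) so that grep -E treats it
        # literally in the pattern below.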
        re_i=`echo $i | sed 's/\+/\\\\\\+/g'`
        grep -q -E "^(# ?)?$re_i[:|]" $basedir/upstream-packages-match.txt
        if test $? -ne 0; then
            echo $i not in $basedir/upstream-packages-match.txt
        fi
    done
    echo "Last check for upstream match database completeness: `date --rfc-3339=seconds`" > "$MATCH_CHECK_FILE"
fi
07070100000031000081A40000000000000000000000016548EB8C000007E9000000000000000000000000000000000000003F00000000osc-plugin-collab-0.104+30/server/upstream/upstream-limits.txt# This file is used to express limits we want to have on new upstream tarballs.
#
# It's possible to decide to:
#
#  + not have unstable versions (for modules following the x.y.z release
#    scheme, where it's unstable when y is odd)
#    Use the no-odd-unstable instruction for this behavior.
#
#  + not have a version greater than or equal to a specified version.
#    Use the "max|x.y" instruction for this behavior.
#
#  + not have specific versions.
#    Use the "skip|x.y;a.b;..." instruction for this behavior.
#
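#  + have limits that apply only to a specific branch, by appending the
#    branch name to the module name with a '|'. For example,
#    "geoclue|1.0:max|1.90" caps only the 1.0 branch of geoclue (see the
#    "branches" section below).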

# LaTeXPlugin has some tarballs with a date instead of version
LaTeXPlugin:max|2000
# bash-completion has some tarballs with a date instead of version
bash-completion:max|2008
# ht has some tarballs with a date instead of version
ht:max|2000
# iptables has a 030513 tarball...
iptables:max|10000
# libflashsupport has some tarballs with a date instead of version
libflashsupport:max|2000
# libofx used 0.11 as 0.1.1...
libofx:max|0.10
xf86-video-mga:skip|1.9.99;1.9.100

# we don't want unstable versions of the following modules
gq:no-odd-unstable
swfdec:no-odd-unstable
swfdec-mozilla:no-odd-unstable

# branches
geoclue|1.0:max|1.90
gobby|0.4:max|0.4.90
gst-plugins-bad|0.10:max|0.11.0
gst-plugins-base|0.10:max|0.11.0
gst-plugins-good|0.10:max|0.11.0
gst-plugins-ugly|0.10:max|0.11.0
gst-python|0.10:max|0.11.0
gstreamer|0.10:max|0.11.0
udisks|1.90:max|1.90
webkitgtk|2.4:max|2.5

## At least for next release
#PackageKit:no-odd-unstable
#dbus:no-odd-unstable
#libdmapsharing:no-odd-unstable
#liferea:no-odd-unstable
## Modules that follow a six-months cycle, and will be released too late for
## next release
#clutter:no-odd-unstable
#deja-dup:no-odd-unstable
#folks:no-odd-unstable
#gssdp:no-odd-unstable
#gupnp:no-odd-unstable
#gupnp-av:no-odd-unstable
#gwibber:no-odd-unstable
#libgdata:no-odd-unstable
#pixman:no-odd-unstable
#telepathy-gabble:no-odd-unstable
#telepathy-glib:no-odd-unstable
#telepathy-mission-control:no-odd-unstable
#telepathy-sofiasip:no-odd-unstable
#webkitgtk:no-odd-unstable
07070100000032000081A40000000000000000000000016548EB8C00005CA7000000000000000000000000000000000000004700000000osc-plugin-collab-0.104+30/server/upstream/upstream-packages-match.txt# Format of this file:
# upstreamname:packagename
# If packagename is the same as upstreamname, then you can leave it empty
# If tracking a specific branch of upstream, then upstreamname should look like
# this: "realupstreamname|branchname". For example: gobby|0.4
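# When the names differ, list both explicitly; for example, "glib:glib2" maps
# the upstream glib module to the glib2 package.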

# Please keep the list alphabetically sorted

BuildStream:buildstream
Coherence:python-coherence
ConsoleKit:
CouchDB:python-couchdb
DeviceKit-disks:
DeviceKit-power:
DeviceKit:
GConf:gconf2
Glib:perl-Glib
Gtk2:perl-Gtk2
LaTeXPlugin:gedit-latex-plugin
LibRaw:libraw
ModemManager:
NetworkManager-iodine:
NetworkManager-openconnect:
NetworkManager-openswan:
NetworkManager-openvpn:
NetworkManager-pptp:
NetworkManager-strongswan:
NetworkManager-vpnc:
NetworkManager:
ORBit2:orbit2
ORBit:orbit
PackageKit:
PackageKit-Qt:
PackageKit-Qt:PackageKit-Qt5
Pango:perl-Pango
PolicyKit-gnome:
QtCurve-Gtk2:qtcurve-gtk2
QtCurve-Gtk3:qtcurve-gtk3
UPnP-Inspector:upnp-inspector
abiword-docs:
abiword:
accerciser:
accountsservice:
acpid:
adwaita-icon-theme:
aegisub:
aisleriot:
alacarte:
alarm-clock-applet:
almanah:
anjuta-extras:
anjuta:
anthy:
apache-couchdb:couchdb
appdata-tools:
appres:
appstream-glib:
aqbanking:
arista:
arping:arping2
asio:
at-spi2-atk:
at-spi2-core:
at-spi:
atheme-services:atheme
atk:
atkmm:
atomix:
audiofile:
autocutsel:
autofs:
avahi:
avahi:avahi-glib2
avahi:avahi-mono
avahi:avahi-qt4
babl:
bakefile:
bakery:
balsa:
banshee-community-extensions:
banshee:
baobab:
bash-completion:
bdftopcf:
beagle-xesam:
beagle:
beforelight:
bfsync:
bindfs:
bigboard:
bijiben:
bitmap:
blktrace:
blueproximity:
bombermaze:
bot-sentry:
brasero:
brltty:
btrfs-progs:btrfsprogs
bug-buddy:
byzanz:
c++-gtk-utils:
c-ares:libcares2
cachefilesd:
cairo-clock:
cairo-compmgr:
cairo:
cairomm:
california:
calls:
cantarell-fonts:
caribou:
ccgfs:
ccsm:compizconfig-settings-manager
cdecl:
cdfs:
check:
cheese:
cherrytree:
chrome-gnome-shell:
chmlib:
chmsee:
claws-mail-extra-plugins:
claws-mail:
cloop:
clutter-gst:
clutter-gtk:
clutter:
cmsfs:
cmuclmtk:
cogl:
colorblind:
colord:
colord-gtk:
comix:
compiz-bcop:
compiz-fusion-plugins-extra:
compiz-fusion-plugins-main:
compiz-fusion-plugins-unsupported:
compiz:
compizconfig-backend-gconf:libcompizconfig-backend-gconf
compizconfig-backend-kconfig:libcompizconfig-backend-kconfig
compizconfig-python:python-compizconfig
computertemp:
conduit:
conglomerate:
conntrack-tools:
cromfs:
csmash:
cups-pk-helper:
davfs2:
d-feet:
dasher:
dbus-glib:dbus-1-glib
dbus:dbus-1
dconf:
dconf-editor:
dd_rescue:
ddrescue:gnu_ddrescue
decibel-audio-player:
dee:
deja-dup:
deskbar-applet:
desktop-data-model:
desktop-file-utils:
desktop-translations:
desktopcouch:python-desktopcouch
devhelp:
devilspie:
dia:
diffutils:
ding:
djvulibre:
dleyna-server:
dmapi:
dmlangsel:
docky:
dogtail:
dosfstools:
drwright:
dwarves:
e2fsprogs:
easytag:
ed:
editres:
eds-feed:evolution-galago
eel:
efax-gtk:
eiciel:
ekiga:
emacs:
emerald:compiz-emerald
emerillon:
empathy:
enchant:
eog-plugins:
eog:
epiphany-extensions:
epiphany:
esound:
espeak-gui:
evince:
evolution-data-server:
evolution-ews:
evolution-exchange:
evolution-groupwise:
evolution-mapi:
evolution-rss:
evolution-sharp:
evolution-tray:
evolution-webcal:
evolution:
exempi:
exfat-utils:
exfatprogs:
exiv2:
f-spot:
farsight2:
farsight:
farstream:
fcitx:
fcitx-anthy:
fcitx-chewing:
fcitx-cloudpinyin:
fcitx-configtool:
fcitx-fbterm:
fcitx-googlepinyin:
fcitx-hangul:
fcitx-libpinyin:
fcitx-m17n:
fcitx-rime:
fcitx-sayura:
fcitx-sunpinyin:
fcitx-table-extra:
fcitx-table-other:
fcitx-ui-light:
fcitx-unikey:
file-roller:
file-shrimp:
fillmore-lombard:
five-or-more:
flickrapi:python-flickrapi
folks:
font-util:
fonttosfnt:
four-in-a-row:
freerdp:FreeRDP
freetype:freetype2
frei0r-plugins:
frogr:
fslsfonts:
fstobdf:
fuse:fuse3
fusefs:exfat:fuse-exfat
fyre:
g-wrap:
gail:
gaim-galago:
galago-daemon:
galago-gtk-sharp:
galago-sharp:
gbrainy:
gcab:
gcalctool:
gccmakedep:
gcin:
gconf-editor:
gconfmm:
gcr:
gcstar:
gdata.py:python-gdata
gdesklets:gDesklets
gdk-pixbuf:
gdl:
gdlmm:
gdm:
gdome2:
geany-plugins:
geany:
geary:
gedit-code-assistance:
gedit-collaboration:
gedit-cossa:
gedit-latex:
gedit-plugins:
gedit:
geeqie:
gegl:
genius:
geoclue:geoclue2
geoclue|1.0:geoclue
geocode-glib:
gexiv2:
gfbgraph:
gftp:
ggz-client-libs:
ghex:
gi-docgen:python-gi-docgen
giggle:
gimp-dds:
gimp-gap:
gimp-help:
gimp-lqr-plugin:
gimp-save-for-web:
gimp:
gir-repository:
girl:
git:
gitg:
giver:
gjs:
gkrellm:
glabels:
glade3:
glade:
glew:
glib-networking:
glib-openssl:
glib:glib2
glib|1.3:glib
glibmm:glibmm2
glipper:
glitz:
glom:
gmetadom:
gmime:
gnac:
gnet:
gnokii:
gnome-2048:
gnome-activity-journal:
gnome-applets:
gnome-audio:
gnome-autoar:
gnome-backgrounds:
gnome-battery-bench:
gnome-blog:
gnome-bluetooth:
gnome-boxes:
gnome-build:
gnome-builder:
gnome-calculator:
gnome-calendar:
gnome-characters:
gnome-chess:
gnome-clocks:
gnome-code-assistance:
gnome-color-chooser:
gnome-color-manager:
gnome-colors:gnome-colors-icon-theme
gnome-commander:
gnome-common:
gnome-connections:
gnome-contacts:
gnome-control-center:
gnome-desktop:gnome-desktop
gnome-desktop|2.90:gnome-desktop2
gnome-devel-docs:
gnome-dictionary:
gnome-directory-thumbnailer:
gnome-disk-utility:
gnome-do-plugins:
gnome-do:
gnome-doc-utils:
gnome-documents:
gnome-dvb-daemon:
gnome-epub-thumbnailer:
gnome-font-viewer:
gnome-games-extra-data:
gnome-games:
gnome-getting-started-docs:
gnome-gmail-notifier:
gnome-gmail:
gnome-icon-theme-extras:
gnome-icon-theme-symbolic:
gnome-icon-theme:
gnome-initial-setup:
gnome-internet-radio-locator:
gnome-js-common:
gnome-keyring-sharp:
gnome-keyring:
gnome-kiosk:
gnome-klotski:
gnome-libs:
gnome-logs:
gnome-mag:
gnome-mahjongg:
gnome-main-menu:
gnome-maps:
gnome-media:
gnome-menus:
gnome-menus|3.1:gnome-menus-legacy
gnome-mime-data:
gnome-mines:
gnome-mount:
gnome-multi-writer:
gnome-music:
gnome-netstatus:
gnome-nettool:
gnome-news:
gnome-nibbles:
gnome-online-accounts:
gnome-online-miners:
gnome-packagekit:
gnome-panel:
gnome-phone-manager:
gnome-photos:
gnome-pilot-conduits:
gnome-pilot:
gnome-power-manager:
gnome-presence-applet:
gnome-python-desktop:
gnome-python-extras:python-gnome-extras
gnome-python:python-gnome
gnome-radio:
gnome-recipes:
gnome-remote-desktop:
gnome-reset:
gnome-robots:
gnome-schedule:
gnome-screensaver:
gnome-screenshot:
gnome-search-tool:
gnome-session:
gnome-settings-daemon:
gnome-sharp:gnome-sharp2
gnome-shell-extensions:
gnome-shell:
gnome-software:
gnome-sound-recorder:
gnome-speech:
gnome-spell:gnome-spell2
gnome-subtitles:
gnome-sudoku:
gnome-system-log:
gnome-system-monitor:
gnome-taquin:
gnome-terminal:
gnome-tetravex:
gnome-text-editor:
gnome-themes-extras:
gnome-themes-standard:
gnome-themes:
gnome-todo:
gnome-tour:
gnome-tweak-tool:
gnome-tweaks:
gnome-usage:
gnome-user-docs:
gnome-user-share:
gnome-utils:
gnome-vfs-monikers:
gnome-vfs-obexftp:
gnome-vfs:gnome-vfs2
gnome-vfsmm:
gnome-video-effects:
gnome-weather:
gnome-web-photo:
gnomeicu:
gnonlin:
gnopernicus:
gnote:
gnucash-docs:
gnucash:
gnumeric:
gnupg:gpg2
gob2:
gobby:
gobby|0.4:gobby04
gobject-introspection:
gocr:
goffice:
goffice|0.9:goffice-0_8
gok:
gom:
gonvert:
goobox:
goocanvas:
goocanvas|1.90:goocanvas1
goocanvasmm:
google-gadgets-for-linux:google-gadgets
gourmet:
gpa:
gparted:
gpaste:
gpgme:
gpick:
gpodder:
gq:
gqview:
gramps:
grilo-plugins:
grilo:
grisbi:
gromit:
gsettings-desktop-schemas:
gsound:
gspell:
gssdp:
gst-plugins-bad:gstreamer-plugins-bad
gst-plugins-bad|0.10:gstreamer-0_10-plugins-bad
gst-plugins-base:gstreamer-plugins-base
gst-plugins-base|0.10:gstreamer-0_10-plugins-base
gst-plugins-farsight:gstreamer-0_10-plugins-farsight
gst-plugins-gl:gstreamer-0_10-plugins-gl
gst-plugins-good:gstreamer-plugins-good
gst-plugins-good|0.10:gstreamer-0_10-plugins-good
gst-plugins-ugly:gstreamer-plugins-ugly
gst-plugins-ugly|0.10:gstreamer-0_10-plugins-ugly
gst-python:python-gstreamer
gst-python:python3-gstreamer
gst-python|0.10:python-gstreamer-0_10
gst-rtsp:
gstreamer:
gstreamer:gstreamer-doc
gstreamer|0.10:gstreamer-0_10
gstreamer|0.10:gstreamer-0_10-doc
gstreamer-editing-services:
gsynaptics:
gtef:
gtetrinet:
gtg:
gthumb:
gtk+|1.3:gtk
gtk+|2.90:gtk2
gtk+|3.89:gtk3
gtk:gtk4
gtk-doc:
gtk-engines-cleanice:gtk2-engine-cleanice
gtk-engines|2.90:gtk2-engines
gtk-recordmydesktop:gtk-recordMyDesktop
gtk-sharp:gtk-sharp2
gtk-vnc:
gtk-vnc:gtk-vnc2
gtkglext:
gtkhotkey:
gtkhtml:
gtkimageview:
gtkmathview:
gtkmm-documentation:
gtkmm-documentation|2.90:gtkmm2-documentation
gtkmm:gtkmm4
gtkmm|3.89:gtkmm3
gtkmm|2.90:gtkmm2
gtkpbbuttons:
gtkpod:
gtksourceview|1.9:gtksourceview18
gtksourceview|2.90:gtksourceview2
gtksourceview|3.90:gtksourceview
gtksourceview|4.90:gtksourceview4
gtksourceview:gtksourceview5
gtksourceviewmm:
gtksourceviewmm|2.90:gtksourceviewmm2
gtkspell:
gtranslator:
guake:
gucharmap:
gupnp-av:
gupnp-dlna:
gupnp-igd:
gupnp-tools:
gupnp-ui:
gupnp:
gurlchecker:
gvfs:
gwenhywfar:
gwget:
gwibber:
gypsy:
gzip:
harfbuzz:
hamster-applet:
hamster-time-tracker:
hicolor-icon-theme:
hippo-canvas:
hitori:
hotssh:
ht:
hxtools:
hyena:
iagno:
ibus:
ibus-anthy:
ibus-chewing:
ibus-gjs:
ibus-googlepinyin:
ibus-hangul:
ibus-input-pad:
ibus-m17n:
ibus-pinyin:
ibus-qt:
ibus-rime:
ibus-sunpinyin:
ibus-table:
ibus-table-chinese:
ibus-table-extraphrase:
ibus-table-jyutping:
ibus-table-others:
ibus-table-zhengma:
ibus-table-zhuyin:
ibus-table-ziranma:
ibus-unikey:
iceauth:
ico:
icon-naming-utils:
iio-sensor-proxy:
ima-evm-utils:
imake:
inkscape:
input-pad:
intel-gpu-tools:
intltool:
json-glib:
ipod-sharp:
iputils:
iproute2:
iptables:
iptraf-ng:iptraf
irda-utils:irda
iso-codes:
istanbul:
itstool:
iwatch:
jhbuild:
json-glib:
jsonrpc-glib:
kcm-fcitx:
kimtoy:
krb5-auth-dialog:
kye:
kyotocabinet:
lasem:
latexila:
lbxproxy:
ldtp:
libFS:
libHX:
libICE:
libIDL:libidl
libSM:
libWindowsWM:
libX11:
libXScrnSaver:
libXTrap:
libXau:
libXaw:
libXcomposite:
libXcursor:
libXdamage:
libXdmcp:
libXevie:
libXext:
libXfixes:
libXfont:
libXfontcache:
libXft:
libXi:
libXinerama:
libXmu:
libXp:
libXpm:
libXprintAppUtil:
libXprintUtil:
libXrandr:
libXrender:
libXres:
libXt:
libXtst:
libXv:
libXvMC:
libXxf86dga:
libXxf86misc:
libXxf86vm:
libadwaita:
libao-pulse:
libart_lgpl:
libassuan:
libatasmart:
libatomic_ops:
libbeagle:
libbonobo:
libbonoboui:
libbraille:
libbs2b:
libbtctl:
libcanberra:
libchamplain:
libchewing:
libcompizconfig:
libcroco:
libcryptui:
libdaemon:
libdatrie:
libdmapsharing:
libdmx:
libdrm:
libdv:
libebml:
libedit:
libepc:
libepoxy:
libesmtp:
libfontenc:
libgadu:
libgail-gnome:
libgalago-gtk:
libgalago:
libgames-support:
libgcrypt:
libgda:
libgda|3.99:libgda3
libgdamm:
libgdata:
libdazzle:
libgee:
libgexiv2:
libggz:ggz
libghttp:
libgit2:
libgit2-glib:
libglade:libglade2
libglademm:
libgnome-keyring:
libgnome-media-profiles:
libgnome:
libgnomecanvas:
libgnomecanvasmm:
libgnomecups:
libgnomedb:
libgnomedb|3.99:libgnomedb3
libgnomekbd:
libgnomemm:
libgnomeprint:
libgnomeprintui:
libgnomesu:
libgnomeui:
libgnomeuimm:
libgooglepinyin:
libgovirt:
libgpg-error:
libgpod:
libgrss:
libgsasl:
libgsf:
libgssh:
libgsystem:
libgtkhtml:
libgtksourceviewmm:
libgtop:
libgudev:
libgusb:
libgweather:
libgxps:
libhandy:
libhangul:
libical:
libical-glib:
libinfinity:
libiptcdata:
libjingle:
libksba:
liblbxutil:
liblouis:
liblouis:python-louis
libmatroska:
libmbim:
libmcs:
libmediaart:
libmnl:
libmodman:
libmowgli:
libnma:
libnetfilter_conntrack:
libnetfilter_log:
libnetfilter_queue:
libnfnetlink:
libnice:
libnjb:
libnotify:
libofetion:
libofx:
liboil:
liboldX:
libopenraw:
libosinfo:
libpanelappletmm:
libpciaccess:
libpeas:
libpinyin:
libplist:
libproxy:
libproxy:libproxy-plugins
libpst:
libpwquality:
libqmi:
librep:
librime:
librsvg:
libsecret:
libsexy:
libsigc++:libsigc++3
libsigc++|2.99:libsigc++2
libsigc++|1.3:libsigc++12
libslab:
libsocialweb:
libsoup:
libspectre:
libtasn1:
libtelepathy:
libthai:
libturpial:
libunique:
libunique|2:libunique1
libvirt-cim:
libvirt-glib:
libvirt:
libvpx:
libwacom:
libwebp:
libwnck:
libwnck|2.90:libwnck2
libxcb:
libxkbcommon:
libxkbfile:
libxkbui:
libxklavier:
libxml++:
libxml:
libzapojit:
libzeitgeist:
liferea:
lightsoff:
link-grammar:
listres:
lmms:
lndir:
loudmouth:
m4:
m17n-contrib:
m17n-db:
m17n-lib:
mail-notification:
makedepend:
mangler:
md5deep:
media-explorer:
media-player-info:
meld:
memphis:
memprof:
mergeant:
metacity:
metatheme-Sonar:gtk2-metatheme-sonar
metatheme-gilouche:gtk2-metatheme-gilouche
mercurial:
mkcomposecache:
mkfontdir:
mkfontscale:
mkvtoolnix:
moc:
mobile-broadband-provider-info:
mod_dnssd:apache2-mod_dnssd
monsoon:
moserial:
mousetweaks:
mozilla-bonobo:
mbpurple:
mrim-prpl:pidgin-mrim
mtr:
muine:
murrine:gtk2-engine-murrine
mutter:
mutter-wayland:
mx:
nautilus-actions:
nautilus-cd-burner:
nautilus-open-terminal:
nautilus-python:python-nautilus
nautilus-search-tool:
nautilus-sendto:
nautilus-share:
nautilus-terminal:
nautilus:
nemiver:
neon:
net6:
netspeed_applet:gnome-netspeed-applet
network-manager-applet:NetworkManager-applet
nfs-utils:
nimbus:gtk2-metatheme-nimbus
njb-sharp:
notification-daemon:
notify-osd:
notify-python:python-notify
npth:
nspluginwrapper:
nss-mdns:
ntfs-3g_ntfsprogs:ntfs-3g
ntfs-3g_ntfsprogs:ntfs-3g_ntfsprogs
ntfs-config:
nuntius:
obby:
obex-data-server:
oclock:
onboard:
online-desktop:
opal:
opencc:
openfetion:
openobex:
opensuse-font-fifth-leg:fifth-leg-font
opus:
orc:
orca:
ori:
osm-gps-map:
ostree:
p11-kit:
padevchooser:
pam_mount:
paman:
pango:
pangomm:
pangomm|2.47:pangomm1_4
pangox-compat:
paprefs:
papyon:
pavucontrol:
pavuk:
pavumeter:
pcre:
pdfmod:
pessulus:
phodav:
pidgin-advanced-sound-notification:
pidgin-birthday-reminder:
pidgin-embeddedvideo:
pidgin-facebookchat:
pidgin-guifications:
pidgin-openfetion:
pidgin-otr:
pidgin-sipe:
pidgin:
pinentry:
pino:
pinpoint:
pipewire:
pithos:
pitivi:
pixman:
pkg-config:
planner:
polari:
polkit-gnome:
polkit:
poppler-data:
poppler:
posixovl:
postr:
powerpc-utils:
ppc64-diag:
presage:
procmeter3:procmeter
proxymngr:
psiconv:
psmisc:
ptlib:libpt2
pulseaudio:
purple-plugin-pack:
py2cairo:python-cairo
pyatspi:python-atspi
pyatspi:python3-atspi
pycairo:python3-cairo
pycups:python-cups
pygobject:python-gobject
pygobject:python3-gobject
pygobject|2.29:python-gobject2
pygobject|2.29:python3-gobject2
pygoocanvas:python-goocanvas
pygtk:python-gtk
pygtkglext:python-gtkglext
pygtksourceview:python-gtksourceview
pymetar:python-pymetar
pymsn:python-msn
pyorbit:python-orbit
pypoppler:
pysmbc:python-smbc
python-distutils-extra:
python-espeak:
python-xlib:
pywebkitgtk:python-webkitgtk
pyxdg:python-xdg
qiv:
qmmp:
quadrapassel:
quilt:
radiotray:
raptor:
rarian:
rasqal:
raw-thumbnailer:
recordmydesktop:
redland:
rednotebook:
remmina:Remmina
rendercheck:
rep-gtk:
rest:librest
retro-gtk:
rgb:
rhythmbox:
rlwrap:
rstart:
rygel:
sabayon:
sawfish:
schismtracker:
schroedinger:
scim:
scim-anthy:
scim-bridge:
scim-canna:
scim-chewing:
scim-hangul:
scim-input-pad:
scim-m17n:
scim-pinyin:
scim-qtimm:
scim-skk:
scim-sunpinyin:
scim-tables:
scim-tomoe:
scim-uim:
scim-unikey:
scrollkeeper:
scummvm:
seahorse-nautilus:
seahorse-plugins:
seahorse-sharing:
seahorse:
seed:
seed:seed2
scripts|xorg:xorg-scripts
sessreg:
setxkbmap:
shared-color-profiles:
shared-color-targets:
shared-desktop-ontologies:
shared-mime-info:
shotwell:
showfont:
simple-ccsm:
simple-scan:
smproxy:
smuxi:
snappy:snappy-player
sobby:
sofia-sip:
solang:
sound-juicer:
sound-theme-freedesktop:
sparkleshare:
specto:
speech-dispatcher:
spheres-and-crystals:gtk2-metatheme-spheres-and-crystals
spice-gtk:
spice-protocol:
spice:
ssh-contact:
sshfp:
startup-notification:
stk:
sunpinyin:
sushi:
swell-foop:
swfdec-gnome:
swfdec-mozilla:
swfdec:
swig:
synapse:
sysprof:
system-config-printer:
tali:
tangerine:
tango-icon-theme:
tasks:
tasque:
telegnome:
telepathy-butterfly:
telepathy-farsight:
telepathy-farstream:
telepathy-gabble:
telepathy-glib:
telepathy-haze:
telepathy-idle:
telepathy-logger:
telepathy-mission-control:
telepathy-python:python-telepathy
telepathy-rakia:
telepathy-salut:
telepathy-sofiasip:
telepathy-stream-engine:
template-glib:
tepl:
the-board:
tig:
tilda:
tinyproxy:
tokyocabinet:
tomboy:
totem-pl-parser:
totem:
tracker:
tracker-miners:
traffic-vis:
transmageddon:
transmission:
tsclient:
turpial:
twitux:
twm:
udisks:udisks2
udisks|1.90:udisks
uget:
uhttpmock:
ulogd:ulogd2
unico:
update-desktop-files:
upower:
usbredir:
usbview:
vala:
vala:vala-unstable
vala|0.13:vala-0_12
valencia:
varnish:
vboxgtk:
viewres:
vim:
vinagre:
vino:
virtkey:python-virtkey
vmfs-tools:
vobject:
vte:
vte|0.29:vte2
wadptr:
weather-wallpaper:
webkitgtk|2.4:libwebkit
webkitgtk|2.4:libwebkit3
webkitgtk|2.4:webkitgtk
webkitgtk|2.4:webkitgtk3
webkitgtk:webkit2gtk3
wmakerconf:
x-tile:
x11perf:
x11vnc:
xauth:
xbacklight:
xbiff:
xbitmaps:
xcalc:
xcb-util-image:
xcb-util-keysyms:
xcb-util-renderutil:
xcb-util-wm:
xcb-util:
xchat-gnome:
xchat:
xclipboard:
xclock:
xcmsdb:
xcompmgr:
xconsole:
xcursor-themes:
xcursorgen:
xdbedizzy:
xdelta:
xdg-app:
xdg-desktop-portal:
xdg-desktop-portal-gnome:
xdg-desktop-portal-gtk:
xdg-user-dirs-gtk:
xdg-user-dirs:
xdg-utils:
xditview:
xdm:
xdpyinfo:
xedit:
xev:
xeyes:
xf86dga:
xf86-input-evdev:
xf86-input-joystick:
xf86-input-keyboard:
xf86-input-libinput:
xf86-input-mouse:
xf86-input-synaptics:
xf86-input-vmmouse:
xf86-input-void:
xf86-input-wacom:
xf86-video-ark:
xf86-video-ast:
xf86-video-ati:
xf86-video-cirrus:
xf86-video-dummy:
xf86-video-fbdev:
xf86-video-geode:
xf86-video-glint:
xf86-video-i128:
xf86-video-intel:
xf86-video-ivtv:xorg-x11-driver-video-ivtv
xf86-video-mach64:
xf86-video-mga:
xf86-video-neomagic:
xf86-video-newport:
xf86-video-nv:
xf86-video-qxl:
xf86-video-r128:
xf86-video-radeonhd:xorg-x11-driver-video-radeonhd
xf86-video-savage:
xf86-video-siliconmotion:
xf86-video-sis:
xf86-video-tdfx:
xf86-video-tga:
xf86-video-trident:
xf86-video-v4l:
xf86-video-vesa:
xf86-video-vmware:
xf86-video-voodoo:
xfd:
xfindproxy:
xfontsel:
xfs:
xfsdump:
xfsinfo:
xfsprogs:
xfwp:
xgamma:
xgc:
xhost:
xinit:
xinput:
xkbcomp:
xkbevd:
xkbprint:
xkbutils:
xkeyboard-config:
xkill:
xload:
xlogo:
xlsatoms:
xlsclients:
xlsfonts:
xmag:
xman:
xmessage:
xmh:
xmodmap:
xmore:
xorgxrdp:
xorg-cf-files:
xorg-docs:
xorg-sgml-doctools:
xosd:
xplsprinters:
xpr:
xprehashprinterlist:
xprop:
xrandr:
xrdb:
xrdp:
xrefresh:
xrestop:
xrx:
xsane:
xscope:
xset:
xsetmode:
xsetpointer:
xsetroot:
xsm:
xstdcmap:
xtables-addons:
xtrans:
xtrap:
xvidtune:
xvinfo:
xwd:
xwininfo:
xwud:
xzgv:
yaml-cpp:
yelp-tools:
yelp-xsl:
yelp:
zeitgeist-datahub:
zeitgeist:
zenity:
zim:
zlib:

##
## Packages with issues when tracking
##
## The way libnl is packaged is a bit too complex since it depends on the version
# libnl:

##
## Package with no upstream page anymore
##
# dopi:

##
## Packages where we are upstream
## It's not required to list -branding-{openSUSE,SLED} packages.
##
beagle-index:
build-compare:
bundle-lang-common:
bundle-lang-gnome:
bundle-lang-gnome-extras:
bundle-lang-kde:
bundle-lang-other:
desktop-data-openSUSE:
desktop-data-SLED:
dynamic-wallpapers-11x:
ggreeter:
gnome-patch-translation:
gnome-shell-search-provider-openSUSE-packages:
gos-wallpapers:
gtk2-themes:
libsolv:
libzypp:
libzypp-bindings:
libzypp-testsuite-tools:
metacity-themes:
opt_gnome-compat:
tennebon-dynamic-wallpaper:
translation-update:
yast2:
yast2-control-center-gnome:
zypp-plugin:
zypper:

##
## Packages that we removed
##
# clutter-cairo:
# fast-user-switch-applet:
# gnome-cups-manager:
# gnome-volume-manager:
# gst-pulse:gstreamer-0_10-pulse
# last-exit:
# libssui:
# libsvg:
# libsvg-cairo:
## This is the old mission-control. Not needed anymore (unless we want to package the 4.x branch)
# mission-control:telepathy-mission-control
# pysqlite:python-sqlite2

## TODO (should get packaged):
#librsvgmm:

## Stuff on ftp.gnome.org we don't handle
#GConf-dbus:
#GSAPI:
#GnomeHello:
#Guppi:
#abi:
#abispell:
#accountsdialog:
#acme:
#alleyoop:
#ammonite:
#anjal:
#aravis:
#at-poke:
#banter:
#battfink:
#billreminder:
#blam:
#bonobo:
#bonobo-activation:
#bonobo-conf:
#bonobo-config:
#cairo-java:
#camorama:
#capuchin:
#capuchin-glib:
#chronojump:
#clutter-box2dmm:
#clutter-cairomm:
#clutter-gtkmm:
#cluttermm:
#cluttermm_tutorial:
#contacts:
#control-center:
#control-center-plus:
#couchdb-glib:
#crux:
#dates:
#deskscribe:
#discident-glib:
#dots:
#dryad:
#ease:
#easytag:
#ee:
#eggcups:
#evolution-activesync:
#evolution-caldav:
#evolution-couchdb:
#evolution-data-server-dbus:
#evolution-jescs:
#evolution-kolab:
#evolution-scalix:
#firestarter:
#fontilus:
#fpm:
#g-print:
#gASQL:
#gDesklets:
#gabber:
#gal:
#galeon:
#gamin:
#garnome:
#gazpacho:
#gb:
#gedit2:
#gegl-vala:
#geglmm:
#gevice:
#gfax:
#gfloppy:
#gget:
#ggv:
#gide:
#gio-standalone:
#glibwww:
#glimmer:
#glom-postgresql-setup:
#gmdns:
#gmf:
#gmime:
#gnome-admin:
#gnome-boxes-nonfree:
#gnome-braille:
#gnome-chart:
#gnome-core:
#gnome-crash:
#gnome-db:
#gnome-debug:
#gnome-desktop-testing:
#gnome-file-selector:
#gnome-getting-started-docs:
#gnome-gpg:
#gnome-guile:
#gnome-hello:
#gnome-initial-setup:
#gnome-jabber:
#gnome-keyring-manager:
#gnome-launch-box:
#gnome-linuxconf:
#gnome-lirc-properties:
#gnome-lokkit:
#gnome-mud:
#gnome-nds-thumbnailer:
#gnome-netinfo:
#gnome-network:
#gnome-objc:
#gnome-perfmeter:
#gnome-pim:
#gnome-print:
#gnome-specimen:
#gnome-vfs-extras:
#gnome-video-arcade:
#gnome-xcf-thumbnailer:
#gnome2-user-docs:
#gnome_js_common:
#gnome_speech:
#gnomemeeting:
#gnomemm:
#gnomemm-all:
#gnomemm_hello:
#gnomoku:
#gnorpm:
#gnotepad+:
#gob:
#googlizer:
#gopersist:
#gpdf:
#gst-plugins:
#gstreamermm:
#gswitchit-plugins:
#gswitchit_plugins:
#gtk-css-engine:
#gtk-mac-bundler:
#gtk-mac-integration:
#gtk-theme-engine-clearlooks:
#gtk-thinice-engine:
#gtkglarea:
#gtkglextmm:
#gtkmm_hello:
#gtkmozedit:
#gtkmozembedmm:
#gtm:
#gtop:
#gturing:
#guile-gobject:
#gupnp-vala:
#guppi:
#gwt-glom:
#gxml:
#gyrus:
#hipo:
#imlib:
#iogrind:
#jamboree:
#java-access-bridge:
#java-atk-wrapper:
#java-gnome:
#java-libglom:
#kbdraw:
#kiwi:
#libPropList:
#libbonobomm:
#libbonobouimm:
#libcapplet:
#libccc:
#libcm:
#libeds-java:
#libgda-uimm:
#libgnetwork:
#libgnome2:
#libgnomecanvas2:
#libgnomecompat2:
#libgnomedbmm:
#libgnomefilesel:
#libgnomeprintmm:
#libgnomeprintuimm:
#libgnomeui2:
#libgtcpsocket:
#libgtkhtml-java:
#libgtkmozembed-java:
#libgtkmusic:
#libmrproject:
#libnotifymm:
#libole2:
#libqmi:
#libunicode:
#libvte-java:
#libvtemm:
#libxml2:
#libxslt:
#libzvt:
#libzvt2:
#linc:
#linux-user-chroot:
#lock-service:
#longomatch:
#loudmouth-ruby:
#lsr:
#magicdev:
#mc:
#medusa:
#mess-desktop-entries:
#metatheme:
#mlview:
#model:
#mrproject:
#msitools:
#nanny:
#nautilus-gtkhtml:
#nautilus-image-converter:
#nautilus-media:
#nautilus-mozilla:
#nautilus-rpm:
#network-manager-netbook:
#oaf:
#ocrfeeder:
#office-runner:
#ontv:
#opengl-glib:
#orbit-python:
#orbitcpp:
#pan:
#panelmm:
#paperbox:
#passepartout:
#pkgconfig:
#pong:
#postr:
#prefixsuffix:
#present:
#printman:
#procman:
#pwlib:
#pybliographer:
#pygda:
#pygi:
#pygtk2reference:
#pyphany:
#quick-lounge-applet:
#radioactive:
#regexxer:
#rep-gtk-gnome2:
#rygel-gst-0-10-fullscreen-renderer:
#rygel-gst-0-10-media-engine:
#rygel-gst-0-10-plugins:
#sapwood:
#sawfish-gnome2:
#scaffold:
#siobhan:
#snowy:
#sodipodi:
#soup:
#straw:
#strongwind:
#system-tools-backends:
#system-tray-applet:
#telegnome:
#themus:
#toutdoux:
#trilobite:
#ttf-bitstream-vera:
#update-manager:
#users-guide:
#xalf:
#ximian-connector:
#ximian-setup-tools:
#xml-i18n-tools:
## Added to ftp.gnome.org recently
#glick2:

# Other gnome-related upstreams with no package in openSUSE
#gnome-desktop-sharp:
#gnome-system-tools:
#liboobs:
#mm-common:
#glib-java:
#libgconf-java:
#libglade-java:
#libgnome-java:
#libgtk-java:
#atomix:
#gnome-scan:
#gossip:
#labyrinth:
## perl bindings
#Gnome2-Canvas:
#Gnome2-GConf:
#Gnome2-VFS:
#Gnome2:
#Gtk2-GladeXML:
07070100000033000081A40000000000000000000000016548EB8C0000C871000000000000000000000000000000000000004100000000osc-plugin-collab-0.104+30/server/upstream/upstream-tarballs.txt# Format of this file:
#   name:method:info
#
# where:
#   + name is the upstream module name found in tarball names
#     (it can contain a branch name after the '|' character, for example:
#     'gobby|0.4')
#   + method is one of 'upstream', 'ftpls', 'httpls', 'dualhttpls',
#     'subdirhttpls', 'svnls', 'sf', 'sf_jp', 'google', 'lp', 'trac'
#   + info depends on method
#
# upstream:
#   + use this when openSUSE is upstream for the package
#   + the info field is ignored
#
# ftpls:
#   + use this when you only have an ftp directory listing all tarballs
#   + the info field should be the URL of the ftp directory
#
# httpls:
#   + use this when you only have a single web page listing all tarballs
#     Note that we support real HTML pages (and not just 'listing' pages)
#     and that a link to the last tarball on this page is generally enough.
#   + the info field should be the URL of the web page
#
# dualhttpls:
#   + use this when there are two web pages listing all tarballs (usually
#     happens when there's a releases directory and a snapshots directory)
#   + the info field should be both URLs, separated by a pipe
#
# subdirhttpls:
#   + use this when there are subdirectories to browse to find the latest version
#     The subdirectories should always be made of version numbers only
#   + the info field should be the root URL containing all those subdirectories
#
# svnls:
#   + use this when you only have an svn server listening on port 80 listing all
#     tarballs. An example is:
#     https://svn.revolutionlinux.com/MILLE/XTERM/trunk/libflashsupport/Tarballs/
#   + the info field should be the URL of the svn directory
#
# sf:
#   + use this when the upstream tarballs are hosted on sourceforge
#   + the info field should be the project id in sourceforge
#     It can also contain the path for sourceforge projects using
#     packages. In this case, the ids should be separated by a pipe:
#     project_id|path
#   + Project ID can be found by going to http://sourceforge.net/rest/p/<project_name>?doap
#     (i.e. http://sourceforge.net/rest/p/swig?doap), and searching for the <sf:id> field
#
# sf_jp:
#   + use this when the upstream tarballs are hosted on sourceforge.jp
#   + the info field should be the project in sourceforge.jp
#
# google:
#   + use this when the upstream tarballs are hosted on code.google.com
#   + the info field should be the project name on code.google.com. It can
#     also contain the name of tarballs if it's different from the project
#     name. In this case, the names should be separated by a pipe:
#     project_name|tarball_name
#
# lp:
#   + use this when the upstream tarballs are hosted on launchpad
#   + the info field should be the project name on launchpad
#
# trac:
#   + use this when the upstream tarballs are hosted on trac, with a download page
#   + the info field should be the URL of the trac download page
#
# Note that you need to add a line in upstream-packages-match.txt for each
# module you add here. The line should go in the "Non ftp.gnome.org stuff"
# section or in the "Packages where we are upstream" section.

# Please keep the list alphabetically sorted

Coherence:httpls:http://coherence.beebits.net/download/
ConsoleKit:httpls:http://www.freedesktop.org/software/ConsoleKit/dist/
CouchDB:httpls:http://pypi.python.org/pypi/CouchDB
DeviceKit-disks:httpls:http://hal.freedesktop.org/releases/
DeviceKit-power:httpls:http://upower.freedesktop.org/releases/
DeviceKit:httpls:http://hal.freedesktop.org/releases/
LaTeXPlugin:sf:204144
LibRaw:httpls:http://www.libraw.org/download
NetworkManager-strongswan:httpls:http://download.strongswan.org/NetworkManager/
PackageKit:httpls:http://www.freedesktop.org/software/PackageKit/releases/
PackageKit-Qt:httpls:http://www.freedesktop.org/software/PackageKit/releases/
PolicyKit-gnome:httpls:http://hal.freedesktop.org/releases/
QtCurve-Gtk2:httpls:http://www.kde-look.org/content/download.php?content=40492&id=3
UPnP-Inspector:httpls:http://coherence.beebits.net/download/
abiword-docs:httpls:http://abisource.com/downloads/abiword/latest/source/
abiword:httpls:http://abisource.com/downloads/abiword/latest/source/
accountsservice:httpls:http://www.freedesktop.org/software/accountsservice/
acpid:httpls:http://tedfelix.com/linux/acpid-netlink.html
aegisub:httpls:http://ftp.aegisub.org/pub/releases/
alarm-clock-applet:lp:alarm-clock
anthy:sf_jp:anthy
apache-couchdb:httpls:http://couchdb.apache.org/downloads.html
appdata-tools:httpls:http://people.freedesktop.org/~hughsient/releases/
appres:httpls:http://xorg.freedesktop.org/releases/individual/app/
appstream-glib:httpls:http://people.freedesktop.org/~hughsient/appstream-glib/releases/
aqbanking:httpls:http://www.aquamaniac.de/sites/download/packages.php?showall=1
arista:httpls:http://www.transcoder.org/downloads/
arping:httpls:http://www.habets.pp.se/synscan/files/
asio:sf:122478|asio
atheme-services:httpls:http://atheme.net/downloads/
autocutsel:httpls:http://download.savannah.gnu.org/releases/autocutsel/
autofs:httpls:https://kernel.org/pub/linux/daemons/autofs/v5/
avahi:httpls:http://avahi.org/download/
babl:subdirhttpls:http://ftp.gtk.org/pub/babl/
bakefile:sf:83016|bakefile
balsa:httpls:http://pawsa.fedorapeople.org/balsa/download.html
banshee-community-extensions:subdirhttpls:http://download.banshee.fm/banshee-community-extensions/
bash-completion:httpls:http://bash-completion.alioth.debian.org/files/
bdftopcf:httpls:http://xorg.freedesktop.org/releases/individual/app/
beforelight:httpls:http://xorg.freedesktop.org/releases/individual/app/
bfsync:httpls:http://space.twc.de/~stefan/bfsync/
bindfs:httpls:http://bindfs.org/downloads/
bitmap:httpls:http://xorg.freedesktop.org/releases/individual/app/
blktrace:httpls:http://brick.kernel.dk/snaps/
blueproximity:sf:203022
bombermaze:sf:8614|bombermaze
bot-sentry:sf:156021|bot-sentry
brltty:httpls:http://mielke.cc/brltty/releases/
c++-gtk-utils:sf:277143|cxx-gtk-utils
c-ares:httpls:http://c-ares.haxx.se/download/
cachefilesd:httpls:http://people.redhat.com/~dhowells/fscache/
cairo-clock:httpls:http://macslow.net/?page_id=23
cairo-compmgr:httpls:http://download.tuxfamily.org/ccm/cairo-compmgr/
cairo:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
cairomm:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
ccgfs:sf:207310
ccsm:subdirhttpls:http://releases.compiz.org/components/ccsm/
cdecl:httpls:http://www.gtlib.cc.gatech.edu/pub/Linux/devel/lang/c/
cdfs:httpls:https://users.elis.ugent.be/~mronsse/cdfs/download/
check:sf:28255|check
cherrytree:httpls:http://www.giuspen.com/software/
chmlib:httpls:http://www.jedrea.com/chmlib/
chmsee:google:chmsee
claws-mail-extra-plugins:sf:25528|extra plugins
claws-mail:sf:25528|Claws Mail
cloop:httpls:http://debian-knoppix.alioth.debian.org/packages/cloop/
cmsfs:httpls:http://www.linuxvm.org/Patches
cmuclmtk:sf:1904|cmuclmtk
colorblind:httpls:https://alioth.debian.org/frs/?group_id=31117
colord:httpls:http://www.freedesktop.org/software/colord/releases/
colord-gtk:httpls:http://www.freedesktop.org/software/colord/releases/
comix:sf:146377
compiz-bcop:subdirhttpls:http://releases.compiz.org/components/compiz-bcop/
compiz-fusion-plugins-extra:subdirhttpls:http://releases.compiz.org/components/plugins-extra/
compiz-fusion-plugins-main:subdirhttpls:http://releases.compiz.org/components/plugins-main/
compiz-fusion-plugins-unsupported:subdirhttpls:http://releases.compiz.org/components/plugins-unsupported/
compiz:subdirhttpls:http://releases.compiz.org/core/
compizconfig-backend-gconf:subdirhttpls:http://releases.compiz.org/components/compizconfig-backend-gconf/
compizconfig-backend-kconfig:subdirhttpls:http://releases.compiz.org/components/compizconfig-backend-kconfig/
compizconfig-python:subdirhttpls:http://releases.compiz.org/components/compizconfig-python/
computertemp:httpls:http://computertemp.berlios.de/download.php
conglomerate:sf:82766|Conglomerate XML Editor
conntrack-tools:httpls:http://ftp.netfilter.org/pub/conntrack-tools/
cromfs:httpls:http://bisqwit.iki.fi/source/cromfs.html
csmash:sf:4179|CannonSmash
cups-pk-helper:httpls:http://www.freedesktop.org/software/cups-pk-helper/releases/
davfs2:httpls:http://download.savannah.gnu.org/releases/davfs2/
dbus-glib:httpls:http://dbus.freedesktop.org/releases/dbus-glib/
dbus:httpls:http://dbus.freedesktop.org/releases/dbus/
dd_rescue:httpls:http://garloff.de/kurt/linux/ddrescue/
ddrescue:httpls:http://ftp.gnu.org/gnu/ddrescue/
decibel-audio-player:httpls:http://decibel.silent-blade.org/index.php?n=Main.Download
dee:lp:dee
deja-dup:lp:deja-dup
desktop-file-utils:httpls:http://www.freedesktop.org/software/desktop-file-utils/releases/
desktopcouch:lp:desktopcouch
devilspie:httpls:http://www.burtonini.com/computing/
diffutils:httpls:http://ftp.gnu.org/gnu/diffutils/
ding:httpls:http://ftp.tu-chemnitz.de/pub/Local/urz/ding/
djvulibre:sf:32953|DjVuLibre
dmapi:ftpls:ftp://oss.sgi.com/projects/xfs/cmd_tars/
dmlangsel:google:loolixbodes|dmlangsel
docky:lp:docky
dwarves:httpls:http://fedorapeople.org/~acme/dwarves/
ed:httpls:http://ftp.gnu.org/gnu/ed/
editres:httpls:http://xorg.freedesktop.org/releases/individual/app/
eds-feed:httpls:http://www.galago-project.org/files/releases/source/eds-feed/
efax-gtk:sf:109982|efax-gtk
eiciel:httpls:http://rofi.roger-ferrer.org/eiciel/download/
emacs:ftpls:ftp://ftp.gnu.org/gnu/emacs/
emerald:subdirhttpls:http://releases.compiz.org/components/emerald/
enchant:subdirhttpls:http://www.abisource.com/downloads/enchant/
espeak-gui:lp:espeak-gui
evolution-rss:httpls:http://gnome.eu.org/index.php/Evolution_RSS_Reader_Plugin
evolution-tray:httpls:http://gnome.eu.org/index.php/Evolution_Tray
exempi:httpls:http://libopenraw.freedesktop.org/download/
exiv2:httpls:http://www.exiv2.org/download.html
farsight2:httpls:http://farsight.freedesktop.org/releases/farsight2/
farsight:httpls:http://farsight.freedesktop.org/releases/obsolete/farsight/
farstream:httpls:http://freedesktop.org/software/farstream/releases/farstream/
fcitx:google:fcitx
fcitx-anthy:google:fcitx|fcitx-anthy
fcitx-chewing:google:fcitx|fcitx-chewing
fcitx-cloudpinyin:google:fcitx|fcitx-cloudpinyin
fcitx-configtool:google:fcitx|fcitx-configtool
fcitx-fbterm:google:fcitx|fcitx-fbterm
fcitx-googlepinyin:google:fcitx|fcitx-googlepinyin
fcitx-hangul:google:fcitx|fcitx-hangul
fcitx-libpinyin:google:fcitx|fcitx-libpinyin
fcitx-m17n:google:fcitx|fcitx-m17n
fcitx-rime:google:fcitx|fcitx-rime
fcitx-sayura:google:fcitx|fcitx-sayura
fcitx-sunpinyin:google:fcitx|fcitx-sunpinyin
fcitx-table-extra:google:fcitx|fcitx-table-extra
fcitx-table-other:google:fcitx|fcitx-table-other
fcitx-ui-light:google:fcitx|fcitx-ui-light
fcitx-unikey:google:fcitx|fcitx-unikey
file-shrimp:google:loolixbodes|file-shrimp
fillmore-lombard:subdirhttpls:http://yorba.org/download/media/
flickrapi:httpls:http://pypi.python.org/pypi/flickrapi
font-util:httpls:http://xorg.freedesktop.org/releases/individual/font/
fonttosfnt:httpls:http://xorg.freedesktop.org/releases/individual/app/
freerdp:httpls:https://github.com/FreeRDP/FreeRDP/releases
freetype:httpls:http://download.savannah.gnu.org/releases/freetype/
frei0r-plugins:httpls:http://www.piksel.no/frei0r/releases/
fslsfonts:httpls:http://xorg.freedesktop.org/releases/individual/app/
fstobdf:httpls:http://xorg.freedesktop.org/releases/individual/app/
fyre:httpls:http://releases.navi.cx/fyre/
g-wrap:httpls:http://download.savannah.gnu.org/releases/g-wrap/
gaim-galago:httpls:http://www.galago-project.org/files/releases/source/gaim-galago/
galago-daemon:httpls:http://www.galago-project.org/files/releases/source/galago-daemon/
galago-gtk-sharp:httpls:http://www.galago-project.org/files/releases/source/galago-gtk-sharp/
galago-sharp:httpls:http://www.galago-project.org/files/releases/source/galago-sharp/
gbrainy:httpls:http://live.gnome.org/gbrainy
gccmakedep:httpls:http://xorg.freedesktop.org/releases/individual/util/
gcin:httpls:http://www.csie.nctu.edu.tw/~cp76/gcin/download/
gcstar:httpls:http://download.gna.org/gcstar/
gdata.py:google:gdata-python-client|gdata
gdesklets:httpls:http://gdesklets.de/
gdome2:httpls:http://gdome2.cs.unibo.it/tarball/
geany-plugins:httpls:http://plugins.geany.org/geany-plugins/
geany:httpls:http://download.geany.org/
geeqie:sf:222125
gegl:subdirhttpls:http://ftp.gtk.org/pub/gegl/
geoclue:httpls:http://www.freedesktop.org/wiki/Software/GeoClue
geoclue|1.0:httpls:http://people.freedesktop.org/~hadess/
gftp:httpls:http://gftp.seul.org/
ggz-client-libs:subdirhttpls:http://mirrors.ibiblio.org/ggzgamingzone/ggz/
gimp-dds:google:gimp-dds
gimp-help:ftpls:ftp://ftp.gimp.org/pub/gimp/help/
gimp-lqr-plugin:httpls:http://liquidrescale.wikidot.com/en:download-page-sources
gimp-save-for-web:httpls:http://registry.gimp.org/node/33
gimp:ftpls:ftp://ftp.gimp.org/pub/gimp/stable
git:google:git-core|git
giver:google:giver
gkrellm:httpls:http://members.dslextreme.com/users/billw/gkrellm/gkrellm.html
glabels:sf:46122|glabels
glew:sf:67586|glew
glipper:lp:glipper
glitz:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
gmetadom:sf:40627|gmetadom
gnac:sf:193628|gnac
gnokii:httpls:http://www.gnokii.org/download/gnokii/
gnome-activity-journal:lp:gnome-activity-journal
gnome-color-chooser:sf:211146|gnome-color-chooser
gnome-colors:google:gnome-colors
gnome-do-plugins:lp:do-plugins
gnome-do:lp:do
gnome-gmail-notifier:google:gnome-gmail-notifier
gnome-gmail:sf:277145
gnome-keyring-sharp:httpls:http://download.mono-project.com/sources/gnome-keyring-sharp/
gnome-mount:httpls:http://hal.freedesktop.org/releases/
gnome-presence-applet:httpls:http://www.galago-project.org/files/releases/source/gnome-presence-applet/
gnome-schedule:sf:112183
gnome-subtitles:sf:129996
gnomeicu:sf:237
gnonlin:httpls:http://gstreamer.freedesktop.org/src/gnonlin/
# gnucash: Not updated anymore: gnucash:httpls:http://www.gnucash.org/pub/gnucash/sources/stable/
gnucash:sf:192|gnucash (stable)
# gnucash-docs: Not updated anymore: gnucash-docs:httpls:http://www.gnucash.org/pub/gnucash/sources/stable/
gnucash-docs:sf:192|gnucash-docs
gnupg:ftpls:ftp://ftp.gnupg.org/gcrypt/gnupg/
gobby:httpls:http://releases.0x539.de/gobby/
gobby|0.4:httpls:http://releases.0x539.de/gobby/
gocr:httpls:http://www-e.uni-magdeburg.de/jschulen/ocr/download.html
gonvert:httpls:http://www.unihedron.com/projects/gonvert/downloads/
google-gadgets-for-linux:google:google-gadgets-for-linux
gourmet:sf:108118
gpa:ftpls:ftp://ftp.gnupg.org/gcrypt/gpa/
gparted:sf:115843
gpaste:httpls:https://github.com/Keruspe/GPaste/downloads
gpgme:ftpls:ftp://ftp.gnupg.org/gcrypt/gpgme/
gpick:google:gpick
gpodder:httpls:http://gpodder.org/src/
gq:sf:3805|GQ LDAP Client
# gqview: note that we don't care about stable vs unstable
gqview:sf:4050
gramps:sf:25770
# grisbi: we really only want stable versions
grisbi:sf:93867|grisbi stable
gromit:httpls:http://www.home.unix-ag.org/simon/gromit/
gst-plugins-bad:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-bad/
gst-plugins-bad|0.10:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-bad/
gst-plugins-base:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-base/
gst-plugins-base|0.10:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-base/
gst-plugins-farsight:httpls:http://farsight.freedesktop.org/releases/obsolete/gst-plugins-farsight/
gst-plugins-gl:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-gl/
gst-plugins-good:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-good/
gst-plugins-good|0.10:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-good/
gst-plugins-ugly:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-ugly/
gst-plugins-ugly|0.10:httpls:http://gstreamer.freedesktop.org/src/gst-plugins-ugly/
gst-python:httpls:http://gstreamer.freedesktop.org/src/gst-python/
gst-python|0.10:httpls:http://gstreamer.freedesktop.org/src/gst-python/
gst-rtsp:httpls:http://gstreamer.freedesktop.org/src/gst-rtsp/
gstreamer:httpls:http://gstreamer.freedesktop.org/src/gstreamer/
gstreamer|0.10:httpls:http://gstreamer.freedesktop.org/src/gstreamer/
gstreamer-editing-services:httpls:http://gstreamer.freedesktop.org/src/gstreamer-editing-services/
gsynaptics:sf_jp:gsynaptics
gtg:lp:gtg
gtk-engines-cleanice:sf:57808|gtk-engines-cleanice
gtk-recordmydesktop:sf:172357|gtk-recordMyDesktop
gtkglext:sf:54333|gtkglext
gtkhotkey:lp:gtkhotkey
gtkimageview:httpls:http://trac.bjourne.webfactional.com/chrome/common/releases/
gtkmathview:httpls:http://helm.cs.unibo.it/mml-widget/sources/
gtkpbbuttons:sf:47862|gtkpbbuttons
gtkpod:sf:67873|gtkpod
gtkspell:sf:7896
guake:trac:http://guake.org/downloads
gurlchecker:httpls:http://labs.libre-entreprise.org/frs/?group_id=7
gwenhywfar:httpls:http://www.aquamaniac.de/sites/download/packages.php?showall=1
gwibber:lp:gwibber
gypsy:httpls:http://gypsy.freedesktop.org/releases/
gzip:httpls:http://ftp.gnu.org/gnu/gzip/
hamster-time-tracker:httpls:https://github.com/projecthamster/hamster/tags
harfbuzz:httpls:http://www.freedesktop.org/software/harfbuzz/release/
hicolor-icon-theme:httpls:http://icon-theme.freedesktop.org/releases/
ht:sf:1066
hxtools:httpls:http://jftp.inai.de/hxtools/
ibus:google:ibus
ibus-anthy:google:ibus|ibus-anthy
ibus-chewing:google:ibus|ibus-chewing
ibus-gjs:google:ibus|ibus-gjs
ibus-googlepinyin:google:libgooglepinyin|ibus-googlepinyin
ibus-hangul:google:ibus|ibus-hangul
ibus-input-pad:google:input-pad|ibus-input-pad
ibus-m17n:google:ibus|ibus-m17n
ibus-pinyin:google:ibus|ibus-pinyin
ibus-qt:google:ibus|ibus-qt
ibus-rime:google:rimeime|ibus-rime
ibus-sunpinyin:google:sunpinyin|ibus-sunpinyin
ibus-table:google:ibus|ibus-table
ibus-table-chinese:google:ibus|ibus-table-chinese
ibus-table-extraphrase:google:ibus|ibus-table-extraphrase
ibus-table-jyutping:google:ibus|ibus-table-jyutping
ibus-table-others:google:ibus|ibus-table-others
ibus-table-zhengma:google:ibus|ibus-table-zhengma
ibus-table-zhuyin:google:ibus|ibus-table-zhuyin
ibus-table-ziranma:google:ibus|ibus-table-ziranma
ibus-unikey:google:ibus-unikey
iceauth:httpls:http://xorg.freedesktop.org/releases/individual/app/
ico:httpls:http://xorg.freedesktop.org/releases/individual/app/
icon-naming-utils:httpls:http://tango.freedesktop.org/releases/
iio-sensor-proxy:httpls:https://github.com/hadess/iio-sensor-proxy/releases
imake:httpls:http://xorg.freedesktop.org/releases/individual/util/
inkscape:sf:93438|inkscape
input-pad:google:input-pad
intel-gpu-tools:httpls:http://xorg.freedesktop.org/releases/individual/app/
intltool:lp:intltool
ipod-sharp:httpls:http://download.banshee-project.org/ipod-sharp/
iproute2:httpls:http://kernel.org/pub/linux/utils/net/iproute2/
iptables:httpls:http://ftp.netfilter.org/pub/iptables/
iptraf-ng:httpls:https://fedorahosted.org/iptraf-ng/wiki/Download
irda-utils:sf:5616|irda-utils
istanbul:httpls:http://live.gnome.org/Istanbul
itstool:httpls:http://files.itstool.org/itstool/
iwatch:sf:174218|iwatch
kcm-fcitx:google:fcitx|kcm-fcitx
kimtoy:httpls:http://kde-apps.org/content/download.php?content=140967&id=1
kye:httpls:http://games.moria.org.uk/kye/download-install
kyotocabinet:httpls:http://fallabs.com/kyotocabinet/pkg/
lbxproxy:httpls:http://xorg.freedesktop.org/releases/individual/app/
ldtp:subdirhttpls:http://download.freedesktop.org/ldtp/
libFS:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libHX:sf:254041
libICE:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libSM:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libWindowsWM:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libX11:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXScrnSaver:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXTrap:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXau:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXaw:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXcomposite:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXcursor:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXdamage:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXdmcp:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXevie:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXext:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXfixes:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXfont:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXfontcache:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXft:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXi:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXinerama:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXmu:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXp:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXpm:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXprintAppUtil:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXprintUtil:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXrandr:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXrender:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXres:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXt:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXtst:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXv:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXvMC:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXxf86dga:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXxf86misc:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libXxf86vm:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libao-pulse:httpls:http://0pointer.de/lennart/projects/libao-pulse/
libassuan:ftpls:ftp://ftp.gnupg.org/gcrypt/libassuan/
libatasmart:httpls:http://0pointer.de/public/
libatomic_ops:httpls:http://www.ivmaisoft.com/_bin/atomic_ops/
libbraille:sf:17127|libbraille
libbs2b:sf:151236
libcanberra:httpls:http://0pointer.de/lennart/projects/libcanberra/
libchewing:google:chewing|libchewing
libcompizconfig:subdirhttpls:http://releases.compiz.org/components/libcompizconfig/
libdaemon:httpls:http://0pointer.de/lennart/projects/libdaemon/
libdatrie:httpls:http://linux.thai.net/~thep/datrie/datrie.html
libdmapsharing:httpls:http://flyn.org/projects/libdmapsharing/download.html
libdmx:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libdrm:httpls:http://dri.freedesktop.org/libdrm/
libdv:sf:4393|libdv
libebml:httpls:http://dl.matroska.org/downloads/libebml/
libedit:httpls:http://thrysoee.dk/editline/
libesmtp:httpls:http://www.stafford.uklinux.net/libesmtp/download.html
libfontenc:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libgadu:httpls:http://toxygen.net/libgadu/files/
libgalago-gtk:httpls:http://www.galago-project.org/files/releases/source/libgalago-gtk/
libgalago:httpls:http://www.galago-project.org/files/releases/source/libgalago/
libgcrypt:ftpls:ftp://ftp.gnupg.org/gcrypt/libgcrypt/
libgexiv2:subdirhttpls:http://yorba.org/download/gexiv2/
libggz:subdirhttpls:http://mirrors.ibiblio.org/ggzgamingzone/ggz/
libgit2:httpls:https://github.com/libgit2/libgit2/releases/
libgnomesu:httpls:http://members.chello.nl/~h.lai/libgnomesu/
libgooglepinyin:google:libgooglepinyin
libgpg-error:ftpls:ftp://ftp.gnupg.org/gcrypt/libgpg-error/
libgpod:sf:67873|libgpod
libgrss:httpls:http://gtk.mplat.es/libgrss/tarballs/
libgsasl:httpls:http://ftp.gnu.org/pub/gnu/gsasl/
libgusb:httpls:http://people.freedesktop.org/~hughsient/releases/
libhangul:google:libhangul
libical:sf:16077
libinfinity:httpls:http://releases.0x539.de/libinfinity/
libiptcdata:sf:130582|libiptcdata
libjingle:httpls:http://farsight.freedesktop.org/releases/obsolete/libjingle/
libksba:ftpls:ftp://ftp.gnupg.org/gcrypt/libksba/
liblbxutil:httpls:http://xorg.freedesktop.org/releases/individual/lib/
liblouis:google:liblouis
libmatroska:httpls:http://dl.matroska.org/downloads/libmatroska/
libmbim:httpls:http://www.freedesktop.org/software/libmbim/
libmcs:httpls:http://distfiles.atheme.org/
libmnl:httpls:http://ftp.netfilter.org/pub/libmnl/
libmodman:google:libmodman
libmowgli:httpls:http://distfiles.atheme.org/
libnetfilter_conntrack:httpls:http://ftp.netfilter.org/pub/libnetfilter_conntrack/
libnetfilter_log:httpls:http://ftp.netfilter.org/pub/libnetfilter_log/
libnetfilter_queue:httpls:http://ftp.netfilter.org/pub/libnetfilter_queue/
libnfnetlink:httpls:http://ftp.netfilter.org/pub/libnfnetlink/
libnice:httpls:http://nice.freedesktop.org/releases/
libnjb:sf:32528|libnjb
libnl:httpls:http://www.infradead.org/~tgr/libnl/
libofetion:google:ofetion|libofetion
libofx:sf:61170|libofx
liboil:httpls:http://liboil.freedesktop.org/download/
liboldX:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libopenraw:httpls:http://libopenraw.freedesktop.org/download/
libosinfo:httpls:https://fedorahosted.org/releases/l/i/libosinfo/
libpciaccess:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libpinyin:httpls:http://github.com/libpinyin/libpinyin/downloads
libplist:httpls:http://github.com/JonathanBeck/libplist/downloads
libproxy:google:libproxy
libpst:httpls:http://www.five-ten-sg.com/libpst/packages/
libpwquality:httpls:https://fedorahosted.org/releases/l/i/libpwquality/
libqmi:httpls:http://www.freedesktop.org/software/libqmi/
librep:httpls:http://download.tuxfamily.org/librep/
librime:google:rimeime|librime
libsexy:httpls:http://releases.chipx86.com/libsexy/libsexy/
libspectre:httpls:http://libspectre.freedesktop.org/releases/
libssui:google:libssui
libtasn1:httpls:http://ftp.gnu.org/gnu/libtasn1/
libtelepathy:httpls:http://telepathy.freedesktop.org/releases/libtelepathy/
libthai:httpls:http://linux.thai.net/pub/thailinux/software/libthai/
libturpial:httpls:http://files.turpial.org.ve/sources/stable/
libvirt-cim:httpls:http://libvirt.org/sources/CIM/
libvirt-glib:httpls:http://libvirt.org/sources/glib/
libvirt:httpls:http://libvirt.org/sources/
libvpx:google:webm|libvpx
libwacom:sf:69596|libwacom
libwebp:google:webp|libwebp
libxcb:httpls:http://xorg.freedesktop.org/releases/individual/xcb/
libxkbcommon:httpls:http://xkbcommon.org/download/
libxkbfile:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libxkbui:httpls:http://xorg.freedesktop.org/releases/individual/lib/
libxklavier:sf:319|libxklavier
libzeitgeist:lp:libzeitgeist
liferea:httpls:https://github.com/lwindolf/liferea/releases/
link-grammar:subdirhttpls:http://www.abisource.com/downloads/link-grammar/
listres:httpls:http://xorg.freedesktop.org/releases/individual/app/
lmms:sf:105168
lndir:httpls:http://xorg.freedesktop.org/releases/individual/util/
m4:httpls:http://ftp.gnu.org/gnu/m4/
m17n-contrib:httpls:http://download.savannah.gnu.org/releases/m17n/
m17n-db:httpls:http://download.savannah.gnu.org/releases/m17n/
m17n-lib:httpls:http://download.savannah.gnu.org/releases/m17n/
mail-notification:httpls:http://www.nongnu.org/mailnotify/
makedepend:httpls:http://xorg.freedesktop.org/releases/individual/util/
mangler:httpls:http://www.mangler.org/downloads/
md5deep:sf:67079|md5deep
media-explorer:httpls:https://github.com/media-explorer/media-explorer/downloads
media-player-info:httpls:http://www.freedesktop.org/software/media-player-info/
mercurial:httpls:https://www.mercurial-scm.org/release/
mkcomposecache:httpls:http://xorg.freedesktop.org/releases/individual/app/
mkfontdir:httpls:http://xorg.freedesktop.org/releases/individual/app/
mkfontscale:httpls:http://xorg.freedesktop.org/releases/individual/app/
mkvtoolnix:httpls:http://www.bunkus.org/videotools/mkvtoolnix/sources/
moc:httpls:http://moc.daper.net/download
mod_dnssd:httpls:http://0pointer.de/lennart/projects/mod_dnssd/
monsoon:httpls:http://www.monsoon-project.org/jaws/index.php?page/Download
mbpurple:google:microblog-purple|mbpurple
mozilla-bonobo:httpls:http://download.savannah.gnu.org/releases/moz-bonobo/
mrim-prpl:google:mrim-prpl|mrim-prpl
mtr:ftpls:ftp://ftp.bitwizard.nl/mtr/
mx:httpls:https://github.com/clutter-project/mx/downloads
nautilus-search-tool:sf:149158|nautilus-search-tool
nautilus-terminal:lp:nautilus-terminal
neon:httpls:http://www.webdav.org/neon/
net6:httpls:http://releases.0x539.de/net6/
netspeed_applet:lp:netspeed
nimbus:httpls:http://dlc.sun.com/osol/jds/downloads/extras/nimbus/
njb-sharp:httpls:http://download.banshee.fm/legacy/njb-sharp/
notify-osd:lp:notify-osd
notify-python:httpls:http://www.galago-project.org/files/releases/source/notify-python/
npth:ftpls:ftp://ftp.gnupg.org/gcrypt/npth/
nspluginwrapper:httpls:http://nspluginwrapper.org/download/
nss-mdns:httpls:http://0pointer.de/lennart/projects/nss-mdns/
ntfs-3g_ntfsprogs:httpls:http://www.tuxera.com/community/ntfs-3g-download/
ntfs-config:httpls:http://flomertens.free.fr/ntfs-config/download.html
obby:httpls:http://releases.0x539.de/obby/
obex-data-server:httpls:http://tadas.dailyda.com/software/
oclock:httpls:http://xorg.freedesktop.org/releases/individual/app/
onboard:lp:onboard
opencc:google:opencc
openfetion:google:ofetion|openfetion
openobex:httpls:http://www.kernel.org/pub/linux/bluetooth/
opus:httpls:http://downloads.xiph.org/releases/opus/
orc:httpls:http://code.entropywave.com/download/orc/
ori:httpls:https://bitbucket.org/orifs/ori/downloads/
osm-gps-map:httpls:https://github.com/nzjrs/osm-gps-map/releases/
p11-kit:httpls:http://p11-glue.freedesktop.org/releases/
padevchooser:httpls:http://0pointer.de/lennart/projects/padevchooser/
pam_mount:sf:41452
paman:httpls:http://0pointer.de/lennart/projects/paman/
paprefs:httpls:http://freedesktop.org/software/pulseaudio/paprefs/
papyon:httpls:http://www.freedesktop.org/software/papyon/releases/
pavucontrol:httpls:http://freedesktop.org/software/pulseaudio/pavucontrol/
pavuk:sf:81012|pavuk
pavumeter:httpls:http://0pointer.de/lennart/projects/pavumeter/
pcre:ftpls:ftp://ftp.csx.cam.ac.uk/pub/software/programming/pcre/
pidgin-advanced-sound-notification:lp:pidgin-advanced-sound-notification
pidgin-birthday-reminder:lp:pidgin-birthday-reminder
pidgin-embeddedvideo:google:pidgin-embeddedvideo|pidgin-embeddedvideo
pidgin-facebookchat:google:pidgin-facebookchat|pidgin-facebookchat-source
pidgin-guifications:httpls:https://www.guifications.org/projects/gf2/files
pidgin-openfetion:google:ofetion|pidgin-openfetion
pidgin-otr:httpls:http://www.cypherpunks.ca/otr/
pidgin-sipe:sf:194563|sipe
pidgin:sf:235|Pidgin
pinentry:ftpls:ftp://ftp.gnupg.org/gcrypt/pinentry/
pino:google:pino-twitter|pino
pithos:httpls:http://kevinmehall.net/p/pithos/release/
pixman:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
pkg-config:httpls:http://pkgconfig.freedesktop.org/releases/
polkit-gnome:httpls:http://hal.freedesktop.org/releases/
polkit:httpls:http://www.freedesktop.org/software/polkit/releases/
poppler-data:httpls:http://poppler.freedesktop.org/
poppler:httpls:http://poppler.freedesktop.org/releases.html
posixovl:sf:255236
powerpc-utils:sf:261744
ppc64-diag:sf:44427
presage:sf:172950|presage
procmeter3:httpls:http://www.gedanken.demon.co.uk/download-procmeter/
proxymngr:httpls:http://xorg.freedesktop.org/releases/individual/app/
psiconv:httpls:http://software.frodo.looijaard.name/psiconv/download.php
psmisc:sf:15273|psmisc
pulseaudio:httpls:http://www.freedesktop.org/software/pulseaudio/releases/
purple-plugin-pack:httpls:https://www.guifications.org/projects/purple-plugin-pack/files
py2cairo:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
pycairo:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
pycups:httpls:http://cyberelk.net/tim/data/pycups/
pygtkglext:sf:54333|pygtkglext
pymetar:httpls:http://www.schwarzvogel.de/pkgs/
pymsn:httpls:http://telepathy.freedesktop.org/releases/pymsn/
pysmbc:httpls:http://cyberelk.net/tim/data/pysmbc/
python-distutils-extra:lp:python-distutils-extra
python-espeak:lp:python-espeak
virtkey:lp:virtkey
python-xlib:sf:10350|python-xlib
pywebkitgtk:google:pywebkitgtk
pyxdg:httpls:http://www.freedesktop.org/wiki/Software/pyxdg
qiv:httpls:http://spiegl.de/qiv/download/
qmmp:httpls:http://qmmp.ylsoftware.com/files/
quilt:httpls:http://download.savannah.gnu.org/releases/quilt/
radiotray:sf:295096
raptor:httpls:http://download.librdf.org/source/
rarian:httpls:http://rarian.freedesktop.org/Releases/
rasqal:httpls:http://download.librdf.org/source/
raw-thumbnailer:httpls:http://libopenraw.freedesktop.org/download/
recordmydesktop:sf:172357|recordmydesktop
redland:httpls:http://download.librdf.org/source/
rednotebook:sf:238077
remmina:httpls:https://github.com/FreeRDP/Remmina/releases
rendercheck:httpls:http://xorg.freedesktop.org/releases/individual/app/
rep-gtk:httpls:http://download.tuxfamily.org/librep/rep-gtk/
rgb:httpls:http://xorg.freedesktop.org/releases/individual/app/
rlwrap:httpls:http://utopia.knoware.nl/~hlub/rlwrap/
rstart:httpls:http://xorg.freedesktop.org/releases/individual/app/
sawfish:httpls:http://download.tuxfamily.org/sawfish/
schismtracker:httpls:http://schismtracker.org/dl/
schroedinger:httpls:http://diracvideo.org/download/schroedinger/
scim:sf:108454|scim
scim-anthy:sf_jp:scim-imengine
scim-bridge:sf:108454|scim-bridge
scim-canna:sf_jp:scim-imengine
scim-chewing:google:chewing|scim-chewing
scim-hangul:sf:108454|scim-hangul
scim-input-pad:sf:108454|scim-input-pad
scim-m17n:sf:108454|scim-m17n
scim-pinyin:sf:108454|scim-pinyin
scim-qtimm:sf:108454|scim-qtimm
scim-skk:sf_jp:scim-imengine
scim-sunpinyin:google:sunpinyin|scim-sunpinyin
scim-tables:sf:108454|scim-tables
scim-tomoe:sf_jp:scim-imengine
scim-uim:sf:108454|scim-uim
scim-unikey:google:scim-unikey
# Use "|xorg" because scripts is a really generic name
scripts|xorg:httpls:http://xorg.freedesktop.org/releases/individual/app/
scummvm:sf:37116
sessreg:httpls:http://xorg.freedesktop.org/releases/individual/app/
setxkbmap:httpls:http://xorg.freedesktop.org/releases/individual/app/
shared-color-profiles:httpls:http://people.freedesktop.org/~hughsient/releases/
shared-color-targets:httpls:http://people.freedesktop.org/~hughsient/releases/
shared-desktop-ontologies:sf:254113|shared-desktop-ontologies
shared-mime-info:httpls:http://people.freedesktop.org/~hadess/
shotwell:subdirhttpls:http://yorba.org/download/shotwell/
showfont:httpls:http://xorg.freedesktop.org/releases/individual/app/
simple-ccsm:subdirhttpls:http://releases.compiz.org/components/simple-ccsm/
simple-scan:lp:simple-scan
smproxy:httpls:http://xorg.freedesktop.org/releases/individual/app/
smuxi:httpls:http://www.smuxi.org/jaws/data/files/
sobby:httpls:http://releases.0x539.de/sobby/
sofia-sip:sf:143636|sofia-sip
solang:httpls:http://projects.gnome.org/solang/download
sound-theme-freedesktop:dualhttpls:http://people.freedesktop.org/~mccann/dist/|http://www.freedesktop.org/wiki/Specifications/sound-theme-spec
sparkleshare:httpls:http://sparkleshare.org/
specto:google:specto
speech-dispatcher:httpls:http://www.freebsoft.org/pub/projects/speechd/
spheres-and-crystals:sf:64400|spherecrystal
spice-gtk:httpls:http://spice-space.org/download/gtk/
spice-protocol:httpls:http://spice-space.org/download/releases/
spice:httpls:http://spice-space.org/download/releases/
ssh-contact:httpls:http://telepathy.freedesktop.org/releases/ssh-contact/
sshfp:ftpls:ftp://ftp.xelerance.com/sshfp/
startup-notification:httpls:http://www.freedesktop.org/software/startup-notification/releases/
stk:httpls:https://ccrma.stanford.edu/software/stk/download.html
sunpinyin:google:sunpinyin
swfdec-mozilla:subdirhttpls:http://swfdec.freedesktop.org/download/swfdec-mozilla/
swfdec:subdirhttpls:http://swfdec.freedesktop.org/download/swfdec/
swig:sf:1645
synapse:lp:synapse-project
system-config-printer:subdirhttpls:http://cyberelk.net/tim/data/system-config-printer/
tangerine:lp:tangerine
tango-icon-theme:httpls:http://tango.freedesktop.org/releases/
telepathy-butterfly:httpls:http://telepathy.freedesktop.org/releases/telepathy-butterfly/
telepathy-farsight:httpls:http://telepathy.freedesktop.org/releases/telepathy-farsight/
telepathy-farstream:httpls:http://telepathy.freedesktop.org/releases/telepathy-farstream/
telepathy-gabble:httpls:http://telepathy.freedesktop.org/releases/telepathy-gabble/
telepathy-glib:httpls:http://telepathy.freedesktop.org/releases/telepathy-glib/
telepathy-haze:httpls:http://telepathy.freedesktop.org/releases/telepathy-haze/
telepathy-idle:httpls:http://telepathy.freedesktop.org/releases/telepathy-idle/
telepathy-logger:httpls:http://telepathy.freedesktop.org/releases/telepathy-logger/
# telepathy-mission-control: for the old 4.x branch: telepathy-mission-control:sf:190214|mission-control
telepathy-mission-control:httpls:http://telepathy.freedesktop.org/releases/telepathy-mission-control/
telepathy-python:httpls:http://telepathy.freedesktop.org/releases/telepathy-python/
telepathy-rakia:httpls:http://telepathy.freedesktop.org/releases/telepathy-rakia/
telepathy-salut:httpls:http://telepathy.freedesktop.org/releases/telepathy-salut/
# telepathy-sofiasip: used to live at: telepathy-sofiasip:sf:191149
telepathy-sofiasip:httpls:http://telepathy.freedesktop.org/releases/telepathy-sofiasip/
telepathy-stream-engine:httpls:http://telepathy.freedesktop.org/releases/stream-engine/
tig:httpls:http://jonas.nitro.dk/tig/releases/
tilda:sf:126081|tilda
tinyproxy:httpls:https://banu.com/tinyproxy/
tokyocabinet:httpls:http://fallabs.com/tokyocabinet/
traffic-vis:httpls:http://www.mindrot.org/traffic-vis.html
transmageddon:httpls:http://www.linuxrising.org/files/
transmission:httpls:http://download.m0k.org/transmission/files/
tsclient:sf:192483|tsclient
turpial:httpls:http://turpial.org.ve/files/sources/stable/
twitux:sf:198704|twitux
twm:httpls:http://xorg.freedesktop.org/releases/individual/app/
udisks:httpls:http://udisks.freedesktop.org/releases/
udisks|1.90:dualhttpls:http://hal.freedesktop.org/releases/|http://udisks.freedesktop.org/releases/
uget:sf:72252|Uget (stable)
uhttpmock:httpls:https://tecnocode.co.uk/downloads/uhttpmock/
ulogd:httpls:http://ftp.netfilter.org/pub/ulogd/
unico:lp:unico
upower:httpls:http://upower.freedesktop.org/releases/
usbredir:httpls:http://spice-space.org/download/usbredir/
usbview:httpls:http://www.kroah.com/linux-usb/
valencia:subdirhttpls:http://yorba.org/download/valencia/
varnish:httpls:http://repo.varnish-cache.org/source/
# vboxgtk: used to live at: vboxgtk:sf:263334|vboxgtk
vboxgtk:google:vboxgtk
viewres:httpls:http://xorg.freedesktop.org/releases/individual/app/
vim:ftpls:ftp://ftp.vim.org/pub/vim/unix/
vmfs-tools:httpls:http://glandium.org/projects/vmfs-tools/
vobject:httpls:http://vobject.skyhouseconsulting.com/
wadptr:httpls:http://soulsphere.org/projects/wadptr/
weather-wallpaper:httpls:http://mundogeek.net/weather-wallpaper/
webkitgtk:httpls:http://webkitgtk.org/releases/
webkitgtk|2.4:httpls:http://webkitgtk.org/releases/
wmakerconf:sf:196469|wmakerconf
x-tile:httpls:http://www.giuspen.com/software/
x11perf:httpls:http://xorg.freedesktop.org/releases/individual/app/
x11vnc:sf:32584|x11vnc
xauth:httpls:http://xorg.freedesktop.org/releases/individual/app/
xbacklight:httpls:http://xorg.freedesktop.org/releases/individual/app/
xbiff:httpls:http://xorg.freedesktop.org/releases/individual/app/
xbitmaps:httpls:http://xorg.freedesktop.org/releases/individual/data/
xcalc:httpls:http://xorg.freedesktop.org/releases/individual/app/
xcb-util-image:httpls:http://xorg.freedesktop.org/releases/individual/xcb/
xcb-util-keysyms:httpls:http://xorg.freedesktop.org/releases/individual/xcb/
xcb-util-renderutil:httpls:http://xorg.freedesktop.org/releases/individual/xcb/
xcb-util-wm:httpls:http://xorg.freedesktop.org/releases/individual/xcb/
xcb-util:httpls:http://xorg.freedesktop.org/releases/individual/xcb/
xchat:subdirhttpls:http://www.xchat.org/files/source/
xclipboard:httpls:http://xorg.freedesktop.org/releases/individual/app/
xclock:httpls:http://xorg.freedesktop.org/releases/individual/app/
xcmsdb:httpls:http://xorg.freedesktop.org/releases/individual/app/
xcompmgr:httpls:http://xorg.freedesktop.org/releases/individual/app/
xconsole:httpls:http://xorg.freedesktop.org/releases/individual/app/
xcursorgen:httpls:http://xorg.freedesktop.org/releases/individual/app/
xcursor-themes:httpls:http://xorg.freedesktop.org/releases/individual/data/
xdbedizzy:httpls:http://xorg.freedesktop.org/releases/individual/app/
xdg-user-dirs:httpls:http://user-dirs.freedesktop.org/releases/
xdg-utils:httpls:http://portland.freedesktop.org/download/
xditview:httpls:http://xorg.freedesktop.org/releases/individual/app/
xdm:httpls:http://xorg.freedesktop.org/releases/individual/app/
xdpyinfo:httpls:http://xorg.freedesktop.org/releases/individual/app/
xedit:httpls:http://xorg.freedesktop.org/releases/individual/app/
xev:httpls:http://xorg.freedesktop.org/releases/individual/app/
xeyes:httpls:http://xorg.freedesktop.org/releases/individual/app/
xf86dga:httpls:http://xorg.freedesktop.org/releases/individual/app/
xf86-input-evdev:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-joystick:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-keyboard:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-libinput:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-mouse:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-synaptics:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-vmmouse:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-void:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-input-wacom:sf:69596|xf86-input-wacom
xf86-video-ark:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-ast:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-ati:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-cirrus:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-dummy:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-fbdev:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-geode:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-glint:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-i128:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-intel:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-ivtv:httpls:http://dl.ivtvdriver.org/xf86-video-ivtv/
xf86-video-mach64:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-mga:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-neomagic:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-newport:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-nv:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-qxl:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-r128:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-radeonhd:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-savage:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-siliconmotion:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-sis:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-tdfx:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-tga:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-trident:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-v4l:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-vesa:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-vmware:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xf86-video-voodoo:httpls:http://xorg.freedesktop.org/releases/individual/driver/
xfd:httpls:http://xorg.freedesktop.org/releases/individual/app/
xfindproxy:httpls:http://xorg.freedesktop.org/releases/individual/app/
xfontsel:httpls:http://xorg.freedesktop.org/releases/individual/app/
xfs:httpls:http://xorg.freedesktop.org/releases/individual/app/
xfsdump:ftpls:ftp://oss.sgi.com/projects/xfs/cmd_tars/
xfsinfo:httpls:http://xorg.freedesktop.org/releases/individual/app/
xfsprogs:ftpls:ftp://oss.sgi.com/projects/xfs/cmd_tars/
xfwp:httpls:http://xorg.freedesktop.org/releases/individual/app/
xgamma:httpls:http://xorg.freedesktop.org/releases/individual/app/
xgc:httpls:http://xorg.freedesktop.org/releases/individual/app/
xhost:httpls:http://xorg.freedesktop.org/releases/individual/app/
xinit:httpls:http://xorg.freedesktop.org/releases/individual/app/
xinput:httpls:http://xorg.freedesktop.org/releases/individual/app/
xkbcomp:httpls:http://xorg.freedesktop.org/releases/individual/app/
xkbevd:httpls:http://xorg.freedesktop.org/releases/individual/app/
xkbprint:httpls:http://xorg.freedesktop.org/releases/individual/app/
xkbutils:httpls:http://xorg.freedesktop.org/releases/individual/app/
xkeyboard-config:httpls:http://xorg.freedesktop.org/releases/individual/data/
xkill:httpls:http://xorg.freedesktop.org/releases/individual/app/
xload:httpls:http://xorg.freedesktop.org/releases/individual/app/
xlogo:httpls:http://xorg.freedesktop.org/releases/individual/app/
xlsatoms:httpls:http://xorg.freedesktop.org/releases/individual/app/
xlsclients:httpls:http://xorg.freedesktop.org/releases/individual/app/
xlsfonts:httpls:http://xorg.freedesktop.org/releases/individual/app/
xmag:httpls:http://xorg.freedesktop.org/releases/individual/app/
xman:httpls:http://xorg.freedesktop.org/releases/individual/app/
xmessage:httpls:http://xorg.freedesktop.org/releases/individual/app/
xmh:httpls:http://xorg.freedesktop.org/releases/individual/app/
xmodmap:httpls:http://xorg.freedesktop.org/releases/individual/app/
xmore:httpls:http://xorg.freedesktop.org/releases/individual/app/
xorgxrdp:httpls:https://github.com/neutrinolabs/xorgxrdp/releases
xorg-cf-files:httpls:http://xorg.freedesktop.org/releases/individual/util/
xorg-docs:httpls:http://xorg.freedesktop.org/releases/individual/doc/
xorg-sgml-doctools:httpls:http://xorg.freedesktop.org/releases/individual/doc/
xosd:sf:124390|libxosd
xplsprinters:httpls:http://xorg.freedesktop.org/releases/individual/app/
xprehashprinterlist:httpls:http://xorg.freedesktop.org/releases/individual/app/
xpr:httpls:http://xorg.freedesktop.org/releases/individual/app/
xprop:httpls:http://xorg.freedesktop.org/releases/individual/app/
xrandr:httpls:http://xorg.freedesktop.org/releases/individual/app/
xrdb:httpls:http://xorg.freedesktop.org/releases/individual/app/
xrdp:httpls:https://github.com/neutrinolabs/xrdp/releases
xrefresh:httpls:http://xorg.freedesktop.org/releases/individual/app/
xrestop:httpls:http://downloads.yoctoproject.org/releases/xrestop/
xrx:httpls:http://xorg.freedesktop.org/releases/individual/app/
xsane:httpls:http://www.xsane.org/cgi-bin/sitexplorer.cgi?/download/
xscope:httpls:http://xorg.freedesktop.org/releases/individual/app/
xset:httpls:http://xorg.freedesktop.org/releases/individual/app/
xsetmode:httpls:http://xorg.freedesktop.org/releases/individual/app/
xsetpointer:httpls:http://xorg.freedesktop.org/releases/individual/app/
xsetroot:httpls:http://xorg.freedesktop.org/releases/individual/app/
xsm:httpls:http://xorg.freedesktop.org/releases/individual/app/
xstdcmap:httpls:http://xorg.freedesktop.org/releases/individual/app/
xtables-addons:sf:254159
xtrans:httpls:http://xorg.freedesktop.org/releases/individual/lib/
xtrap:httpls:http://xorg.freedesktop.org/releases/individual/app/
xvidtune:httpls:http://xorg.freedesktop.org/releases/individual/app/
xvinfo:httpls:http://xorg.freedesktop.org/releases/individual/app/
xwd:httpls:http://xorg.freedesktop.org/releases/individual/app/
xwininfo:httpls:http://xorg.freedesktop.org/releases/individual/app/
xwud:httpls:http://xorg.freedesktop.org/releases/individual/app/
xzgv:sf:203093|xzgv
yaml-cpp:google:yaml-cpp
zeitgeist-datahub:lp:zeitgeist-datahub
zeitgeist:lp:zeitgeist
zim:httpls:http://zim-wiki.org/downloads/
zlib:httpls:http://www.zlib.net/

# Moved to ftp.gnome.org
#byzanz:httpls:http://people.freedesktop.org/~company/byzanz/

## We are upstream, with no tarball
beagle-index:upstream:
build-compare:upstream:
bundle-lang-common:upstream:
bundle-lang-gnome-extras:upstream:
bundle-lang-gnome:upstream:
bundle-lang-kde:upstream:
bundle-lang-other:upstream:
desktop-data-SLED:upstream:
desktop-data-openSUSE:upstream:
desktop-translations:upstream:
dynamic-wallpapers-11x:upstream:
ggreeter:upstream:
gnome-patch-translation:upstream:
gnome-shell-search-provider-openSUSE-packages:upstream:
gos-wallpapers:upstream:
gtk2-themes:upstream:
libsolv:upstream:
libzypp:upstream:
libzypp-bindings:upstream:
libzypp-testsuite-tools:upstream:
metacity-themes:upstream:
opt_gnome-compat:upstream:
tennebon-dynamic-wallpaper:upstream:
translation-update:upstream:
update-desktop-files:upstream:
yast2:upstream:
yast2-control-center-gnome:upstream:
zypp-plugin:upstream:
zypper:upstream:

##
## Upstream where our script doesn't work (with potential workarounds)
##
# gimp-gap: not good: we have the version in the URL...
gimp-gap:ftpls:ftp://ftp.gimp.org/pub/gimp/plug-ins/v2.6/gap/
# iso-codes: Ugly workaround, as the ftp is not accessible from the server :/ Should be: iso-codes:ftpls:ftp://pkg-isocodes.alioth.debian.org/pub/pkg-isocodes/
iso-codes:httpls:http://translationproject.org/domain/iso_4217.html
# memphis: trac doesn't work. Should be: memphis:trac:http://trac.openstreetmap.ch/trac/memphis/downloads
memphis:httpls:http://trac.openstreetmap.ch/trac/memphis/
# pypoppler: unfortunately, it doesn't work: the rdf has no information, so we use httpls instead for now: pypoppler:lp:poppler-python
pypoppler:httpls:https://launchpad.net/poppler-python/+download
# xdelta: tarball names are not standard
xdelta:google:xdelta|xdelta
#FIXME opal http://sourceforge.net/project/showfiles.php?group_id=204472
#FIXME ptlib http://sourceforge.net/project/showfiles.php?group_id=204472

##
## Upstream where the old page is dead
##
## QtCurve-Gtk3: Doesn't exist anymore?
#QtCurve-Gtk3:httpls:http://www.kde-look.org/content/download.php?content=40492&id=4
## dopi is dead: no upstream homepage anymore
#dopi:httpls:http://www.snorp.net/files/dopi/
## Server is down
#metatheme-Sonar:httpls:http://forgeftp.novell.com/opensuse-art/openSUSE11.2/metatheme/
#metatheme-gilouche:httpls:http://forgeftp.novell.com/opensuse-art/openSUSE11.1/metatheme/
#opensuse-font-fifth-leg:httpls:http://forgeftp.novell.com/opensuse-art/openSUSE11.2/fonts/

##
## Packages that we removed
##
## clutter-cairo: also has no real homepage anymore
# clutter-cairo:subdirhttpls:http://clutter-project.org/sources/clutter-cairo/
# gst-pulse:httpls:http://0pointer.de/lennart/projects/gst-pulse/
## last-exit is dead: no upstream homepage anymore
# last-exit:httpls:http://lastexit-player.org/releases/
# libsvg:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
# libsvg-cairo:dualhttpls:http://cairographics.org/snapshots/|http://cairographics.org/releases/
# pysqlite:subdirhttpls:http://oss.itsystementwicklung.de/download/pysqlite/
070701000000340000A1FF000000000000000000000001654A0F6A00000011000000000000000000000000000000000000003300000000osc-plugin-collab-0.104+30/server/upstream/util.py../obs-db/util.py07070100000035000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000002600000000osc-plugin-collab-0.104+30/server/web07070100000036000081ED0000000000000000000000016548EB8C00005D1A000000000000000000000000000000000000002D00000000osc-plugin-collab-0.104+30/server/web/api.py#!/usr/bin/env python
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import cStringIO
import gzip
import re
import sqlite3

import cgi

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

from libdissector import config
from libdissector import libdbcore
from libdissector import libinfoxml

if config.cgitb:
    import cgitb; cgitb.enable()

# Database containing metadata. Can be created if needed.
METADATA_DBFILE = os.path.join(config.datadir, 'metadata.db')

# Protocol version is X.Y.
#  + when breaking compatibility in the XML, then increase X and reset Y to 0.
#  + when adding a new feature in a compatible way, increase Y.
PROTOCOL_MAJOR = 0
PROTOCOL_MINOR = 2
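
# For example, with the server at protocol 0.2, a client announcing
# version=0.1 or version=0.2 is accepted, while version=0.3 or version=1.0
# is rejected by handle_args() below as an unknown protocol version.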

#######################################################################


class ApiOutput:

    def __init__(self):
        self.root = ET.Element('api')
        self.root.set('version', '%d.%d' % (PROTOCOL_MAJOR, PROTOCOL_MINOR))
        self.result = None
        self.compress = False

    def set_compress(self, compress):
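        # Compression is only honoured when the client advertises
        # 'Accept-Encoding: gzip' (anywhere in its list of encodings).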
        can_compress = False
        if os.environ.has_key('HTTP_ACCEPT_ENCODING'):
            accepted = os.environ['HTTP_ACCEPT_ENCODING'].split(',')
            accepted = [ item.strip() for item in accepted ]
            can_compress = 'gzip' in accepted

        self.compress = can_compress and compress

    def set_result(self, ok = True, detail = ''):
        if self.result is None:
            self.result = ET.SubElement(self.root, 'result')

        if ok:
            self.result.set('ok', 'true')
        else:
            self.result.set('ok', 'false')
        if detail:
            self.result.text = detail

    def add_node(self, node):
        self.root.append(node)

    def _output(self):
        print 'Content-type: text/xml'
        print

        ET.ElementTree(self.root).write(sys.stdout)

    def _output_compressed(self):
        # Thanks to http://www.xhaus.com/alan/python/httpcomp.html
        zbuf = cStringIO.StringIO()
        zfile = gzip.GzipFile(mode = 'wb', fileobj = zbuf)
        ET.ElementTree(self.root).write(zfile)
        zfile.close()
        compressed = zbuf.getvalue()

        print 'Content-type: text/xml'
        print 'Content-Encoding: gzip'
        print 'Content-Length: %d' % len(compressed)
        print

        sys.stdout.write(compressed)

    def output(self):
        if self.compress:
            self._output_compressed()
        else:
            self._output()


#######################################################################


class ApiGeneric:

    def __init__(self, output, protocol, args, form):
        self.output = output
        self.protocol = protocol
        self.args = args
        self.form = form
        self.db = libdbcore.ObsDb()

    def __del__(self):
        del self.db

    def _find_project_for_package(self, package, projects):
        '''
            Find the first project in the list of projects containing the
            specified package.
        '''
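        # Example (with a hypothetical package name): given projects
        # ['GNOME:Factory', 'openSUSE:Factory'], a package present in both
        # is reported as belonging to 'GNOME:Factory', since list order wins.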
        query = 'SELECT COUNT(*) FROM %s, %s WHERE %s.name = ? AND %s.name = ? AND %s.project = %s.id;' % (libdbcore.table_project, libdbcore.table_srcpackage, libdbcore.table_project, libdbcore.table_srcpackage, libdbcore.table_srcpackage, libdbcore.table_project)
        for project in projects:
            self.db.cursor.execute(query, (project, package))
            row = self.db.cursor.fetchone()
            if row[0] != 0:
                return project

        return None

    def _package_exists(self, project, package):
        '''
            Check whether a package exists.
        '''
        query = 'SELECT version FROM %s, %s WHERE %s.name = ? AND %s.name = ? AND %s.project = %s.id;' % (libdbcore.table_project, libdbcore.table_srcpackage, libdbcore.table_project, libdbcore.table_srcpackage, libdbcore.table_srcpackage, libdbcore.table_project)
        self.db.cursor.execute(query, (project, package))
        row = self.db.cursor.fetchone()
        return row is not None

    def _find_devel_package(self, project, package):
        '''
            Find the final devel package of the specified package, following
            the devel chain.
        '''
        query = 'SELECT devel_project, devel_package FROM %s, %s WHERE %s.name = ? AND %s.name = ? AND %s.project = %s.id;' % (libdbcore.table_project, libdbcore.table_srcpackage, libdbcore.table_project, libdbcore.table_srcpackage, libdbcore.table_srcpackage, libdbcore.table_project)
        while True:
            self.db.cursor.execute(query, (project, package))
            row = self.db.cursor.fetchone()
            if not row:
                return (project, package)

            devel_project = row['devel_project']
            devel_package = row['devel_package']

            if not devel_project:
                return (project, package)

            project = devel_project
            package = devel_package or package

    def _parse_standard_args(self, paths):
        '''
            Parse a path that is in the form of either of the following:
              + <project>
              + <project>/<package>
              + <package>?project=aa&project=...
        '''
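        # Illustrative mappings (package names are hypothetical):
        #   ['']                          -> (True, None, None)
        #   ['openSUSE:Factory']          -> (True, 'openSUSE:Factory', None)
        #   ['openSUSE:Factory', 'gedit'] -> (True, 'openSUSE:Factory', 'gedit')
        #   ['gedit'] + ?project=...      -> resolved via _find_project_for_package()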
        if len(paths) == 1 and not paths[0]:
            return (True, None, None)
        elif len(paths) == 1 or (len(paths) == 2 and not paths[1]):
            projects = self.form.getlist('project')
            if projects:
                package = paths[0]
                project = self._find_project_for_package(package, projects)
                if project:
                    return (True, project, package)
                else:
                    self.output.set_result(False, 'Nonexistent package: %s' % package)
                    return (False, None, None)
            else:
                project = paths[0]
                return (True, project, None)
        else:
            project = paths[0]
            package = paths[1]
            return (True, project, package)

    def run(self):
        pass


#######################################################################


class ApiInfo(ApiGeneric):
    '''
        api/info
        api/info/<project>
        api/info/<project>/<package>

        api/info/<package>?project=aa&project=...
    '''

    def _list_projects(self):
        self.db.cursor.execute('''SELECT name FROM %s ORDER BY name;''' % libdbcore.table_project)
        for row in self.db.cursor:
            node = ET.Element('project')
            node.set('name', row['name'])
            self.output.add_node(node)

    def _list_project(self, project):
        info = libinfoxml.InfoXml(self.db)
        try:
            node = info.get_project_node(project)
            self.output.add_node(node)
            self.output.set_compress(True)
        except libinfoxml.InfoXmlException, e:
            self.output.set_result(False, e.msg)

    def _list_package(self, project, package):
        info = libinfoxml.InfoXml(self.db)
        try:
            prj_node = info.get_project_node(project, False)
            pkg_node = info.get_package_node(project, package)
            prj_node.append(pkg_node)
            self.output.add_node(prj_node)
        except libinfoxml.InfoXmlException, e:
            self.output.set_result(False, e.msg)

    def run(self):
        paths = self.args.split('/')
        if len(paths) > 2:
            self.output.set_result(False, 'Too many arguments to "info" command')
            return

        (ok, project, package) = self._parse_standard_args(paths)
        if not ok:
            return

        if not project:
            self._list_projects()
        elif not package:
            self._list_project(project)
        else:
            self._list_package(project, package)


#######################################################################


class ApiPackageMetadata(ApiGeneric):
    '''
        api/<meta> (list)
        api/<meta>/<project> (list)
        api/<meta>/<project>/<package> (list)
        api/<meta>/<project>/<package>?cmd=list
        api/<meta>/<project>/<package>?cmd=set&user=<user>
        api/<meta>/<project>/<package>?cmd=unset&user=<user>

        api/<meta>?project=aa&project=... (list)
        api/<meta>/<package>?project=aa&project=...&cmd=...

        For all package-related commands, ignoredevel=1 or ignoredevel=true can
        be used to force the command to act on the package in this very
        project, instead of being redirected to its development package.

        Subclasses should:
          - set self.dbtable and self.command in __init__
          - override self._create_node()
          - override self._run_project_package_helper()
    '''
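
    # ApiReserve and ApiComment below are the two concrete subclasses
    # implementing this contract.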

    def __init__(self, output, protocol, args, form):
        ApiGeneric.__init__(self, output, protocol, args, form)

        self.dbmeta = None
        self.cursor = None

        # Should be overridden by subclass
        self.dbtable = ''
        self.command = ''

    def __del__(self):
        if self.cursor:
            self.cursor.close()
        if self.dbmeta:
            self.dbmeta.close()

    def _get_metadata_database(self):
        create = True
        if os.path.exists(METADATA_DBFILE):
            create = False
            if not os.access(METADATA_DBFILE, os.W_OK):
                self.output.set_result(False, 'Read-only database')
                return False
        else:
            dirname = os.path.dirname(METADATA_DBFILE)
            if not os.path.exists(dirname):
                os.makedirs(dirname)

        self.dbmeta = sqlite3.connect(METADATA_DBFILE)
        if not self.dbmeta:
            self.output.set_result(False, 'No access to database')
            return False

        self.dbmeta.row_factory = sqlite3.Row
        self.cursor = self.dbmeta.cursor()

        if create:
            # When adding a table here, update _prune_old_metadata() and
            # _check_no_abuse() to deal with them too.
            self.cursor.execute('''CREATE TABLE reserve (date TEXT, user TEXT, project TEXT, package TEXT);''')
            self.cursor.execute('''CREATE TABLE comment (date TEXT, user TEXT, project TEXT, package TEXT, comment TEXT);''')

        return True

    def _prune_old_metadata(self):
        # do not touch comments, since they might stay for good reasons
        self.cursor.execute('''DELETE FROM reserve WHERE datetime(date, '+36 hours') < datetime('now');''')
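        # i.e. a reservation silently expires 36 hours after it was set.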

    def _check_no_abuse(self):
        # just don't do anything if we have more than 200 entries in a table
        # (we're getting spammed)
        for table in [ 'reserve', 'comment' ]:
            self.cursor.execute('''SELECT COUNT(*) FROM %s;''' % table)

            row = self.cursor.fetchone()
            if not row or row[0] > 200:
                self.output.set_result(False, 'Database currently unavailable')
                return False

        return True

    def _create_node(self, row):
        ''' Should be overridden by subclass. '''
        # Note: row can be a sqlite3.Row or a dict
        return None

    def _list_all(self, projects = None):
        if projects:
            projects_where = ' OR '.join(['project = ?' for project in projects])
            self.cursor.execute('''SELECT * FROM %s WHERE %s ORDER BY project, package;''' % (self.dbtable, projects_where), projects)
        else:
            self.cursor.execute('''SELECT * FROM %s ORDER BY project, package;''' % self.dbtable)

        for row in self.cursor:
            node = self._create_node(row)
            if node is None:
                self.output.set_result(False, 'Internal server error')
                return
            self.output.add_node(node)

    def _run_project_package_helper(self, user, subcommand, project, package):
        ''' Should be overridden by subclass. '''
        return None

    def _run_project_package(self, project, package):
        ignore_devel = False
        if self.form.has_key('ignoredevel'):
            if self.form.getfirst('ignoredevel').lower() in [ '1', 'true' ]:
                ignore_devel = True

        if not ignore_devel:
            (project, package) = self._find_devel_package(project, package)

        if not self._package_exists(project, package):
            self.output.set_result(False, 'Nonexistent package: %s/%s' % (project, package))
            return

        if self.form.has_key('cmd'):
            cmd = self.form.getfirst('cmd')
        else:
            cmd = 'list'

        if cmd not in [ 'list', 'set', 'unset' ]:
            self.output.set_result(False, 'Unknown "%s" subcommand: %s' % (self.command, cmd))
            return

        if self.form.has_key('user'):
            user = self.form.getfirst('user')
        else:
            user = None

        if cmd in [ 'set', 'unset' ] and not user:
            self.output.set_result(False, 'No user specified')
            return

        pseudorow = self._run_project_package_helper(user, cmd, project, package)

        self.dbmeta.commit()
        node = self._create_node(pseudorow)
        if node is None:
            self.output.set_result(False, 'Internal server error')
            return
        self.output.add_node(node)

    def run(self):
        if not self._get_metadata_database():
            return

        # automatically remove old metadata
        self._prune_old_metadata()

        if not self._check_no_abuse():
            return

        paths = self.args.split('/')
        if len(paths) > 2:
            self.output.set_result(False, 'Too many arguments to "%s" command' % self.command)
            return

        (ok, project, package) = self._parse_standard_args(paths)
        if not ok:
            return

        if not project:
            projects = self.form.getlist('project')
            self._list_all(projects)
        elif not package:
            self._list_all((project,))
        else:
            self._run_project_package(project, package)


#######################################################################


class ApiReserve(ApiPackageMetadata):
    '''
        See ApiPackageMetadata comment, with <meta> == reserve
    '''

    def __init__(self, output, protocol, args, form):
        ApiPackageMetadata.__init__(self, output, protocol, args, form)

        self.dbtable = 'reserve'
        self.command = 'reserve'

    def _create_node(self, row):
        # Note: row can be a sqlite3.Row or a dict
        keys = row.keys()
        if not ('project' in keys and 'package' in keys):
            return None

        project = row['project']
        package = row['package']
        if 'user' in keys:
            user = row['user']
        else:
            user = None

        node = ET.Element('reservation')
        node.set('project', project)
        node.set('package', package)
        if user:
            node.set('user', user)
        return node

    def _run_project_package_helper(self, user, subcommand, project, package):
        self.cursor.execute('''SELECT user FROM reserve WHERE project = ? AND package = ?;''', (project, package,))
        row = self.cursor.fetchone()
        if row:
            reserved_by = row['user']
        else:
            reserved_by = None

        if subcommand == 'list':
            # we just want the reservation node
            pass
        elif subcommand == 'set':
            if reserved_by:
                self.output.set_result(False, 'Package already reserved by %s' % reserved_by)
            else:
                self.cursor.execute('''INSERT INTO reserve VALUES (datetime('now'), ?, ?, ?);''', (user, project, package))
                reserved_by = user
        elif subcommand == 'unset':
            if not reserved_by:
                self.output.set_result(False, 'Package not reserved')
            elif reserved_by != user:
                self.output.set_result(False, 'Package reserved by %s' % reserved_by)
            else:
                self.cursor.execute('''DELETE FROM reserve WHERE user = ? AND project = ? AND package = ?''', (user, project, package))
                reserved_by = None

        pseudorow = {}
        pseudorow['project'] = project
        pseudorow['package'] = package
        if reserved_by:
            pseudorow['user'] = reserved_by

        return pseudorow


#######################################################################


class ApiComment(ApiPackageMetadata):
    '''
        See ApiPackageMetadata comment, with <meta> == comment
    '''

    def __init__(self, output, protocol, args, form):
        ApiPackageMetadata.__init__(self, output, protocol, args, form)

        self.dbtable = 'comment'
        self.command = 'comment'

    def _create_node(self, row):
        # Note: row can be a sqlite3.Row or a dict
        keys = row.keys()
        if not ('project' in keys and 'package' in keys):
            return None

        project = row['project']
        package = row['package']
        date = None
        user = None
        comment = None
        if 'date' in keys:
            date = row['date']
        if 'user' in keys:
            user = row['user']
        if 'comment' in keys:
            comment = row['comment']

        node = ET.Element('comment')
        node.set('project', project)
        node.set('package', package)
        if date:
            node.set('date', date)
        if user:
            node.set('user', user)
        if comment:
            node.text = comment
        return node

    def _run_project_package_helper(self, user, subcommand, project, package):
        if self.form.has_key('comment'):
            form_comment = self.form.getfirst('comment')
        else:
            form_comment = None

        self.cursor.execute('''SELECT user, comment, date FROM comment WHERE project = ? AND package = ?;''', (project, package,))
        row = self.cursor.fetchone()
        if row:
            commented_by = row['user']
            comment = row['comment']
            date = row['date']
        else:
            commented_by = None
            comment = None
            date = None

        if subcommand == 'list':
            # we just want the comment node
            pass
        elif subcommand == 'set':
            if commented_by:
                self.output.set_result(False, 'Package already commented by %s' % commented_by)
            else:
                if not form_comment:
                    self.output.set_result(False, 'No comment provided')
                elif len(form_comment) > 1000:
                    self.output.set_result(False, 'Provided comment is too long')
                else:
                    self.cursor.execute('''INSERT INTO comment VALUES (datetime('now'), ?, ?, ?, ?);''', (user, project, package, form_comment))
                    commented_by = user
                    comment = form_comment
                    # we do a query to get the right format for the date
                    self.cursor.execute('''SELECT datetime('now');''')
                    row = self.cursor.fetchone()
                    date = row[0]
        elif subcommand == 'unset':
            if not commented_by:
                self.output.set_result(False, 'Package not commented')
            elif commented_by != user:
                self.output.set_result(False, 'Package commented by %s' % commented_by)
            else:
                self.cursor.execute('''DELETE FROM comment WHERE user = ? AND project = ? AND package = ?''', (user, project, package))
                commented_by = None

        pseudorow = {}
        pseudorow['project'] = project
        pseudorow['package'] = package
        if date:
            pseudorow['date'] = date
        if commented_by:
            pseudorow['user'] = commented_by
        if comment:
            pseudorow['comment'] = comment

        return pseudorow


#######################################################################


def handle_args(output, path, form):
    paths = path.split('/', 1)
    if len(paths) == 1:
        command = paths[0]
        args = ''
    else:
        (command, args) = paths

    if form.has_key('version'):
        client_version = form.getfirst('version')
    else:
        client_version = '0.1'

    client_version_items = client_version.split('.')
    for item in client_version_items:
        try:
            int(item)
        except ValueError:
            output.set_result(False, 'Invalid protocol version')
            return

    if len(client_version_items) != 2:
        output.set_result(False, 'Invalid format for protocol version')
        return

    protocol = (int(client_version_items[0]), int(client_version_items[1]))
    if protocol[0] > PROTOCOL_MAJOR or (protocol[0] == PROTOCOL_MAJOR and protocol[1] > PROTOCOL_MINOR):
        output.set_result(False, 'Protocol version requested is unknown')
        return

    # assume the result is successful at first :-)
    output.set_result()

    if not command:
        output.set_result(False, 'No command specified')
    elif command == 'info':
        try:
            info = ApiInfo(output, protocol, args, form)
            info.run()
        except libdbcore.ObsDbException, e:
            output.set_result(False, str(e))
    elif command == 'reserve':
        try:
            reserve = ApiReserve(output, protocol, args, form)
            reserve.run()
        except libdbcore.ObsDbException, e:
            output.set_result(False, str(e))
    elif command == 'comment':
        try:
            comment = ApiComment(output, protocol, args, form)
            comment.run()
        except libdbcore.ObsDbException, e:
            output.set_result(False, str(e))
    else:
        output.set_result(False, 'Unknown command "%s"' % command)


#######################################################################

if os.environ.has_key('PATH_INFO'):
    path = os.environ['PATH_INFO']
    # remove consecutive slashes
    path = re.sub('//+', '/', path)
    # remove first slash
    path = path[1:]
else:
    path = ''

output = ApiOutput()

form = cgi.FieldStorage()
handle_args(output, path, form)

output.output()
07070100000037000081A40000000000000000000000016548EB8C00000000000000000000000000000000000000000000003100000000osc-plugin-collab-0.104+30/server/web/index.html07070100000038000081ED0000000000000000000000016548EB8C00000C3F000000000000000000000000000000000000002F00000000osc-plugin-collab-0.104+30/server/web/index.py#!/usr/bin/env python
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#


from libdissector import config
from libdissector import libhttp

if config.cgitb:
    import cgitb; cgitb.enable()


#######################################################################


libhttp.print_html_header()
libhttp.print_header("Our playground for openSUSE")

print '''
<h2>Our playground for openSUSE</h2>

<p>If you're simply interested in browsing the openSUSE source, head to the <a href="https://build.opensuse.org/">openSUSE instance</a> of the <a href="http://open-build-service.org/">Open Build Service</a>! No need to log in, just browse the packages; the most interesting projects are the ones starting with "openSUSE:", like <a href="https://build.opensuse.org/project/show?project=openSUSE%3AFactory">openSUSE:Factory</a>.</p>

<p>The original idea behind this playground was to analyze the packages of the GNOME team to know what kind of work was needed, but it has evolved and it's now a good place to get some data about all packages in openSUSE.</p>

<p>This playground also hosts a service that makes it easier to collaborate within the openSUSE community, via <a href="http://en.opensuse.org/openSUSE:Osc_Collab">osc collab</a>. This service works for all packages part of <a href="https://build.opensuse.org/project/show?project=openSUSE%3AFactory">openSUSE:Factory</a>!</p>

<p>On the right side, you can find a few links that are, hopefully, self-explanatory :-)</p>
'''

libhttp.print_foot()
07070100000039000041ED0000000000000000000000026548EB8C00000000000000000000000000000000000000000000003300000000osc-plugin-collab-0.104+30/server/web/libdissector0707010000003A000081A40000000000000000000000016548EB8C0000000E000000000000000000000000000000000000003D00000000osc-plugin-collab-0.104+30/server/web/libdissector/.htaccessDeny from all
0707010000003B000081A40000000000000000000000016548EB8C00000000000000000000000000000000000000000000003F00000000osc-plugin-collab-0.104+30/server/web/libdissector/__init__.py0707010000003C000081A40000000000000000000000016548EB8C00001065000000000000000000000000000000000000004300000000osc-plugin-collab-0.104+30/server/web/libdissector/buildservice.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import urllib2

from cgi import escape
from urllib import urlencode
from urlparse import urlsplit, urlunsplit

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

import config


#######################################################################


class BuildServiceException(Exception):

    def __init__(self, value):
        self.msg = value

    def __str__(self):
        return self.msg


#######################################################################


def get_source_url(project, package, file = None, rev = None, do_escape = False):
    if do_escape:
        project = escape(project)
        package = escape(package)
        if file:
            file = escape(file)

    (scheme, netloc) = urlsplit(config.apiurl)[0:2]
    path = '/'.join(('public', 'source', project, package))
    if file:
        path = '/'.join((path, file))
    if rev:
        query = urlencode({'rev': rev})
    else:
        query = None

    url = urlunsplit((scheme, netloc, path, query, ''))

    return url
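
# A usage sketch, assuming the default apiurl and a hypothetical package:
#   get_source_url('openSUSE:Factory', 'gedit', file = 'gedit.spec', rev = '42')
#   -> 'https://api.opensuse.org/public/source/openSUSE:Factory/gedit/gedit.spec?rev=42'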

def get_source_link(project, package, file = None, rev = None, do_escape = False, text = None, title = None):
    url = get_source_url(project, package, file, rev, do_escape)
    if title:
        title_attr = ' title="%s"' % escape(title)
    else:
        title_attr = ''

    if not text:
        text = file or package
    text = escape(text)

    return '<a href="%s"%s>%s</a>' % (url, title_attr, text)


#######################################################################


def fetch_package_content(project, package):
    url = get_source_url(project, package)
    url += '?expand=1'
    try:
        fd = urllib2.urlopen(url)
        directory = ET.parse(fd).getroot()

        linkinfo = directory.find('linkinfo')
        if linkinfo != None:
            srcmd5 = directory.get('srcmd5')
        else:
            srcmd5 = ''

        files = []
        for node in directory.findall('entry'):
            files.append(node.get('name'))

        fd.close()

        return (files, srcmd5)

    except urllib2.HTTPError, e:
        raise BuildServiceException('Error while fetching the content: %s' % (e.msg,))
    except urllib2.URLError, e:
        raise BuildServiceException('Error while fetching the content: %s' % (e,))
    except SyntaxError, e:
        raise BuildServiceException('Error while fetching the content: %s' % (e.msg,))

0707010000003D000081A40000000000000000000000016548EB8C000008C5000000000000000000000000000000000000004000000000osc-plugin-collab-0.104+30/server/web/libdissector/config.py.in# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

### Settings to change ###

# Where to store files?
## Example: '/tmp/obs-dissector'
datadir = ''
# IPs that are authorized to upload files
## Example: [ '10.0.0.1', '127.0.0.1' ]
upload_authorized_ips = [ ]
# Hosts that are authorized to upload files
## Example: [ 'magic.opensuse.org' ]
upload_authorized_hosts = [ ]

### Settings that should be fine by default ###

# Use cgitb?
cgitb = False
# API URL of the build service to use
apiurl = 'https://api.opensuse.org/'
# Default project to use when no project is specified
default_project = 'openSUSE:Factory'
0707010000003E000081A40000000000000000000000016548EB8C00000F08000000000000000000000000000000000000004000000000osc-plugin-collab-0.104+30/server/web/libdissector/libdbcore.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os

import sqlite3
import time

from stat import *

import config


#######################################################################


_db_file = os.path.join(config.datadir, 'obs.db')

table_file = 'file'
table_source = 'source'
table_patch = 'patch'
table_rpmlint = 'rpmlint'
table_package = 'package'
table_srcpackage = 'srcpackage'
table_project = 'project'


#######################################################################


pkg_query = 'SELECT %s.* FROM %s, %s WHERE %s.name = ? AND %s.name = ? AND %s.project = %s.id;' % (table_srcpackage, table_project, table_srcpackage, table_project, table_srcpackage, table_srcpackage, table_project)
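# With the table names above, pkg_query expands to:
#   SELECT srcpackage.* FROM project, srcpackage
#   WHERE project.name = ? AND srcpackage.name = ?
#   AND srcpackage.project = project.id;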


#######################################################################


def get_db_mtime(raw = False):
    mtime = time.gmtime(os.stat(_db_file)[ST_MTIME])

    if raw:
        return mtime
    else:
        return time.strftime('%d/%m/%Y (%H:%M UTC)', mtime)


#######################################################################


class ObsDbException(Exception):

    def __init__(self, value):
        self.msg = value

    def __str__(self):
        return self.msg


#######################################################################


class ObsDb:

    def __init__(self):
        if not os.path.exists(_db_file):
            raise ObsDbException('Database %s unavailable' % (os.path.abspath(_db_file)))

        self.conn = sqlite3.connect(_db_file)
        if not self.conn:
            raise ObsDbException('Database unavailable')

        self.conn.row_factory = sqlite3.Row
        self.cursor = self.conn.cursor()
        self.cursor.execute('''SELECT * FROM %s;''' % 'db_version')
        row = self.cursor.fetchone()
        if row:
            self.db_major = row['major']
            self.db_minor = row['minor']
        else:
            self.db_major = -1
            self.db_minor = -1
    
    def __del__(self):
        if self.cursor:
            self.cursor.close()
        if self.conn:
            self.conn.close()

    def get_db_version(self):
        return (self.db_major, self.db_minor)

    def cursor_new(self):
        return self.conn.cursor()
0707010000003F000081A40000000000000000000000016548EB8C000009C7000000000000000000000000000000000000004000000000osc-plugin-collab-0.104+30/server/web/libdissector/libdbhtml.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
from cgi import escape

import libdbcore

def get_project_selector(current_project = None, db = None):
    if not db:
        db = libdbcore.ObsDb()
    db.cursor.execute('''SELECT name FROM %s ORDER BY UPPER(name);''' % libdbcore.table_project)
    s = ''
    s += '<form action="%s">\n' % escape(os.environ['SCRIPT_NAME'])
    s += '<p>\n'
    s += 'Project:\n'
    s += '<select name="project">\n'
    for row in db.cursor:
        escaped_name = escape(row['name'])
        if row['name'] == current_project:
            selected = ' selected'
        else:
            selected = ''
        s += '<option value="%s"%s>%s</option>\n' % (escaped_name, selected, escaped_name)
    s += '</select>\n'
    s += '<input type="submit" value="choose">\n'
    s += '</p>\n'
    s += '</form>\n'

    return s
07070100000040000081A40000000000000000000000016548EB8C00001E08000000000000000000000000000000000000003E00000000osc-plugin-collab-0.104+30/server/web/libdissector/libhttp.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

#
# The included HTML code here is the design from the openSUSE project.
# FIXME: find the right copyright/license for it.
#

import config
import libdbcore

def print_text_header():
    print 'Content-type: text/plain'
    print

def print_xml_header():
    print 'Content-type: text/xml'
    print

def print_html_header():
    print 'Content-type: text/html'
    print

def print_header_raw(title):
    print '''<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
    <meta name="MSSmartTagsPreventParsing" content="TRUE" />
    <title>%s</title>
    </head>

<body>
''' % title

def print_foot_raw():
    print '''
</body>
</html>'''

def print_header(title=''):
    print '''<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en" dir="ltr">
 <head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  <meta name="robots" content="index,follow" />

  <link rel="stylesheet" href="http://static.opensuse.org/themes/bento/css/style.css" type="text/css" media="screen" title="All" charset="utf-8" />
  <link rel="stylesheet" href="http://static.opensuse.org/themes/bento/css/print.css" type="text/css" media="print" charset="utf-8">

  <script src="http://static.opensuse.org/themes/bento/js/jquery.js" type="text/javascript" charset="utf-8"></script>
  <script src="http://static.opensuse.org/themes/bento/js/l10n/global-navigation-data-en.js" type="text/javascript" charset="utf-8"></script>
  <script src="http://static.opensuse.org/themes/bento/js/global-navigation.js" type="text/javascript" charset="utf-8"></script>

  <link rel="icon" type="image/png" href="http://static.opensuse.org/themes/bento/images/favicon.png" />
  <title>%s</title>
 </head>

<body>
  <!-- Start: Header -->
  <div id="header">
    <div id="header-content" class="container_12">
      <a id="header-logo" href="./"><img src="http://static.opensuse.org/themes/bento/images/header-logo.png" width="46" height="26" alt="Header Logo" /></a>
      <ul id="global-navigation">
        <li id="item-downloads"><a href="http://en.opensuse.org/openSUSE:Browse#downloads">Downloads</a></li>
        <li id="item-support"><a href="http://en.opensuse.org/openSUSE:Browse#support">Support</a></li>
        <li id="item-community"><a href="http://en.opensuse.org/openSUSE:Browse#community">Community</a></li>
        <li id="item-development"><a href="http://en.opensuse.org/openSUSE:Browse#development">Development</a></li>
      </ul>
    </div>
  </div>
  <!-- End: Header -->

  <!-- Start: Main Content Area -->
  <div id="content" class="container_16 content-wrapper">

    <div class="box box-shadow grid_12 alpha">

        <!-- Begin Content Area -->
''' % title

def print_foot(additional_box = ''):
    timestr = libdbcore.get_db_mtime()
    print '''
      <!-- End Content Area -->
    </div>

    <div class="column grid_4 omega">

      <div id="some_other_content" class=" box box-shadow alpha clear-both navigation">
        <h2 class="box-header">Navigation</h2>
          <ul class="navigation">
            <li><a href="./obs">Packages Status</a></li>
            <li><a href="./patch">Patches Status</a></li>
            <!--<li><a href="./rpmlint">Rpmlint Status</a></li>-->
          </ul>
      </div>

%s

    </div>

  </div>

  <!-- Start: included footer part -->
  <div id="footer" class="container_12">
    <!-- TODO: add content -->
    <div id="footer-legal" class="border-top grid_12">
      <p>
        This is still a prototype and is not officially endorsed by the openSUSE project.
        <br />
        Database last updated on %s
      </p>
    </div>
  </div>
 </body>
</html>''' % (additional_box, timestr)

# At some point we wanted to have this too:

'''
       <div class="green_box">
        <div class="box_top_row">
         <div class="box_left"></div>
         <div class="box_right"></div>
        </div>
        <div class="box_title_row">
         <div class="box_title">
          Statistics
         </div>
        </div>
        <div class="box_content">
         <ul class="navlist">
          <li>General stats</li>
          <li>Graphes</li>
         </ul>
        </div>
        <div class="box_bottom_row">
         <div class="box_left"></div>
         <div class="box_right"></div>
        </div>
       </div>
       <br />
'''

'''
       <div class="green_box">
        <div class="box_top_row">
         <div class="box_left"></div>
         <div class="box_right"></div>
        </div>
        <div class="box_title_row">
         <div class="box_title">
          Reports
         </div>
        </div>
        <div class="box_content">
         <ul class="navlist">
          <li>Build Failures</li>
          <li>Tagging Progress</li>
          <li>Rpmlint Progress</li>
          <li>Bug Filing Status</li>
          <li>Patch Rebase Status</li>
          <li>Patch SLE Status</li>
         </ul>
        </div>
        <div class="box_bottom_row">
         <div class="box_left"></div>
         <div class="box_right"></div>
        </div>
       </div>
       <br />
'''

def get_arg(form, name, default_value = None):
    if form.has_key(name):
        return form.getfirst(name)
    else:
        return default_value

def get_arg_bool(form, name, default_value = False):
    if default_value:
        default = '1'
    else:
        default = '0'

    value = get_arg(form, name, default)
    try:
        return (int(value) == 1)
    except ValueError:
        return default_value
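
# For example, get_arg_bool(form, 'ignoredevel') returns True for
# ignoredevel=1, False for ignoredevel=0, and falls back to the default
# for any value that is not an integer (e.g. 'true').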

def get_project(form):
    return get_arg(form, 'project', config.default_project)

def get_srcpackage(form):
    ret = get_arg(form, 'srcpackage')
    if not ret:
        ret = get_arg(form, 'package')
    return ret
07070100000041000081A40000000000000000000000016548EB8C0000305D000000000000000000000000000000000000004100000000osc-plugin-collab-0.104+30/server/web/libdissector/libinfoxml.py# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2009-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#

import os
import sys

import sqlite3
import tempfile

try:
    from lxml import etree as ET
except ImportError:
    try:
        from xml.etree import cElementTree as ET
    except ImportError:
        import cElementTree as ET

import config
import libdbcore

# Directory containing XML caches for projects data
XML_CACHE_DIR = os.path.join(config.datadir, 'xml')


#######################################################################


class InfoXmlException(Exception):

    def __init__(self, value):
        self.msg = value

    def __str__(self):
        return self.msg


#######################################################################


class InfoXml:

    version_query = 'SELECT %s.version FROM %s, %s WHERE %s.name = ? AND %s.name = ? AND %s.project = %s.id ;' % (
        libdbcore.table_srcpackage,
        libdbcore.table_project, libdbcore.table_srcpackage,
        libdbcore.table_project, libdbcore.table_srcpackage,
        libdbcore.table_srcpackage, libdbcore.table_project)
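    # Assuming libdbcore names the tables 'project' and 'srcpackage' (an
    # assumption for illustration), the query above expands to roughly:
    #   SELECT srcpackage.version FROM project, srcpackage
    #   WHERE project.name = ? AND srcpackage.name = ?
    #   AND srcpackage.project = project.id;
    # It is executed with (project_name, package_name) as bound parameters.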

    def __init__(self, obsdb = None):
        if not obsdb:
            self.obsdb = libdbcore.ObsDb()
        else:
            if not isinstance(obsdb, libdbcore.ObsDb):
                raise TypeError, 'obsdb must be an ObsDb instance'
            self.obsdb = obsdb

        self.cursor = self.obsdb.cursor_new()
        self.cursor_helper = self.obsdb.cursor_new()

        self.cache_dir = XML_CACHE_DIR
        self.version_cache = None

    def _find_version_for_sql(self, project, package):
        self.cursor_helper.execute(self.version_query, (project, package))
        row = self.cursor_helper.fetchone()
        if row:
            return row['version']
        else:
            return None

    def _find_version_for(self, project, package):
        # We have a cache here because we want to avoid doing SQL queries.
        # See also comment in create_cache()
        if self.version_cache is None:
            return self._find_version_for_sql(project, package)

        try:
            return self.version_cache[project][package]
        except KeyError:
            return None

    def _get_package_node_from_row(self, row, ignore_upstream, default_parent_project):
        name = row['name']
        version = row['version']
        link_project = row['link_project']
        link_package = row['link_package']
        devel_project = row['devel_project']
        devel_package = row['devel_package']
        upstream_version = row['upstream_version']
        upstream_url = row['upstream_url']
        is_link = row['is_obs_link']
        has_delta = row['obs_link_has_delta']
        error = row['obs_error']
        error_details = row['obs_error_details']

        parent_version = None
        devel_version = None

        package = ET.Element('package')
        package.set('name', name)

        if link_project:
            if (link_project != default_parent_project) or (link_package and link_package != name):
                node = ET.SubElement(package, 'parent')
                node.set('project', link_project)
                if link_package and link_package != name:
                    node.set('package', link_package)
            parent_version = self._find_version_for(link_project, link_package or name)
        elif default_parent_project:
            parent_version = self._find_version_for(default_parent_project, name)

        if devel_project:
            node = ET.SubElement(package, 'devel')
            node.set('project', devel_project)
            if devel_package and devel_package != name:
                node.set('package', devel_package)
            devel_version = self._find_version_for(devel_project, devel_package or name)

        if version or upstream_version or parent_version or devel_version:
            node = ET.SubElement(package, 'version')
            if version:
                node.set('current', version)
            if upstream_version:
                node.set('upstream', upstream_version)
            if parent_version:
                node.set('parent', parent_version)
            if devel_version:
                node.set('devel', devel_version)

        if upstream_url:
            upstream = ET.SubElement(package, 'upstream')
            node = ET.SubElement(upstream, 'url')
            node.text = upstream_url

        if is_link:
            node = ET.SubElement(package, 'link')
            if has_delta:
                node.set('delta', 'true')
            else:
                node.set('delta', 'false')
        # deep delta (ie, delta in non-link packages)
        elif has_delta:
            node = ET.SubElement(package, 'delta')

        if error:
            node = ET.SubElement(package, 'error')
            node.set('type', error)
            if error_details:
                node.text = error_details

        return package
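
    # The node built above serializes to something like the following
    # (attributes and subelements appear only when the corresponding data
    # exists; the values here are made up):
    #   <package name="gtk2">
    #    <parent project="openSUSE:Factory"/>
    #    <devel project="GNOME:Factory"/>
    #    <version current="2.20.0" upstream="2.20.1" parent="2.20.0"/>
    #    <upstream><url>http://.../gtk2.tar.bz2</url></upstream>
    #    <link delta="true"/>
    #   </package>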

    def get_package_node(self, project, package):
        self.cursor.execute(libdbcore.pkg_query, (project, package))
        row = self.cursor.fetchone()
        
        if not row:
            raise InfoXmlException('No such package in project %s: %s' % (project, package))

        self.cursor_helper.execute('''SELECT * FROM %s WHERE name = ?;''' % libdbcore.table_project, (project,))

        row_helper = self.cursor_helper.fetchone()
        parent_project = row_helper['parent']
        ignore_upstream = row_helper['ignore_upstream']

        return self._get_package_node_from_row(row, ignore_upstream, parent_project)

    def get_project_node(self, project, filled = True, write_cache = False):
        if filled:
            prj_node = self._read_cache(project)
            if prj_node is not None:
                return prj_node

        self.cursor.execute('''SELECT * FROM %s WHERE name = ?;''' % libdbcore.table_project, (project,))
        row = self.cursor.fetchone()

        if not row:
            raise InfoXmlException('No such project: %s' % project)

        project_id = row['id']
        parent_project = row['parent']
        ignore_upstream = row['ignore_upstream']

        prj_node = ET.Element('project')
        prj_node.set('name', project)
        if parent_project:
            prj_node.set('parent', parent_project)
        if ignore_upstream:
            prj_node.set('ignore_upstream', 'true')

        if not filled:
            return prj_node

        should_exist = {}
        self.cursor.execute('''SELECT A.name AS parent_project, B.name AS parent_package, B.devel_package FROM %s AS A, %s AS B WHERE A.id = B.project AND devel_project = ? ORDER BY A.name, B.name;''' % (libdbcore.table_project, libdbcore.table_srcpackage), (project,))
        for row in self.cursor:
            should_parent_project = row['parent_project']
            should_parent_package = row['parent_package']
            should_devel_package = row['devel_package'] or should_parent_package
            should_exist[should_devel_package] = (should_parent_project, should_parent_package)

        self.cursor.execute('''SELECT * FROM %s WHERE project = ? ORDER BY name;''' % libdbcore.table_srcpackage, (project_id,))
        for row in self.cursor:
            pkg_node = self._get_package_node_from_row(row, ignore_upstream, parent_project)
            prj_node.append(pkg_node)
            try:
                del should_exist[row['name']]
            except KeyError:
                pass

        if len(should_exist) > 0:
            missing_node = ET.Element('missing')
            for (should_package_name, (should_parent_project, should_parent_package)) in should_exist.iteritems():
                missing_pkg_node = ET.Element('package')

                missing_pkg_node.set('name', should_package_name)
                missing_pkg_node.set('parent_project', should_parent_project)
                if should_package_name != should_parent_package:
                    missing_pkg_node.set('parent_package', should_parent_package)

                missing_node.append(missing_pkg_node)

            prj_node.append(missing_node)

        if write_cache:
            self._write_cache(project, prj_node)

        return prj_node

    def _get_cache_path(self, project):
        return os.path.join(self.cache_dir, project + '.xml')

    def _read_cache(self, project):
        cache = self._get_cache_path(project)

        try:
            if os.path.exists(cache):
                return ET.parse(cache).getroot()
        except:
            pass

        return None

    def _write_cache(self, project, node):
        cache = self._get_cache_path(project)

        try:
            if os.path.exists(cache):
                return

            dirname = os.path.dirname(cache)
            if not os.path.exists(dirname):
                os.makedirs(dirname)

            if not os.access(dirname, os.W_OK):
                return

            tree = ET.ElementTree(node)
            tree.write(cache)
        except:
            pass

    def create_cache(self, verbose = False):
        try:
            if not os.path.exists(self.cache_dir):
                os.makedirs(self.cache_dir)
        except Exception, e:
            raise InfoXmlException('Cannot create cache directory (%s).' % e)

        if not os.access(self.cache_dir, os.W_OK):
            raise InfoXmlException('No write access to cache directory (%s).' % self.cache_dir)

        self.cursor.execute('''SELECT name FROM %s;''' % libdbcore.table_project)
        # Fetch all project names first, because the cursor will be re-used below
        projects = [ row['name'] for row in self.cursor ]

        # Create the cache containing version of all packages. This will help
        # us avoid doing many small SQL queries, which is really slow.
        #
        # The main difference is that we do one SQL query + many hash accesses,
        # vs 2*(total number of packages in the database) SQL queries. On a
        # test run, the difference results in ~1min15s vs ~5s. That's a 15x
        # time win.
        self.version_cache = {}
        for project in projects:
            self.version_cache[project] = {}
        self.cursor.execute('''SELECT A.name, A.version, B.name AS project FROM %s AS A, %s AS B WHERE A.project = B.id;''' % (libdbcore.table_srcpackage, libdbcore.table_project))
        for row in self.cursor:
            self.version_cache[row['project']][row['name']] = row['version']

        for project in projects:
            self.get_project_node(project, write_cache = True)
            if verbose:
                print 'Wrote cache for %s.' % project

#if __name__ == '__main__':
#    try:
#        info = InfoXml()
#        info.cache_dir = XML_CACHE_DIR + '-test'
#        info.create_cache()
#    except KeyboardInterrupt:
#        pass
07070100000042000081ED0000000000000000000000016548EB8C00001BD7000000000000000000000000000000000000003400000000osc-plugin-collab-0.104+30/server/web/obs-upload.py#!/usr/bin/env python
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#


import os
import sys

import base64
import cgi
import socket
import shutil

from libdissector import config
from libdissector import libinfoxml

# Upload with:
# # Upload the db
# curl --silent --show-error -F dbfile=@/path/to/obs.db http://server/path/obs-upload.py
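# # Upload the db under an explicit name (only obs.db is accepted below)
# curl --silent --show-error -F destfile=obs.db -F dbfile=@/path/to/obs.db http://server/path/obs-upload.py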

UPLOAD_DIR = config.datadir
AUTHORIZED_IPS = config.upload_authorized_ips
AUTHORIZED_HOSTS = config.upload_authorized_hosts

def log_error (s):
    print >>sys.stderr, 'obs-upload: %s' % s

def save_uploaded_file_internal (filename, fileitem, tmppath, destpath):
    fout = open(tmppath, 'wb')

    size = 0
    complete = False

    while True:
        chunk = fileitem.file.read(100000)
        if not chunk:
            complete = True
            break
        size = size + len(chunk)
        # If the upload grows past 15MB, give up: this keeps a client from
        # filling the disk.
        # FWIW: file size was 2683904 on 2009-03-28
        # file size was 12480512 on 2009-07-25
        # file size was 10097664 on 2009-08-28
        # file size was 9393152 on 2009-08-31
        if size > 1024*1024*15:
            break
        fout.write (chunk)
    fout.close()

    if not complete:
        print 'File upload cancelled: file is too big'
        log_error ('File upload cancelled: file is too big')
        return False

    if filename == 'obs.db':
        if size < 1024*1024*8:
            print 'File upload cancelled: file is not as expected'
            log_error ('File upload cancelled: obs.db too small (%d)' % size)
            return False

    try:
        os.rename(tmppath, destpath)
        return True
    except:
        print 'File upload cancelled: cannot rename file'
        log_error ('File upload cancelled: cannot rename file')
        return False

def save_uploaded_file (form, form_field, upload_dir, filename):
    if not form.has_key(form_field):
        return False

    fileitem = form[form_field]
    if not fileitem.file:
        return False

    # force the filename where we save, so we're not remote exploitable
    tmppath = os.path.join(upload_dir, filename + '.tmp')
    destpath = os.path.join(upload_dir, filename)

    try:
        if not os.path.exists(upload_dir):
            os.makedirs(upload_dir)

        ret = save_uploaded_file_internal (filename, fileitem, tmppath, destpath)
    except Exception, e:
        print 'Unknown error'
        log_error ('Unknown error: %s' % str(e))
        ret = False

    if os.path.exists(tmppath):
        os.unlink(tmppath)

    return ret

def create_cache(filename):
    if filename != 'obs.db':
        return

    try:
        info = libinfoxml.InfoXml()
    except Exception, e:
        print 'Unknown error when accessing the database'
        log_error ('Unknown error when accessing the database: %s' % str(e))
        return

    if os.path.exists(info.cache_dir) and not os.access(info.cache_dir, os.W_OK):
        print 'Cannot verify database: no access'
        log_error ('Cannot verify database: no access')
        return

    # We'll first write to a temporary directory since it's a long operation
    # and we don't want to make data unavailable
    cache_dir = info.cache_dir
    tmp_cache_dir = info.cache_dir + '.tmp'
    bak_cache_dir = info.cache_dir + '.bak'
    info.cache_dir = tmp_cache_dir

    # Remove this check: worst case, we'll have data about a project that
    # doesn't exist anymore or we'll overwrite a cache file that was just
    # created. In both cases, it's not a big deal -- especially since this
    # shouldn't stay long in time.
    ## This is racy (check for existence and then creation), but it should be
    ## okay in the end since there is only one client
    #if os.path.exists(info.cache_dir):
    #    print 'Cannot verify database: already verifying'
    #    return

    try:
        info.create_cache()

        # First move the old cache away before installing the new one (fast
        # operation), and then nuke the old cache
        if os.path.exists(bak_cache_dir):
            shutil.rmtree(bak_cache_dir)
        if os.path.exists(cache_dir):
            os.rename(cache_dir, bak_cache_dir)
        os.rename(tmp_cache_dir, cache_dir)
        if os.path.exists(bak_cache_dir):
            shutil.rmtree(bak_cache_dir)
    except Exception, e:
        print 'Cannot verify database'
        log_error ('Cannot verify database: %s' % str(e))
        try:
            if os.path.exists(tmp_cache_dir):
                shutil.rmtree(tmp_cache_dir)
            if os.path.exists(bak_cache_dir):
                if not os.path.exists(cache_dir):
                    os.rename(bak_cache_dir, cache_dir)
                else:
                    shutil.rmtree(bak_cache_dir)
        except:
            pass
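
# To keep the XML cache available while it is being rebuilt, create_cache()
# above performs a swap:
#   1. build the new cache in <cache>.tmp
#   2. move the live <cache> to <cache>.bak (a fast rename)
#   3. move <cache>.tmp into place as <cache>
#   4. delete <cache>.bak
# On failure, <cache>.bak is restored if the live cache went missing.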

print 'content-type: text/html\n'

form = cgi.FieldStorage()
if form.has_key('destfile'):
    dest = form.getfirst('destfile')
    if dest not in ['obs.db']:
        print 'Unknown file'
        sys.exit(0)
else:
    # Just assume it's the standard database
    dest = 'obs.db'

authorized_ips = AUTHORIZED_IPS[:]
for host in AUTHORIZED_HOSTS:
    try:
        ip = socket.gethostbyname(host)
        authorized_ips.append(ip)
    except:
        pass

if os.environ['REMOTE_ADDR'] in authorized_ips:
    ret = save_uploaded_file (form, 'dbfile', UPLOAD_DIR, dest)
    if ret and dest in ['obs.db']:
        create_cache(dest)
else:
    print 'Unauthorized access'
07070100000043000081ED0000000000000000000000016548EB8C0000274E000000000000000000000000000000000000002D00000000osc-plugin-collab-0.104+30/server/web/obs.py#!/usr/bin/env python
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#


import os
import sys

import cgi
from cgi import escape

from libdissector import config
from libdissector import libdbhtml
from libdissector import libhttp
from libdissector import libinfoxml

if config.cgitb:
    import cgitb; cgitb.enable()


#######################################################################


def compare_versions_a_gt_b (a, b):
    split_a = a.split('.')
    split_b = b.split('.')
    if len(split_a) != len(split_b):
        return a > b
    for i in range(len(split_a)):
        try:
            int_a = int(split_a[i])
            int_b = int(split_b[i])
            if int_a > int_b:
                return True
            if int_b > int_a:
                return False
        except ValueError:
            if split_a[i] > split_b[i]:
                return True
            if split_b[i] > split_a[i]:
                return False

    return False
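
# A few illustrative results for the comparison above:
#   compare_versions_a_gt_b('2.10', '2.9')  -> True (components compare as ints)
#   compare_versions_a_gt_b('2.9', '2.10')  -> False
#   compare_versions_a_gt_b('1.2.1', '1.2') -> True, but only via the plain
#     string fallback used when the number of components differs
#   compare_versions_a_gt_b('1.0a', '1.0b') -> False (string compare of '0a'/'0b')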


#######################################################################


def colortype_to_style(colortype):
    if colortype is None:
        return ''

    colors = {
        'not-in-parent': ('#75507b', 'white'),
        'delta':         ('#3465a4', 'white'),
        'no-upstream':   ('#fce94f', None),
        'new-upstream':  ('#a40000', 'white')
    }

    (bg, text) = colors[colortype]

    if bg or text:
        style = ' style="'
        if bg:
            style += 'background: %s;' % bg
        if text:
            style += 'color: %s;' % text
        style += '"'
    else:
        style = ''

    return style
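
# For instance:
#   colortype_to_style('delta')       -> ' style="background: #3465a4;color: white;"'
#   colortype_to_style('no-upstream') -> ' style="background: #fce94f;"'
#   colortype_to_style(None)          -> ''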


#######################################################################


def get_legend_box():
    s = ''
    s += '<div id="some_other_content" class="box box-shadow alpha clear-both">\n'
    s += '<h2 class="box-header">Legend</h2>\n'
    s += '<table>\n'
    s += '<tr><td>Package is perfect!</td></tr>\n'
    s += '<tr><td%s>Does not exist in parent</td></tr>\n' % colortype_to_style('not-in-parent')
    s += '<tr><td%s>Has delta with parent</td></tr>\n' % colortype_to_style('delta')
    s += '<tr><td%s>No upstream data</td></tr>\n' % colortype_to_style('no-upstream')
    s += '<tr><td%s>Upstream has a new version</td></tr>\n' % colortype_to_style('new-upstream')
    s += '</table>\n'
    s += '</div>\n'

    return s


#######################################################################


class Package:

    def __init__(self, node):
        self.name = None
        self.parent_project = None
        self.version = None
        self.upstream_version = None
        self.parent_version = None
        self.upstream_url = None
        self.has_delta = False
        self.error = None
        self.error_details = None

        if node is not None:
            self.name = node.get('name')

            parent = node.find('parent')
            if parent is not None:
                self.parent_project = parent.get('project')

            version = node.find('version')
            if version is not None:
                self.version = version.get('current')
                self.upstream_version = version.get('upstream')
                self.parent_version = version.get('parent')

            upstream = node.find('upstream')
            if upstream is not None:
                url = upstream.find('url')
                if url is not None:
                    self.upstream_url = url.text

            link = node.find('link')
            if link is not None:
                if link.get('delta') == 'true':
                    self.has_delta = True

            delta = node.find('delta')
            if delta is not None:
                self.has_delta = True

            error = node.find('error')
            if error is not None:
                self.error = error.get('type')
                self.error_details = error.text or ''
            else:
                self.error = ''
                self.error_details = ''

            if not self.version:
                self.version = ''
            if not self.upstream_version:
                self.upstream_version = ''
            if not self.parent_version:
                self.parent_version = '--'
            if not self.upstream_url:
                self.upstream_url = ''
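
        # Defaults when data is missing: version, upstream_version and
        # upstream_url end up as '', while parent_version ends up as '--',
        # the "not in parent" marker used throughout this script.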

#######################################################################


def get_colortype(package, parent, use_upstream):
    color = None

    if parent and package.has_delta:
        color = 'delta'
    elif parent and package.parent_version == '--':
        color = 'not-in-parent'

    if use_upstream:
        if package.upstream_version not in [ '', '--' ]:
            newer_than_parent = package.parent_version == '--' or compare_versions_a_gt_b(package.upstream_version, package.parent_version)
            newer = compare_versions_a_gt_b(package.upstream_version, package.version)
            if newer and newer_than_parent:
                color = 'new-upstream'

        elif color is None and package.upstream_version != '--':
            color = 'no-upstream'

    return color
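
# Note the precedence: 'new-upstream' overrides 'delta' and 'not-in-parent',
# and 'no-upstream' is only used when no other color applies and the upstream
# version is not the explicit '--' marker.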


#######################################################################


def get_table_for_project(project, only_missing_upstream, only_missing_parent):
    info = libinfoxml.InfoXml()
    try:
        node = info.get_project_node(project)
    except libinfoxml.InfoXmlException, e:
        return 'Error: %s' % e.msg

    parent = node.get('parent')
    use_upstream = node.get('ignore_upstream') != 'true'

    packages = []
    for subnode in node.findall('package'):
        package = Package(subnode)

        if only_missing_upstream and use_upstream:
            if package.upstream_url:
                continue
            if package.upstream_version == '--':
                continue
        elif only_missing_parent and parent:
            if package.parent_version != '--':
                continue

        packages.append(package)

    if len(packages) == 0:
        return 'No packages in %s.' % escape(project)

    if parent:
        same_parent = True
        for package in packages:
            if package.parent_project and package.parent_project != project and package.parent_project != parent:
                same_parent = False
                break

    s = ''
    s += '<h2>%s source packages in %s</h2>\n' % (len(packages), escape(project))
    s += '<table>\n'

    s += '<tr>\n'
    s += '<th>Package</th>\n'
    if parent:
        if same_parent:
            s += '<th>%s</th>\n' % escape(parent)
        else:
            s += '<th>%s</th>\n' % 'Parent project'

    s += '<th>%s</th>\n' % escape(project)
    if use_upstream:
        s += '<th>Upstream</th>\n'
    s += '</tr>\n'

    for package in packages:
        colortype = get_colortype(package, parent, use_upstream)
        style = colortype_to_style(colortype)

        row = '<tr><td%s>%s</td>' % (style, escape(package.name))
        if parent:
            row += '<td>%s</td>' % (escape(package.parent_version),)

        if package.error in ['not-in-parent', 'need-merge-with-parent']:
            if package.error_details:
                row += '<td title="%s">%s</td>' % (escape(package.error_details), '(broken)')
            else:
                row += '<td title="%s">%s</td>' % (escape(package.error), '(broken)')
        else:
            row += '<td>%s</td>' % escape(package.version)

        if use_upstream:
            if package.upstream_url:
                version_cell = '<a href="' + escape(package.upstream_url) + '">' + escape(package.upstream_version) + '</a>'
            else:
                version_cell = escape(package.upstream_version)
            row += '<td>%s</td>' % version_cell
        row += '</tr>'

        s += row
        s += '\n'

    s += '</table>\n'

    return s


#######################################################################


form = cgi.FieldStorage()

only_missing_upstream = libhttp.get_arg_bool(form, 'missing-upstream', False)
only_missing_parent = libhttp.get_arg_bool(form, 'missing-parent', False)

libhttp.print_html_header()

project = libhttp.get_project(form)
table = get_table_for_project(project, only_missing_upstream, only_missing_parent)

libhttp.print_header('Versions of packages in the Build Service for project %s' % escape(project))

print libdbhtml.get_project_selector(current_project = project)
print table

libhttp.print_foot(additional_box = get_legend_box())
07070100000044000081ED0000000000000000000000016548EB8C00001FD5000000000000000000000000000000000000002F00000000osc-plugin-collab-0.104+30/server/web/patch.py#!/usr/bin/env python
# vim: set ts=4 sw=4 et: coding=UTF-8

#
# Copyright (c) 2008-2010, Novell, Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
#  * Redistributions of source code must retain the above copyright notice,
#    this list of conditions and the following disclaimer.
#  * Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#  * Neither the name of the <ORGANIZATION> nor the names of its contributors
#    may be used to endorse or promote products derived from this software
#    without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
#
# (Licensed under the simplified BSD license)
#
# Authors: Vincent Untz <[email protected]>
#


import os
import sys

import cgi
from cgi import escape

from libdissector import buildservice
from libdissector import config
from libdissector import libdbcore
from libdissector import libdbhtml
from libdissector import libhttp

if config.cgitb:
    import cgitb; cgitb.enable()


#######################################################################


def get_page_title(project, srcpackage, tag):
    if project and srcpackage and tag:
        return 'Patches tagged %s for package %s in project %s' % (escape(tag), escape(srcpackage), escape(project))
    elif project and srcpackage:
        return 'Patches for package %s in project %s' % (escape(srcpackage), escape(project))
    elif project and tag:
        return 'Patches tagged %s in project %s' % (escape(tag), escape(project))
    elif project:
        return 'Patches in project %s' % (escape(project))
    else:
        return 'Patches'


#######################################################################


def get_package(db, project, srcpackage, tag):
    db.cursor.execute(libdbcore.pkg_query, (project, srcpackage))
    row = db.cursor.fetchone()
    
    if not row:
        return 'Error: package %s does not exist in project %s' % (escape(srcpackage), escape(project))

    if row['is_obs_link'] and row['srcmd5']:
        rev = row['srcmd5']
    else:
        rev = None

    if tag:
        db.cursor.execute('''SELECT * FROM %s WHERE srcpackage = ? AND tag = ? ORDER BY nb_in_pack;''' % libdbcore.table_patch, (row['id'], tag))
    else:
        db.cursor.execute('''SELECT * FROM %s WHERE srcpackage = ? ORDER BY nb_in_pack;''' % libdbcore.table_patch, (row['id'],))

    s = ''
    s += '<pre>\n'

    count = 0
    for row in db.cursor:
        count += 1
        url = buildservice.get_source_url(project, srcpackage, row['filename'], rev, True)
        s += '%s: <a href="%s">%s</a>' % (row['nb_in_pack'], url, row['filename'])
        if row['disabled'] != 0:
            s += ' (not applied)'
        s += '\n'

    s += '</pre>\n'

    if tag:
        s = '<h2>%d patches tagged %s for package %s in project %s</h2>\n' % (count, escape(tag), escape(srcpackage), escape(project)) + s
    else:
        s = '<h2>%d patches for package %s in project %s</h2>\n' % (count, escape(srcpackage), escape(project)) + s

    return s


#######################################################################


def get_project(db, project, tag):
    db.cursor.execute('''SELECT id FROM %s WHERE name = ?;''' % libdbcore.table_project, (project,))
    row = db.cursor.fetchone()
    
    if not row:
        return 'Error: project %s does not exist' % escape(project)

    project_id = row['id']
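
    # Links generated below use tag=None for patches without a tag; map that
    # marker back to the empty tag stored in the database.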

    if tag == 'None':
        tag_sql = ''
    else:
        tag_sql = tag

    if tag:
        db.cursor.execute('''SELECT COUNT(*) FROM %s, %s WHERE %s.srcpackage = %s.id AND %s.project = ? AND tag = ?;''' % (libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_srcpackage) , (project_id, tag_sql))
    else:
        db.cursor.execute('''SELECT COUNT(*) FROM %s, %s WHERE %s.srcpackage = %s.id AND %s.project = ?;''' % (libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_srcpackage) , (project_id,))
    
    row = db.cursor.fetchone()
    count = escape(str(row[0]))

    s = ''

    if tag:
        s += '<h2>%s patches tagged %s in project %s</h2>\n<p>\n' % (count, escape(tag), escape(project))

        db.cursor.execute('''SELECT COUNT(*) AS c, %s.name AS n FROM %s, %s WHERE %s.srcpackage = %s.id AND %s.project = ? AND tag = ? GROUP BY srcpackage ORDER BY c DESC;''' % (libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_srcpackage), (project_id, tag_sql))
        for row in db.cursor:
            s += '<a href="%s?project=%s&amp;srcpackage=%s&amp;tag=%s">%s</a>: %s<br />\n' % (escape(os.environ['SCRIPT_NAME']), escape(project), escape(row['n']), escape(tag or ''), escape(row['n']), escape(str(row['c'])))

        s += '</p>\n'

    else:
        s += '<h2>%s patches in project %s</h2>\n' % (count, escape(project))

        s += '<h3>Order by tag</h3>\n<p>\n'

        db.cursor.execute('''SELECT COUNT(*) AS c, tag FROM %s, %s WHERE %s.srcpackage = %s.id AND %s.project = ? GROUP BY tag ORDER BY c DESC;''' % (libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_srcpackage), (project_id,))

        for row in db.cursor:
            if row['tag'] == '':
                row_tag = 'None'
            else:
                row_tag = escape(row['tag'])

            s += '<a href="%s?project=%s&amp;tag=%s">%s</a>: %s<br />\n' % (escape(os.environ['SCRIPT_NAME']), escape(project), row_tag, row_tag, escape(str(row['c'])))

        s += '</p>\n<h3>Order by source package</h3>\n<p>\n'

        db.cursor.execute('''SELECT COUNT(*) AS c, %s.name AS n FROM %s, %s WHERE %s.srcpackage = %s.id AND %s.project = ? GROUP BY srcpackage ORDER BY c DESC;''' % (libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_patch, libdbcore.table_srcpackage, libdbcore.table_srcpackage), (project_id,))
        for row in db.cursor:
            s += '<a href="%s?project=%s&amp;srcpackage=%s">%s</a>: %s<br />\n' % (escape(os.environ['SCRIPT_NAME']), escape(project), escape(row['n']), escape(row['n']), escape(str(row['c'])))

        s += '</p>\n'

    return s


#######################################################################


def get_page_content(db, project, srcpackage, tag):
    if not project:
        return 'Error: no project specified'

    if srcpackage:
        return get_package(db, project, srcpackage, tag)
    else:
        return get_project(db, project, tag)


#######################################################################


form = cgi.FieldStorage()

libhttp.print_html_header()

project = libhttp.get_project(form)
srcpackage = libhttp.get_srcpackage(form)
tag = libhttp.get_arg(form, 'tag')

db = libdbcore.ObsDb()

title = get_page_title(project, srcpackage, tag)
content = get_page_content(db, project, srcpackage, tag)

libhttp.print_header(title)

if not srcpackage:
    print libdbhtml.get_project_selector(current_project = project, db = db)
print content

libhttp.print_foot()
07070100000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000B00000000TRAILER!!!1369 blocks